This is an introduction to Detection Theory. This module gives a brief overview of the problems associated with signal transfer--specifically, the effects that noise produces in a signal during transmission.

Introduction

The intent of detection theory is to provide rational (instead of arbitrary) techniques for determining which of several conceptions--models--of data generation and measurement is most "consistent" with a given set of data. In digital communication, the received signal must be processed to determine whether it represents a binary "0" or "1"; in radar or sonar, the presence or absence of a target must be determined from measurements of propagating fields; in seismic problems, the presence of oil deposits must be inferred from measurements of sound propagation in the earth. Using detection theory, we will derive signal processing algorithms which will give good answers to questions such as these when the information-bearing signals are corrupted by superfluous signals (noise).

Detection theory's foundation rests on statistical hypothesis testing (Cramér, 1946, Chapter 35; Lehmann, 1986; Poor, 1988, Chapter 2; van Trees, 1968, pp. 19-52). Given a probabilistic model (an event space and the associated probabilistic structures), a random vector r expressing the observed data, and a listing of the probabilistic models--the hypotheses--which may have generated r, we want a systematic, optimal method of determining which model corresponds to the data. In the simple case where only two models--ℳ₀ and ℳ₁--are possible, we ask, for each set of observations, what is the "best" method of deciding whether ℳ₀ or ℳ₁ was true? We have many ways of mathematically stating what "best" means: we shall initially choose the average cost of each decision as our criterion for correctness. This seemingly arbitrary choice of criterion will be shown later not to impose rigid constraints on the algorithms that solve the hypothesis testing problem. Over a variety of reasonable criteria, one central solution to evaluating which model describes the observations--the likelihood ratio test--will persistently emerge; this result will form the basis of all detection algorithms.
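As a concrete illustration, here is a minimal Python sketch of the likelihood ratio test for deciding between two Gaussian models; the signal, noise level, and zero threshold are placeholder assumptions, not values from this module.

import numpy as np
from scipy.stats import norm

# Minimal sketch of the likelihood ratio test for two Gaussian models:
#   M0: observation = noise only
#   M1: observation = known signal + noise
# The signal, noise level, and zero threshold are illustrative assumptions.

rng = np.random.default_rng(0)
sigma = 1.0                                  # assumed noise standard deviation
signal = np.array([1.0, -1.0, 1.0, 1.0])     # assumed known signal under M1

def log_likelihood_ratio(r):
    """log[ p(r | M1) / p(r | M0) ] for independent Gaussian noise samples."""
    ll1 = norm.logpdf(r, loc=signal, scale=sigma).sum()
    ll0 = norm.logpdf(r, loc=0.0, scale=sigma).sum()
    return ll1 - ll0

# Decide M1 when the log-likelihood ratio exceeds a threshold set by the
# decision costs and prior probabilities (0 here, i.e. equal costs and priors).
r = signal + sigma * rng.standard_normal(signal.shape)  # simulated observation
print("decide model", 1 if log_likelihood_ratio(r) > 0.0 else 0)

The threshold comparison is the entire decision rule; changing the costs or priors only moves the threshold, not the structure of the test.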

Detection problems become more elaborate and complicated when models become vague. Models are characterized by probability distributions, and these distributions suffice in the likelihood ratio test. Vagueness does not refer to this stochastic framework; rather, it refers to uncertainties in the probability distribution itself. The distribution may depend on unknown parameters, such as the noise power level. The distribution most certainly depends on the signal structure; what if that structure is partially or completely unknown? The most difficult (and interesting) problems emerge when uncertainties arise in the probability distributions themselves. For example, suppose the only model information we have comes through data; how would an optimal detector be derived then?
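When a parameter such as the signal amplitude is unknown, one standard approach (not developed in this introduction) is the generalized likelihood ratio test: the unknown parameter is replaced by its maximum-likelihood estimate before the ratio is formed. A small Python sketch under assumed Gaussian noise, with an illustrative template signal of unknown amplitude:

import numpy as np
from scipy.stats import norm

# Sketch of a generalized likelihood ratio test (GLRT) for an unknown amplitude.
# "template" is an assumed, illustrative signal shape; sigma is an assumed noise level.

sigma = 1.0
template = np.array([1.0, -1.0, 1.0, 1.0])

def glrt_statistic(r):
    # Maximum-likelihood amplitude estimate under M1: projection of r onto the template.
    a_hat = np.dot(r, template) / np.dot(template, template)
    ll1 = norm.logpdf(r, loc=a_hat * template, scale=sigma).sum()
    ll0 = norm.logpdf(r, loc=0.0, scale=sigma).sum()
    return ll1 - ll0  # compared with a threshold, just as in the simple test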

Along the way we will discover that a general geometric picture of detection emerges: the ease of a detection problem depends on how "far apart" the models are from each other. This geometric framework turns out to be elaborate, but it underlies modern detection theory and forms links to information theory.
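One common way to make "far apart" precise is the Kullback-Leibler divergence between the two models' distributions; for equal-variance Gaussian models it reduces to the squared separation of the means scaled by the noise power. The following is a standard formula, not a result taken from this module:

import numpy as np

# Kullback-Leibler divergence between two Gaussian models with the same
# (assumed) variance but different mean vectors: D = ||m1 - m0||^2 / (2 sigma^2).
def kl_gaussian_same_variance(m0, m1, sigma):
    m0, m1 = np.asarray(m0, float), np.asarray(m1, float)
    return np.sum((m1 - m0) ** 2) / (2.0 * sigma ** 2)

print(kl_gaussian_same_variance([0.0, 0.0], [1.0, -1.0], sigma=1.0))  # prints 1.0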

Source: OpenStax, Signal and information processing for sonar. OpenStax CNX. Dec 04, 2007. Download for free at http://cnx.org/content/col10422/1.5