Random parameters
When we know the density of the unknown parameter $\theta$, the likelihood function can be expressed as

$$p_{\mathbf{r}\mid\mathcal{M}_i}(\mathbf{r}\mid\mathcal{M}_i)=\int p_{\mathbf{r}\mid\mathcal{M}_i,\theta}(\mathbf{r}\mid\mathcal{M}_i,\theta)\,p_{\theta}(\theta)\,d\theta$$

and the likelihood ratio in the random parameter case becomes

$$\Lambda(\mathbf{r})=\frac{\int p_{\mathbf{r}\mid\mathcal{M}_1,\theta}(\mathbf{r}\mid\mathcal{M}_1,\theta)\,p_{\theta}(\theta)\,d\theta}{\int p_{\mathbf{r}\mid\mathcal{M}_0,\theta}(\mathbf{r}\mid\mathcal{M}_0,\theta)\,p_{\theta}(\theta)\,d\theta}$$
Unfortunately, there are many examples where either the integrals involved are intractable or the sufficient statistic is virtually the same as the likelihood ratio, which can be difficult to compute.
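To make the marginalization concrete, here is a minimal numerical sketch of the integral above, assuming a scalar Gaussian observation model with an unknown mean and a standard Gaussian prior on that mean under model 1; the function names and parameter values are illustrative choices, not part of the original module.

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

def marginal_likelihood(r, prior_pdf, sigma, theta_grid):
    """Approximate p(r | M_i) = integral of p(r | M_i, theta) p(theta) dtheta
    by evaluating the likelihood on a grid of theta values (trapezoidal rule)."""
    # Likelihood of the whole record for each candidate mean value theta
    like = np.array([np.prod(norm.pdf(r, loc=t, scale=sigma)) for t in theta_grid])
    return integrate.trapezoid(like * prior_pdf(theta_grid), theta_grid)

# Illustrative example: ten observations, noise standard deviation 1,
# prior N(0, 1) on the mean under model 1, mean known to be zero under model 0
rng = np.random.default_rng(0)
sigma = 1.0
r = rng.normal(loc=0.8, scale=sigma, size=10)
theta_grid = np.linspace(-5.0, 5.0, 2001)

num = marginal_likelihood(r, lambda t: norm.pdf(t, 0.0, 1.0), sigma, theta_grid)  # model 1
den = np.prod(norm.pdf(r, loc=0.0, scale=sigma))                                  # model 0
print("likelihood ratio:", num / den)
```

Even in this one-dimensional case the integral must be approximated numerically; with several unknown parameters the computation grows quickly, which is the intractability the text refers to.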
A simple, but interesting, example that results in a computable answer occurs when the mean of Gaussian random variables is either zero (model 0) or is $\pm m$ with equal probability (hypothesis 1). The second hypothesis means that a non-zero mean is present, but its sign is not known. We are therefore stating that if hypothesis 1 is in fact valid, the mean has a fixed sign for each observation; what is random is its a priori value. As before, $L$ statistically independent observations are made.
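To illustrate the distinction between the two models, the following sketch draws a record under each one; under model 1 the sign of the mean is drawn once and then held fixed for all $L$ observations. The parameter values and the function name are illustrative, not taken from the text.

```python
import numpy as np

def simulate_record(model, L, m, sigma, rng):
    """Draw L i.i.d. Gaussian observations.
    model 0: zero mean.  model 1: mean is +m or -m with equal probability,
    with the sign chosen once and held fixed for the whole record."""
    if model == 0:
        mean = 0.0
    else:
        mean = m if rng.random() < 0.5 else -m  # random a priori sign, fixed across the record
    return rng.normal(loc=mean, scale=sigma, size=L)

rng = np.random.default_rng(1)
r0 = simulate_record(0, L=100, m=1.0, sigma=1.0, rng=rng)
r1 = simulate_record(1, L=100, m=1.0, sigma=1.0, rng=rng)
print(r0.mean(), r1.mean())  # the model-1 sample mean clusters near +m or -m
```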
The numerator of the likelihood ratio is the sum of two Gaussian densities weighted by $\frac{1}{2}$ (the a priori probability values), one having a positive mean, the other negative. The likelihood ratio, after simple cancellation of common terms, becomes

$$\Lambda(\mathbf{r})=\exp\!\left\{-\frac{Lm^2}{2\sigma^2}\right\}\cosh\!\left(\frac{m}{\sigma^2}\sum_{l=0}^{L-1}r_l\right)$$
and the decision rule takes the form

$$\cosh\!\left(\frac{m}{\sigma^2}\sum_{l=0}^{L-1}r_l\right)\ \underset{\mathcal{M}_0}{\overset{\mathcal{M}_1}{\gtrless}}\ \eta\,\exp\!\left\{\frac{Lm^2}{2\sigma^2}\right\}$$

where $\cosh(\cdot)$ is the hyperbolic cosine, given simply as $\cosh(x)=\frac{e^x+e^{-x}}{2}$. As this quantity is an even function, the sign of its argument has no effect on the result. The decision rule can be written more simply as

$$\left|\sum_{l=0}^{L-1}r_l\right|\ \underset{\mathcal{M}_0}{\overset{\mathcal{M}_1}{\gtrless}}\ \frac{\sigma^2}{m}\cosh^{-1}\!\left(\eta\,\exp\!\left\{\frac{Lm^2}{2\sigma^2}\right\}\right)$$
The sufficient statistic equals the magnitude of the sum of the observations in this case. While the right side of this expression, which equals $\frac{\sigma^2}{m}\cosh^{-1}\!\left(\eta\,\exp\!\left\{\frac{Lm^2}{2\sigma^2}\right\}\right)$, is complicated, it need only be computed once. Calculation of the performance probabilities can be complicated; in this case, the false-alarm probability is easy to find and the others more difficult.
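As a check on this discussion, the sketch below computes the right-side threshold, applies the decision rule, and verifies the false-alarm probability. Under model 0 the sum of the observations is Gaussian with zero mean and variance $L\sigma^2$, so $P_F = 2Q\!\left(\gamma/(\sqrt{L}\,\sigma)\right)$, where $\gamma$ denotes the right-side threshold and $Q$ is the Gaussian tail probability; the parameter values used here are illustrative choices, not values from the text.

```python
import numpy as np
from scipy.stats import norm

def threshold(eta, L, m, sigma):
    """Right side of the simplified rule: (sigma^2/m) * arccosh(eta * exp(L m^2 / (2 sigma^2))).
    The arccosh argument must be at least 1 for the threshold to be real."""
    return (sigma**2 / m) * np.arccosh(eta * np.exp(L * m**2 / (2 * sigma**2)))

def decide(r, gamma):
    """Return 1 (choose model 1) if |sum of observations| exceeds the threshold, else 0."""
    return int(abs(np.sum(r)) > gamma)

# Illustrative parameter values (not from the text)
L, m, sigma, eta = 100, 0.25, 1.0, 1.0
gamma = threshold(eta, L, m, sigma)

# Analytic false-alarm probability: under model 0 the sum is N(0, L sigma^2)
P_F = 2 * norm.sf(gamma / (np.sqrt(L) * sigma))

# Monte Carlo check under model 0
rng = np.random.default_rng(2)
trials = 100_000
false_alarms = sum(decide(rng.normal(0.0, sigma, L), gamma) for _ in range(trials))
print(f"analytic P_F = {P_F:.4f}, simulated P_F = {false_alarms / trials:.4f}")
```

Because the threshold depends only on $\eta$, $L$, $m$, and $\sigma$, it is indeed computed once and reused for every record, as the text notes.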
Source:
OpenStax, Signal and information processing for sonar. OpenStax CNX. Dec 04, 2007 Download for free at http://cnx.org/content/col10422/1.5