
As important as understanding the false-alarm, miss, and error probabilities of the likelihood ratio test might be, no general expression exists for them. In analyzing the Gaussian problem, we find these in terms of Q(·), which has no closed-form expression. In the general problem, the situation is much worse: no expression of any kind can be found! The reason is that we don't have the probability distribution of the sufficient statistic in the likelihood ratio test, which is needed in these expressions. We are faced with the curious situation that while knowing the decision rule that optimizes the performance probabilities, we usually don't know what the resulting performance probabilities will be.
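For a Gaussian problem of the kind referred to above (taken here, as an assumed example, to be detecting a shift in mean in white Gaussian noise with the sample mean as sufficient statistic), the performance probabilities reduce to evaluations of Q(·), which must be computed numerically. The following minimal sketch does so; the signal mean m, noise standard deviation sigma, number of observations L, and threshold gamma are hypothetical values chosen only for illustration.

```python
# Minimal sketch: under H0 the L observations are N(0, sigma^2), under H1 they are
# N(m, sigma^2); the sufficient statistic is the sample mean compared to a threshold gamma.
from scipy.stats import norm

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x); it has no closed-form expression."""
    return norm.sf(x)

m, sigma, L, gamma = 1.0, 2.0, 25, 0.5      # illustrative values
scale = sigma / L**0.5                       # standard deviation of the sample mean

P_F = Q(gamma / scale)                       # false-alarm probability (under H0)
P_D = Q((gamma - m) / scale)                 # detection probability (under H1)
P_M = 1.0 - P_D                              # miss probability

print(f"P_F = {P_F:.4f}, P_D = {P_D:.4f}, P_M = {P_M:.4g}")
```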

Some general expressions are known for the asymptotic form of these error probabilities: limiting expressions as the number of observations L becomes large. These results go by the generic name of Stein's Lemma [Cover and Thomas, §12.8].

The attribution to statistician Charles Stein is probably incorrect. Herman Chernoff wrote a paper [Chernoff] providing a derivation of this result. A reviewer stated that he thought Stein had derived the result in a technical report, which Chernoff had not seen. Chernoff modified his paper to include the reference without checking it. Chernoff's paper provided the link to Stein. However, Stein later denied he had proved the result; so much for not questioning reviewers! Stein's Lemma should be known as Chernoff's Lemma.
To develop these, we need to define the Kullback-Leibler and Chernoff distances. Respectively,
\[
\mathcal{D}(p_1 \| p_0) = \int p_1(x) \log \frac{p_1(x)}{p_0(x)}\, dx
\]
\[
\mathcal{C}(p_0, p_1) = \max_{0 \le s \le 1} \left\{ -\log \int p_0^{1-s}(x)\, p_1^{s}(x)\, dx \right\}
\]
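These definitions translate directly into a numerical computation. The sketch below evaluates both distances by numerical integration for an example pair of Gaussian densities; the chosen densities, the finite integration limits, and the use of SciPy routines are illustrative assumptions rather than anything specified in the text.

```python
# Evaluate the Kullback-Leibler and Chernoff distances numerically for
# two example Gaussian densities p0 = N(0, 1) and p1 = N(1, 1).
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

p0 = lambda x: norm.pdf(x, loc=0.0, scale=1.0)
p1 = lambda x: norm.pdf(x, loc=1.0, scale=1.0)
lo, hi = -10.0, 12.0   # finite limits covering essentially all of both densities' mass

def kl_distance(p, q):
    """D(p || q) = integral of p(x) log[p(x)/q(x)] dx."""
    return quad(lambda x: p(x) * np.log(p(x) / q(x)), lo, hi)[0]

def chernoff_distance(p, q):
    """C(p, q) = max over 0 <= s <= 1 of -log integral p(x)^(1-s) q(x)^s dx."""
    log_integral = lambda s: np.log(quad(lambda x: p(x)**(1 - s) * q(x)**s, lo, hi)[0])
    return -minimize_scalar(log_integral, bounds=(0.0, 1.0), method="bounded").fun

print(kl_distance(p1, p0))        # analytic value for this Gaussian pair: (1-0)^2 / 2 = 0.5
print(chernoff_distance(p0, p1))  # analytic value for this Gaussian pair: (1-0)^2 / 8 = 0.125
```

For equal-variance Gaussians the Chernoff maximization is achieved at s = 1/2, which the bounded scalar search above recovers numerically.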
These distances are important special cases of the Ali-Silvey distances and have the following properties.
  • $\mathcal{D}(p_1 \| p_0) \ge 0$ and $\mathcal{C}(p_0, p_1) \ge 0$. Furthermore, these distances equal zero only when $p_0(r) = p_1(r)$. The Kullback-Leibler and Chernoff distances are always non-negative, with zero distance occurring only when the probability distributions are the same.
  • $\mathcal{D}(p_1 \| p_0) = \infty$ whenever, for some $r$, $p_0(r) = 0$ and $p_1(r) \ne 0$. If $p_1(r) = 0$, the value of $p_1(r) \log \frac{p_1(r)}{p_0(r)}$ is defined to be zero.
  • When the underlying stochastic quantities are random vectors having statistically independent components with respect to both $p_0$ and $p_1$, the Kullback-Leibler distance equals the sum of the component distances. Stated mathematically, if $p_0(\mathbf{r}) = \prod_l p_0(r_l)$ and $p_1(\mathbf{r}) = \prod_l p_1(r_l)$,
    \[
    \mathcal{D}\big(p_1(\mathbf{r}) \,\|\, p_0(\mathbf{r})\big) = \sum_l \mathcal{D}\big(p_1(r_l) \,\|\, p_0(r_l)\big)
    \]
    The Chernoff distance does not have this property unless the components are identically distributed. (This additivity is illustrated in the numerical check following this list.)
  • $\mathcal{D}(p_1 \| p_0) \ne \mathcal{D}(p_0 \| p_1)$; $\mathcal{C}(p_0, p_1) = \mathcal{C}(p_1, p_0)$. The Kullback-Leibler distance is usually not a symmetric quantity. In some special cases, it can be symmetric (like the just-described Gaussian example), but symmetry cannot, and should not, be expected. The Chernoff distance is always symmetric.
  • $\mathcal{D}\big(p_{r_1, r_2} \,\|\, p_{r_1} p_{r_2}\big) = I(r_1; r_2)$. The Kullback-Leibler distance between a joint probability density and the product of the marginal distributions equals what is known in information theory as the mutual information between the random variables $r_1$, $r_2$. From the properties of the Kullback-Leibler distance, we see that the mutual information equals zero only when the random variables are statistically independent.
These quantities are not actually distances. The Kullback-Leibler distance $\mathcal{D}(\cdot \| \cdot)$ is not symmetric in its arguments and the Chernoff distance $\mathcal{C}(\cdot, \cdot)$ does not obey the triangle inequality. Nonetheless, the name "distance" is frequently attached to them for reasons that will become clear later.
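The non-negativity, (a)symmetry, and additivity properties listed above can be checked numerically. The sketch below does so for a pair of hypothetical discrete distributions on a three-letter alphabet; the distributions and the SciPy-based optimization are illustrative choices, not part of the text (for discrete distributions the definitions apply with sums replacing integrals).

```python
# Numerical check of the Kullback-Leibler and Chernoff distance properties
# for two example discrete distributions.
import numpy as np
from scipy.optimize import minimize_scalar

def kl(p, q):
    """Discrete Kullback-Leibler distance D(p || q) = sum_k p[k] log(p[k]/q[k])."""
    return float(np.sum(p * np.log(p / q)))

def chernoff(p, q):
    """Discrete Chernoff distance C(p, q) = max over 0 <= s <= 1 of -log sum_k p[k]^(1-s) q[k]^s."""
    f = lambda s: np.log(np.sum(p**(1 - s) * q**s))
    return -minimize_scalar(f, bounds=(0.0, 1.0), method="bounded").fun

p0 = np.array([0.5, 0.3, 0.2])
p1 = np.array([0.1, 0.4, 0.5])

# Non-negativity, asymmetry of the Kullback-Leibler distance, symmetry of the Chernoff distance.
print(kl(p1, p0), kl(p0, p1))              # both positive, generally unequal
print(chernoff(p0, p1), chernoff(p1, p0))  # equal

# Additivity over statistically independent components: the joint pmf of two independent
# (here identically distributed) components is the outer product of the marginals.
joint0 = np.outer(p0, p0).ravel()
joint1 = np.outer(p1, p1).ravel()
print(kl(joint1, joint0), 2 * kl(p1, p0))  # Kullback-Leibler distance adds across components
```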
