As important as understanding the false-alarm, miss, and error probabilities of the likelihood ratio test might be, no general expression exists for them. In analyzing the Gaussian problem, we find these in terms of Q(·), which has no closed-form expression. In the general problem, the situation is much worse: no expression of any kind can be found! The reason is that we don't have the probability distribution of the sufficient statistic in the likelihood ratio test, which is needed in these expressions. We are faced with the curious situation that while we know the decision rule that optimizes the performance probabilities, we usually don't know what the resulting performance probabilities will be.
Some general expressions are known for the asymptotic form of these error probabilities: limiting expressions as the number of observations becomes large. These results go by the generic name of Stein's Lemma (Cover and Thomas, §12.8).
The attribution to statistician Charles Stein is probably incorrect. Herman Chernoff wrote a paper (Chernoff) providing a derivation of this result. A reviewer stated that he thought Stein had derived the result in a technical report, which Chernoff had not seen. Chernoff modified his paper to include the reference without checking it. Chernoff's paper provided the link to Stein. However, Stein later denied he had proved the result; so much for not questioning reviewers! Stein's Lemma should be known as Chernoff's Lemma.
To develop these, we need to define the Kullback-Leibler and Chernoff distances between two probability densities p_1 and p_0. Respectively,

    D(p_1‖p_0) = ∫ p_1(x) log[ p_1(x) / p_0(x) ] dx

    C(p_1,p_0) = max_{0≤s≤1} ( −log ∫ p_0^{1−s}(x) p_1^{s}(x) dx )
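As a concrete, purely illustrative sketch of these two definitions (not part of the original development), the following Python snippet evaluates both distances for discrete distributions, where the integrals become sums. The example pmfs p0 and p1 are hypothetical, the function names kl_distance and chernoff_distance are mine, and the maximization over s is done by a simple grid search.

    import numpy as np

    def kl_distance(p1, p0):
        """Kullback-Leibler distance D(p1 || p0) between discrete distributions.

        Terms with p1[k] == 0 contribute zero by convention; the distance is
        infinite if p0[k] == 0 anywhere that p1[k] > 0.
        """
        p1, p0 = np.asarray(p1, float), np.asarray(p0, float)
        support = p1 > 0
        if np.any(p0[support] == 0):
            return np.inf
        return float(np.sum(p1[support] * np.log(p1[support] / p0[support])))

    def chernoff_distance(p1, p0, num_s=1001):
        """Chernoff distance: max over 0 <= s <= 1 of -log(sum p0^(1-s) * p1^s).

        The one-dimensional maximization is done here by brute-force grid search.
        """
        p1, p0 = np.asarray(p1, float), np.asarray(p0, float)
        s_grid = np.linspace(0.0, 1.0, num_s)
        mgf = np.array([np.sum(p0 ** (1 - s) * p1 ** s) for s in s_grid])
        return float(np.max(-np.log(mgf)))

    # Two hypothetical probability mass functions on a three-letter alphabet
    p0 = np.array([0.5, 0.3, 0.2])
    p1 = np.array([0.1, 0.4, 0.5])

    print("D(p1||p0) =", kl_distance(p1, p0))
    print("D(p0||p1) =", kl_distance(p0, p1))        # generally a different value
    print("C(p1,p0)  =", chernoff_distance(p1, p0))
    print("C(p0,p1)  =", chernoff_distance(p0, p1))  # identical: C is symmetric

Note how the two Kullback-Leibler values differ while the two Chernoff values coincide, previewing the symmetry properties listed below.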
These distances are important special cases of the
Ali-Silvey distances and have the
following properties.
- D(p_1‖p_0) ≥ 0 and C(p_1,p_0) ≥ 0. Furthermore, these distances equal zero only when p_0(x) = p_1(x). The Kullback-Leibler and Chernoff distances are always non-negative, with zero distance occurring only when the probability distributions are the same.
- D(p_1‖p_0) = ∞ whenever, for some x, p_0(x) = 0 and p_1(x) ≠ 0. If p_1(x) = 0, the value of p_1(x) log[ p_1(x) / p_0(x) ] is defined to be zero.
- When the underlying stochastic quantities are random vectors having statistically independent components with respect to both p_0 and p_1, the Kullback-Leibler distance equals the sum of the component distances. Stated mathematically, if p_0(x) = ∏_l p_0(x_l) and p_1(x) = ∏_l p_1(x_l),

      D(p_1‖p_0) = Σ_l D(p_1(x_l)‖p_0(x_l)).

  The Chernoff distance does not have this property unless the components are identically distributed. (The sketch after this list checks the additivity property numerically.)
- D(p_1‖p_0) ≠ D(p_0‖p_1); C(p_1,p_0) = C(p_0,p_1). The Kullback-Leibler distance is usually not a symmetric quantity. In some special cases it can be symmetric (as in the Gaussian example just described), but symmetry cannot, and should not, be expected. The Chernoff distance is always symmetric.
- D(p_{X,Y}‖p_X p_Y) = I(X;Y). The Kullback-Leibler distance between a joint probability density and the product of the marginal distributions equals what is known in information theory as the mutual information between the random variables X and Y. From the properties of the Kullback-Leibler distance, we see that the mutual information equals zero only when the random variables are statistically independent.
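To make these properties concrete, here is a small, self-contained numerical check (again a sketch, not part of the original development) of the additivity, asymmetry, and mutual-information properties for hypothetical discrete distributions. It uses scipy.special.rel_entr for the elementwise p·log(p/q) terms, and the helper name kl and all of the example pmfs are my own choices.

    import numpy as np
    from scipy.special import rel_entr   # elementwise p*log(p/q), with 0*log(0/q) = 0

    def kl(p1, p0):
        """Kullback-Leibler distance D(p1 || p0) between discrete distributions."""
        return float(np.sum(rel_entr(np.asarray(p1, float), np.asarray(p0, float))))

    # Hypothetical two-component pmfs under each hypothesis; the components are
    # statistically independent, so each joint pmf is an outer product.
    p0_a, p0_b = np.array([0.5, 0.5]), np.array([0.7, 0.3])
    p1_a, p1_b = np.array([0.2, 0.8]), np.array([0.4, 0.6])
    p0_joint = np.outer(p0_a, p0_b).ravel()
    p1_joint = np.outer(p1_a, p1_b).ravel()

    # Additivity over independent components: the two printed values agree.
    print(kl(p1_joint, p0_joint), kl(p1_a, p0_a) + kl(p1_b, p0_b))

    # Asymmetry: reversing the arguments generally changes the value.
    print(kl(p1_joint, p0_joint), kl(p0_joint, p1_joint))

    # Mutual information as the Kullback-Leibler distance between a joint pmf
    # and the product of its marginals; positive here because X and Y are dependent.
    joint = np.array([[0.3, 0.2],
                      [0.1, 0.4]])
    p_x, p_y = joint.sum(axis=1), joint.sum(axis=0)
    print("I(X;Y) =", kl(joint.ravel(), np.outer(p_x, p_y).ravel()))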
These quantities are not actually distances. The Kullback-Leibler distance is not symmetric in its arguments, and the Chernoff distance does not obey the triangle inequality. Nonetheless, the name "distance" is frequently attached to them for reasons that will become clear later.