If H is the event a hypothetical condition exists and E is the event the evidence occurs, the probabilities available are usually P(H) (or an odds value), P(E|H), and P(E|Hc). What is desired is P(H|E). We simply use Bayes' rule to reverse the direction of conditioning. No conditional independence is involved. Suppose there are two “independent” bits of evidence. Obtaining this evidence may be “operationally” independent, but if the items both relate to the hypothesized condition, then they cannot be really independent. The condition assumed is usually that of conditional independence, given H, and similarly, given Hc. Several cases representative of practical problems are considered. These ideas are applied to a classification problem. A population consists of members of two subgroups. It is desired to formulate a battery of questions to aid in identifying the subclass membership of randomly selected individuals in the population. The questions are designed so that for each individual the answers are independent, in the sense that the answers to any subset of these questions are not affected by and do not affect the answers to any other subset of the questions. The answers are, however, affected by the subgroup membership. Thus, our treatment of conditional independence suggests that it is reasonable to suppose the answers are conditionally independent, given the subgroup membership. These results are used to determine which subclass is more likely.
Some patterns of probable inference
We are concerned with the likelihood of some hypothesized condition. In general, we have evidence for the condition which can never be absolutely certain. We are forced to assess probabilities (likelihoods) on the basis of the evidence. Some typical examples:
HYPOTHESIS | EVIDENCE
Job success | Personal traits
Presence of oil | Geological structures
Operation of a device | Physical condition
Market condition | Test market condition
Presence of a disease | Tests for symptoms
If H is the event the hypothetical condition exists and E is the event the evidence occurs, the probabilities available are usually P(H) (or an odds value), P(E|H), and P(E|Hc). What is desired is P(H|E) or, equivalently, the odds P(H|E)/P(Hc|E). We simply use Bayes' rule to reverse the direction of conditioning:

P(H|E)/P(Hc|E) = [P(E|H)/P(E|Hc)] · [P(H)/P(Hc)]

No conditional independence is involved in this case.
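The odds form of Bayes' rule above can be computed directly. The following is a minimal sketch (not part of the text; the function names are illustrative):

```python
# Posterior odds from prior odds and the two likelihoods, via Bayes' rule:
# P(H|E)/P(Hc|E) = [P(E|H)/P(E|Hc)] * [P(H)/P(Hc)]

def posterior_odds(prior_odds, p_e_given_h, p_e_given_hc):
    """Reverse the direction of conditioning with Bayes' rule."""
    return prior_odds * (p_e_given_h / p_e_given_hc)

def odds_to_probability(odds):
    """Convert odds in favor of H to P(H) = odds / (1 + odds)."""
    return odds / (1 + odds)

# Illustrative numbers: prior odds 2/1, P(E|H) = 0.95, P(E|Hc) = 0.10
odds = posterior_odds(2.0, 0.95, 0.10)
print(odds, odds_to_probability(odds))  # 19.0 and 0.95
```

Note that only the ratio P(E|H)/P(E|Hc), the likelihood ratio, matters; the individual likelihoods need not be known separately.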
Independent evidence for the hypothesized condition
Suppose there are two “independent” bits of evidence. Now obtaining this evidence may be “operationally” independent, but if the items both relate to the hypothesized condition, then they cannot be really independent. The condition assumed is usually of the form P(E1|H) = P(E1|H E2): if H occurs, then knowledge of E2 does not affect the likelihood of E1. Similarly, we usually have P(E1|Hc) = P(E1|Hc E2). Thus the pair {E1, E2} is conditionally independent, given H, and conditionally independent, given Hc.
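Under these two conditional independence assumptions, the likelihood ratio for the combined evidence factors into the product of the individual likelihood ratios. A sketch of that computation (assumed helper, not from the text):

```python
from math import prod

def combined_posterior_odds(prior_odds, likelihoods_h, likelihoods_hc):
    """Posterior odds P(H|E1...En)/P(Hc|E1...En), assuming the evidence
    events are conditionally independent given H and given Hc.

    likelihoods_h[i]  = P(Ei|H)
    likelihoods_hc[i] = P(Ei|Hc)
    """
    ratio = prod(p / q for p, q in zip(likelihoods_h, likelihoods_hc))
    return prior_odds * ratio
```

Each additional piece of conditionally independent evidence simply multiplies the running odds by its own likelihood ratio, which is why the update can be applied one item at a time.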
Independent medical tests
Suppose a doctor thinks the odds are 2/1 that a patient has a certain disease. She orders two independent tests. Let H be the event the patient has the disease, and E1 and E2 be the events the tests are positive. Suppose the first test has probability 0.1 of a false positive and probability 0.05 of a false negative. The second test has probabilities 0.05 and 0.08 of false positive and false negative, respectively. If both tests are positive, what is the posterior probability that the patient has the disease?
Solution
Assuming {E1, E2} is conditionally independent, given H, and conditionally independent, given Hc, we work first in terms of the odds, then convert to probability.

P(H|E1 E2)/P(Hc|E1 E2) = [P(H)/P(Hc)] · P(E1 E2|H)/P(E1 E2|Hc) = [P(H)/P(Hc)] · [P(E1|H) P(E2|H)]/[P(E1|Hc) P(E2|Hc)]

The data are

P(H)/P(Hc) = 2, P(E1|H) = 0.95, P(E2|H) = 0.92, P(E1|Hc) = 0.10, P(E2|Hc) = 0.05

Substituting values, we get

P(H|E1 E2)/P(Hc|E1 E2) = 2 · (0.95 · 0.92)/(0.10 · 0.05) = 1748/5 = 349.6, so that P(H|E1 E2) = 349.6/350.6 ≈ 0.9971
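A quick numerical check of this example (an illustrative sketch, not part of the text):

```python
# Prior odds 2/1; test 1: false positive 0.10, false negative 0.05;
# test 2: false positive 0.05, false negative 0.08; both tests positive.

prior_odds = 2.0
p_e1_h, p_e1_hc = 1 - 0.05, 0.10   # P(E1|H), P(E1|Hc)
p_e2_h, p_e2_hc = 1 - 0.08, 0.05   # P(E2|H), P(E2|Hc)

# Conditional independence given H and given Hc lets the ratios multiply.
odds = prior_odds * (p_e1_h * p_e2_h) / (p_e1_hc * p_e2_hc)
prob = odds / (1 + odds)
print(odds, prob)   # odds ≈ 349.6, probability ≈ 0.9971
```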