Once we have the table of joint probabilities, it is a simple matter to compute
the probabilities needed for the decision tree. The probability of a given test
result—say, Pr(G)—is found by adding across the appropriate row. In algebraic
terms,
Pr(G) = Pr(W&G) + Pr(D&G),     [13.2]

so that Pr(G) = .3 + .2 = .5. Note that a good test can occur when the site is
really wet and when the site is really dry.
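
As a quick check of Equation 13.2, the row sum can be reproduced in a few lines of Python. The dictionary below simply mirrors the entries of the joint probability table (the .4 entry for a dry site and bad test follows from Pr(B) = .5 and Pr(W&B) = .1); the variable names are ours and are illustrative only.

# Joint probabilities from the table: (site condition, test result) -> probability
joint = {
    ("W", "G"): 0.3,  # wet site, good test
    ("D", "G"): 0.2,  # dry site, good test
    ("W", "B"): 0.1,  # wet site, bad test
    ("D", "B"): 0.4,  # dry site, bad test (implied by Pr(B) = .5)
}

# Equation 13.2: Pr(G) is the sum across the "good test" row
pr_G = sum(p for (site, test), p in joint.items() if test == "G")
print(pr_G)  # 0.5
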
Next, we calculate revised probabilities. The chance that the site is wet
given a good seismic test is computed as
Pr(W|G) = Pr(W&G)/Pr(G),     [13.3]

so that Pr(W|G) = .3/.5 = .6. Similarly, we have Pr(W|B) = Pr(W&B)/Pr(B) =
.1/.5 = .2. Of course, these are precisely the answers we found earlier from the
joint probability table. But in this case, the partners did not begin with the table
in front of them; rather, they started with a prior probability, Pr(W), and with
information on the accuracy of the test, Pr(G|W) and Pr(B|D). From these facts,
they were able to calculate the necessary probabilities: Pr(G), Pr(W|G), and
Pr(W|B).
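
The same chain of calculations the partners carried out, from prior and test accuracy to revised probabilities, can be sketched in Python. The values Pr(G|W) = .75 and Pr(B|D) = 2/3 are not quoted in this passage; they are the accuracies implied by the joint table (.3/.4 and .4/.6), so treat this as an illustrative reconstruction rather than part of the text.

# Prior and test accuracies implied by the joint table:
# Pr(W) = .3 + .1 = .4, Pr(G|W) = .3/.4 = .75, Pr(B|D) = .4/.6 = 2/3
pr_W = 0.4
pr_G_given_W = 0.75
pr_B_given_D = 2 / 3

pr_D = 1 - pr_W
pr_G_given_D = 1 - pr_B_given_D

# Joint probabilities: Pr(W&G) = Pr(G|W)Pr(W), and so on
pr_WG = pr_G_given_W * pr_W            # 0.30
pr_DG = pr_G_given_D * pr_D            # 0.20
pr_WB = (1 - pr_G_given_W) * pr_W      # 0.10

# Equation 13.2: Pr(G) = Pr(W&G) + Pr(D&G); Pr(B) follows as the complement
pr_G = pr_WG + pr_DG                   # 0.50
pr_B = 1 - pr_G                        # 0.50

# Equation 13.3: revised probabilities
pr_W_given_G = pr_WG / pr_G            # 0.60
pr_W_given_B = pr_WB / pr_B            # 0.20
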
CHECK STATION 1
Suppose the partners face the same seismic test just discussed but are less optimistic
about the site; the prior probability now is Pr(W) = .28. Construct the joint probability
table, and compute Pr(W|G) and Pr(W|B).
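
If you want to verify your Check Station answers numerically, the mechanics above can be wrapped in a small Python function. The default test accuracies are the ones implied by the worked example, not values stated in the exercise; pass the new prior of .28 to obtain the revised probabilities.

def revised_probabilities(prior_wet, pr_G_given_W=0.75, pr_B_given_D=2/3):
    """Return (Pr(W|G), Pr(W|B)) for a given prior Pr(W).

    The default accuracies are those implied by the example above; they are
    assumptions of this sketch, not numbers quoted in the exercise.
    """
    prior_dry = 1 - prior_wet
    pr_WG = pr_G_given_W * prior_wet          # Pr(W&G)
    pr_WB = (1 - pr_G_given_W) * prior_wet    # Pr(W&B)
    pr_DG = (1 - pr_B_given_D) * prior_dry    # Pr(D&G)
    pr_DB = pr_B_given_D * prior_dry          # Pr(D&B)

    pr_G = pr_WG + pr_DG                      # Equation 13.2
    pr_B = pr_WB + pr_DB
    return pr_WG / pr_G, pr_WB / pr_B         # Equation 13.3

# Example: revised_probabilities(0.4) reproduces the .6 and .2 found above;
# revised_probabilities(0.28) handles the Check Station's less optimistic prior.
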
Bayes’ Theorem
With a little practice, the step-by-step mechanics of calculating revised proba-
bilities become routine. In fact, the sequence of steps can be condensed. For
example, if we replace Pr(W&G) in Equation 13.3 with the right-hand side of
Equation 13.1, we obtain
Pr(W|G) = [Pr(G|W)/Pr(G)]Pr(W).     [13.4]

This equation is the most common form of Bayes' theorem (named after
Reverend Thomas Bayes, who wrote an essay on the subject in 1763). Bayes’ the-