13.3 Intensional Approaches to Uncertainty
Note that when Pr(B) = 0, every number between 0 and 1 is a conditional
probability given B. The notation for a conditional probability is Pr(A|B).
The event B is the “condition” or “input,” while the event A is the “conse-
quence” or “output.” However, this terminology does not mean that B and
A have a cause-and-effect relationship.
Conditional probability is the most basic form of inference in probability
theory. If one knows that the event B has occurred, then the probability
of A changes from Pr(A) to Pr(A|B). If the probability of A does not
change, then one says that A and B are independent or statistically independent.
More generally, as one finds out more about what has happened, the
probability continually changes. Much more elaborate forms of stochastic
inference are developed in chapter 14.
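This updating can be illustrated by simulation. The sketch below uses a hypothetical fair-die example (the events A and B are illustrative assumptions, not from the text): it estimates how the probability of A shifts once B is known to have occurred, by keeping only the outcomes in which B happened.

```python
import random

# Hypothetical example (not from the text): repeated rolls of one fair die.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]

A = lambda w: w % 2 == 0   # event A: the roll is even
B = lambda w: w >= 4       # event B: the roll is at least 4

# Unconditional estimate of Pr(A).
pr_A = sum(A(w) for w in rolls) / len(rolls)

# Conditioning on B: keep only the outcomes where B occurred.
given_B = [w for w in rolls if B(w)]
pr_A_given_B = sum(A(w) for w in given_B) / len(given_B)

print(round(pr_A, 2))          # near 0.5
print(round(pr_A_given_B, 2))  # near 0.67: learning B changed the probability of A
```

Since Pr(A|B) differs from Pr(A) here, these particular events A and B are not independent.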
The defining formula for the conditional probability of A given B is
Pr(A and B) = Pr(A|B) Pr(B).
If Pr(B) is nonzero, one can solve for the conditional probability:
Pr(A|B) = Pr(A and B) / Pr(B).
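As a sketch, the solved formula can be evaluated exactly for a small finite sample space. The sample space and events below are hypothetical illustrative choices, not from the text.

```python
from fractions import Fraction

# Hypothetical example (not from the text): a single roll of a fair die.
omega = range(1, 7)                       # sample space {1, ..., 6}
A = {w for w in omega if w % 2 == 0}      # event A: the roll is even
B = {w for w in omega if w >= 4}          # event B: the roll is at least 4

def pr(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event), 6)

# Defining formula, solved for Pr(A | B) = Pr(A and B) / Pr(B).
pr_A_given_B = pr(A & B) / pr(B)

print(pr(A))          # Pr(A)     = 1/2
print(pr_A_given_B)   # Pr(A | B) = 2/3
```

Using exact fractions avoids floating-point rounding, so the computed value agrees exactly with the formula.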
This is sometimes used as the definition of the conditional probability. By
reversing the roles of A and B, the defining formula for the conditional prob-
ability of B given A is
Pr(A and B) = Pr(B|A) Pr(A).
The left-hand side of this equation is the same as the left-hand side of the
defining formula for the conditional probability of A given B. Therefore
Pr(A|B) Pr(B) = Pr(B|A) Pr(A).
If Pr(B) is nonzero, then one can solve for Pr(A|B) to obtain
Pr(A|B) = Pr(B|A) Pr(A) / Pr(B).
This is known as Bayes’ law. It is named after the English mathematician
Thomas Bayes, who proved a special case of it.
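As a sketch, Bayes’ law can be applied directly once Pr(B|A), Pr(A), and Pr(B) are known. The numerical values below are hypothetical, chosen only for illustration.

```python
from fractions import Fraction

# Hypothetical values for illustration (not from the text).
pr_A = Fraction(1, 100)         # Pr(A)
pr_B_given_A = Fraction(9, 10)  # Pr(B | A)
pr_B = Fraction(5, 100)         # Pr(B), assumed nonzero

# Bayes' law: Pr(A | B) = Pr(B | A) Pr(A) / Pr(B).
pr_A_given_B = pr_B_given_A * pr_A / pr_B

print(pr_A_given_B)  # 9/50
```

Note how the result 9/50 = 0.18 is far larger than the prior Pr(A) = 0.01: observing B substantially raises the probability of A.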
In spite of its simplicity, Bayes’ law is powerful. For example, suppose that
A is a disease and B is a symptom of the disease. Pr(A) is the probability of the
disease in the population and Pr(B) is the probability of the symptom. If we