Robert V. Hogg, Joseph W. McKean, Allen T. Craig


Chapter 11


Bayesian Statistics


11.1 Bayesian Procedures


To understand Bayesian inference, let us review Bayes' Theorem, (1.4.3), in a
situation in which we are trying to determine something about a parameter of a
distribution. Suppose we have a Poisson distribution with parameter $\theta > 0$, and we
know that the parameter is equal to either $\theta = 2$ or $\theta = 3$. In Bayesian inference, the
parameter is treated as a random variable $\Theta$. Suppose, for this example, we assign
subjective *prior* probabilities of $P(\Theta = 2) = \frac{1}{3}$ and $P(\Theta = 3) = \frac{2}{3}$ to the two
possible values. These subjective probabilities are based upon past experiences,
and it might be unrealistic that $\Theta$ can take only one of two values rather than any
$\theta > 0$ in a continuum (we address this immediately after this introductory illustration).
Now suppose a random sample of size $n = 2$ results in the observations $x_1 = 2$,
$x_2 = 4$. Given these data, what are the *posterior* probabilities of $\Theta = 2$ and
$\Theta = 3$? By Bayes' Theorem, we have


\[
P(\Theta = 2 \mid X_1 = 2, X_2 = 4)
= \frac{P(\Theta = 2 \text{ and } X_1 = 2, X_2 = 4)}
       {P(X_1 = 2, X_2 = 4 \mid \Theta = 2)P(\Theta = 2) + P(X_1 = 2, X_2 = 4 \mid \Theta = 3)P(\Theta = 3)}
\]
\[
= \frac{\left(\tfrac{1}{3}\right)\dfrac{e^{-2}2^{2}}{2!}\,\dfrac{e^{-2}2^{4}}{4!}}
       {\left(\tfrac{1}{3}\right)\dfrac{e^{-2}2^{2}}{2!}\,\dfrac{e^{-2}2^{4}}{4!}
      + \left(\tfrac{2}{3}\right)\dfrac{e^{-3}3^{2}}{2!}\,\dfrac{e^{-3}3^{4}}{4!}}
= 0.245.
\]

Similarly,


\[
P(\Theta = 3 \mid X_1 = 2, X_2 = 4) = 1 - 0.245 = 0.755.
\]
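The posterior calculation above is easy to check numerically. The following sketch computes the joint probability of the prior and the Poisson likelihood for each candidate value of $\Theta$, then normalizes; the function and variable names are illustrative, not from the text.

```python
import math

def poisson_pmf(x, theta):
    """Poisson probability mass function f(x | theta)."""
    return math.exp(-theta) * theta**x / math.factorial(x)

def likelihood(data, theta):
    """Joint pmf of an i.i.d. Poisson sample."""
    prod = 1.0
    for x in data:
        prod *= poisson_pmf(x, theta)
    return prod

data = [2, 4]                 # observations x1 = 2, x2 = 4
priors = {2: 1/3, 3: 2/3}     # P(Theta = 2) = 1/3, P(Theta = 3) = 2/3

# Bayes' Theorem: posterior is proportional to likelihood times prior
joint = {th: likelihood(data, th) * p for th, p in priors.items()}
total = sum(joint.values())
posterior = {th: j / total for th, j in joint.items()}

print(round(posterior[2], 3))  # 0.245
print(round(posterior[3], 3))  # 0.755
```

Because the normalizing constant cancels, only the products likelihood × prior matter, which is why Bayesian computations are often carried out up to proportionality.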

That is, with the observations $x_1 = 2$, $x_2 = 4$, the posterior probability of $\Theta = 2$
is smaller than the prior probability of $\Theta = 2$. Similarly, the posterior probability
of $\Theta = 3$ is greater than the corresponding prior probability. That is, the observations
$x_1 = 2$, $x_2 = 4$ favor $\Theta = 3$ more than $\Theta = 2$, which agrees
with our intuition since the sample mean is $\bar{x} = (2 + 4)/2 = 3$. Now let us address in general a more realistic situation
in which we place a prior pdf $h(\theta)$ on a support that is a continuum.
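As a preview of that continuous case, the same updating rule applies with the sum in the denominator replaced by an integral over the support of $h$; this is a sketch of the general form, and the symbol $k$ for the posterior pdf is our notation here:

```latex
k(\theta \mid x_1, \ldots, x_n)
  = \frac{h(\theta)\, \prod_{i=1}^{n} f(x_i \mid \theta)}
         {\int h(\theta')\, \prod_{i=1}^{n} f(x_i \mid \theta')\, d\theta'} ,
```

where $f(x \mid \theta)$ is the pmf or pdf of an observation given $\Theta = \theta$.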

