

*7.8 The Bayes Estimator


where

$$
f(x_1,\ldots,x_n\mid\theta)=\frac{1}{(2\pi)^{n/2}\sigma_0^{\,n}}\exp\left\{-\sum_{i=1}^{n}(x_i-\theta)^2/2\sigma_0^2\right\}
$$

$$
p(\theta)=\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left\{-(\theta-\mu)^2/2\sigma^2\right\}
$$

and

$$
f(x_1,\ldots,x_n)=\int_{-\infty}^{\infty}f(x_1,\ldots,x_n\mid\theta)\,p(\theta)\,d\theta
$$

With the help of a little algebra, it can now be shown that this conditional density is a
normal density with mean


$$
E[\theta\mid X_1,\ldots,X_n]=\frac{n\sigma^2}{n\sigma^2+\sigma_0^2}\,\overline{X}+\frac{\sigma_0^2}{n\sigma^2+\sigma_0^2}\,\mu \tag{7.8.3}
$$

$$
=\frac{n/\sigma_0^2}{n/\sigma_0^2+1/\sigma^2}\,\overline{X}+\frac{1/\sigma^2}{n/\sigma_0^2+1/\sigma^2}\,\mu
$$

and variance


$$
\operatorname{Var}(\theta\mid X_1,\ldots,X_n)=\frac{\sigma_0^2\,\sigma^2}{n\sigma^2+\sigma_0^2}
$$

Writing the Bayes estimator as we did in Equation 7.8.3 is informative, for it shows that it
is a weighted average of $\overline{X}$, the sample mean, and $\mu$, the *a priori* mean. In fact, the weights
given to these two quantities are in proportion to the inverses of $\sigma_0^2/n$ (the conditional
variance of the sample mean $\overline{X}$ given $\theta$) and $\sigma^2$ (the variance of the prior distribution). ■
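As a numerical illustration of Equation 7.8.3, the following sketch (with hypothetical values for $\mu$, $\sigma$, $\sigma_0$, and the data, not taken from the text) computes the Bayes estimator both from the closed-form weighted average and by directly integrating $\theta\,f(x_1,\ldots,x_n\mid\theta)\,p(\theta)$; the two agree up to numerical error.

```python
import numpy as np
from scipy import stats, integrate

# Hypothetical values: each X_i | theta ~ N(theta, sigma0^2), prior theta ~ N(mu, sigma^2)
sigma0 = 2.0           # known standard deviation of each observation given theta
mu, sigma = 10.0, 3.0  # prior mean and prior standard deviation for theta
x = np.array([11.2, 9.8, 12.5, 10.4, 11.9])  # made-up sample
n, xbar = len(x), x.mean()

# Closed-form posterior mean (Equation 7.8.3) and posterior variance
w = n * sigma**2 / (n * sigma**2 + sigma0**2)
post_mean = w * xbar + (1 - w) * mu
post_var = sigma0**2 * sigma**2 / (n * sigma**2 + sigma0**2)

# Numerical check: E[theta | x] = ∫ theta f(x|theta) p(theta) dtheta / ∫ f(x|theta) p(theta) dtheta
def joint(theta):
    return stats.norm.pdf(x, loc=theta, scale=sigma0).prod() * stats.norm.pdf(theta, mu, sigma)

norm_const, _ = integrate.quad(joint, -np.inf, np.inf)
num_mean, _ = integrate.quad(lambda t: t * joint(t), -np.inf, np.inf)

print("closed form:", post_mean, post_var)
print("numerical  :", num_mean / norm_const)
```

The weight $w$ makes the shrinkage explicit: as $n$ grows, $w$ tends to 1 and the Bayes estimator approaches the sample mean, while for small samples it is pulled toward the prior mean $\mu$.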


REMARK: ON CHOOSING A NORMAL PRIOR


As illustrated by Example 7.8b, it is computationally very convenient to choose a normal
prior for the unknown mean θ of a normal distribution — for then the Bayes estimator
is simply given by Equation 7.8.3. This raises the question of how one should go about
determining whether there is a normal prior that reasonably represents one’s prior feelings
about the unknown mean.
To begin, it seems reasonable to determine the value — call it μ — that you *a priori*
feel is most likely to be near θ. That is, we start with the mode (which equals the mean
when the distribution is normal) of the prior distribution. We should then try to ascertain
whether or not we believe that the prior distribution is symmetric about μ. That is, for
each a > 0, do we believe that it is just as likely that θ will lie between μ − a and μ as it is
to lie between μ and μ + a?