Introduction to Probability and Statistics for Engineers and Scientists


*7.8 The Bayes Estimator


continuous case — that a data value is equal to x when θ is the value of the parameter.
If the observed data values are Xi = xi, i = 1, ..., n, then the updated, or conditional,
probability density function of θ is as follows:


f(\theta \mid x_1, \ldots, x_n) = \frac{f(\theta, x_1, \ldots, x_n)}{f(x_1, \ldots, x_n)}
                               = \frac{f(x_1, \ldots, x_n \mid \theta)\, p(\theta)}{\int f(x_1, \ldots, x_n \mid \theta)\, p(\theta)\, d\theta}

The conditional density function f(θ | x_1, ..., x_n) is called the posterior density function.
(Thus, before observing the data, one’s feelings about θ are expressed in terms of the
prior distribution, whereas once the data are observed, this prior distribution is updated
to yield the posterior distribution.)
Now we have shown that whenever we are given the probability distribution of a random
variable, the best estimate of the value of that random variable, in the sense of minimizing
the expected squared error, is its mean. Therefore, it follows that the best estimate of
θ, given the data values Xi = xi, i = 1, ..., n, is the mean of the posterior distribution
f(θ | x_1, ..., x_n). This estimator, called the Bayes estimator, is written as E[θ | X_1, ..., X_n].
That is, if Xi = xi, i = 1, ..., n, then the value of the Bayes estimator is


E[\theta \mid X_1 = x_1, \ldots, X_n = x_n] = \int \theta\, f(\theta \mid x_1, \ldots, x_n)\, d\theta

EXAMPLE 7.8a Suppose that X_1, ..., X_n are independent Bernoulli random variables, each
having probability mass function given by

f(x \mid \theta) = \theta^x (1-\theta)^{1-x}, \qquad x = 0, 1

where θ is unknown. Further, suppose that θ is chosen from a uniform distribution on
(0, 1). Compute the Bayes estimator of θ.


SOLUTION We must compute E[θ | X_1, ..., X_n]. Since the prior density of θ is the uniform
density

p(\theta) = 1, \qquad 0 < \theta < 1

we have that the conditional density of θ given X_1, ..., X_n is given by


f(\theta \mid x_1, \ldots, x_n) = \frac{f(x_1, \ldots, x_n, \theta)}{f(x_1, \ldots, x_n)}
                               = \frac{f(x_1, \ldots, x_n \mid \theta)\, p(\theta)}{\int_0^1 f(x_1, \ldots, x_n \mid \theta)\, p(\theta)\, d\theta}
                               = \frac{\theta^{\sum_1^n x_i} (1-\theta)^{\,n - \sum_1^n x_i}}{\int_0^1 \theta^{\sum_1^n x_i} (1-\theta)^{\,n - \sum_1^n x_i}\, d\theta}
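The denominator can be evaluated in closed form. Writing s = Σ_{i=1}^n x_i and using the standard beta-integral identity (a standard fact, supplied here to close out the computation):

\int_0^1 \theta^{a} (1-\theta)^{b}\, d\theta = \frac{a!\, b!}{(a+b+1)!}, \qquad a, b \text{ nonnegative integers},

the posterior is a beta density, and taking its mean gives the Bayes estimator:

E[\theta \mid X_1 = x_1, \ldots, X_n = x_n]
  = \frac{\int_0^1 \theta^{\,s+1} (1-\theta)^{\,n-s}\, d\theta}{\int_0^1 \theta^{\,s} (1-\theta)^{\,n-s}\, d\theta}
  = \frac{\sum_{i=1}^n x_i + 1}{n + 2}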