Robert V. Hogg, Joseph W. McKean, Allen T. Craig

684 Bayesian Statistics

We can use the solution of this last example to obtain the empirical Bayes
estimate for Example 11.4.2 also, for in this earlier example the sample size is 1.
Thus, the empirical Bayes estimate for λ is x. In particular, for the numerical case
given at the end of Example 11.4.2, the empirical Bayes estimate has the value 6.


EXERCISES


11.4.1. Consider the Bayes model

        X_i | θ ∼ iid Γ(1, 1/θ)
        Θ | β ∼ Γ(2, β).

By performing the following steps, obtain the empirical Bayes estimate of θ.


(a) Obtain the likelihood function

        m(x|β) = ∫₀^∞ f(x|θ) h(θ|β) dθ.

(b) Obtain the mle β̂ of β for the likelihood m(x|β).

(c) Show that the posterior distribution of Θ given x and β̂ is a gamma
distribution.

(d) Assuming squared-error loss, obtain the empirical Bayes estimator.
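The steps of this exercise can be checked symbolically. The sketch below (not part of the text, and using sympy as an assumed tool) computes the marginal likelihood of part (a), solves the likelihood equation for β̂ in part (b), and then forms the posterior mean of Θ given x and β̂, which is the empirical Bayes estimate under squared-error loss in part (d):

```python
import sympy as sp

x, beta, theta = sp.symbols('x beta theta', positive=True)

# f(x|theta): Gamma(1, 1/theta) density, i.e., exponential with rate theta
f = theta * sp.exp(-theta * x)
# h(theta|beta): Gamma(2, beta) density (shape 2, scale beta)
h = theta * sp.exp(-theta / beta) / beta**2

# (a) marginal likelihood m(x|beta) = integral of f*h over theta
m = sp.simplify(sp.integrate(f * h, (theta, 0, sp.oo)))

# (b) mle of beta: root of d/dbeta log m(x|beta)
beta_hat = sp.solve(sp.diff(sp.log(m), beta), beta)[0]

# (c)-(d) the posterior of Theta given x and beta_hat is a gamma
# distribution; its mean is the empirical Bayes estimate
post_kernel = (f * h).subs(beta, beta_hat)
est = sp.simplify(sp.integrate(theta * post_kernel, (theta, 0, sp.oo))
                  / sp.integrate(post_kernel, (theta, 0, sp.oo)))

print(m, beta_hat, est)
```

Printing `m`, `beta_hat`, and `est` lets you compare each intermediate result against your hand derivation step by step.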

11.4.2. Consider the hierarchical Bayes model

        Y ∼ b(n, p),  0 < p < 1
        p | θ ∼ h(p|θ) = θp^(θ−1),  θ > 0
        θ ∼ Γ(1, a),  a > 0 is specified.                          (11.4.17)

(a) Assuming squared-error loss, write the Bayes estimate of p as in expression
(11.4.3). Integrate relative to θ first. Show that both the numerator and
denominator are expectations of a beta distribution with parameters y + 1
and n − y + 1.

(b) Recall the discussion around expression (11.3.2). Write an explicit Monte
Carlo algorithm to obtain the Bayes estimate in part (a).
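One possible Monte Carlo algorithm along the lines of part (b) is sketched below. It assumes that integrating θ out first leaves the weight w(p) = (1/a − log p)^(−2) inside both integrals, so that the Bayes estimate becomes a ratio of beta(y + 1, n − y + 1) expectations; that weight is my own derivation and should be checked against your answer to part (a).

```python
import math
import random

def bayes_est_mc(y, n, a, m=100_000, seed=0):
    """Monte Carlo estimate of the Bayes estimate E[p | y] for model (11.4.17).

    Draws p_i from beta(y + 1, n - y + 1) and returns
    mean(w(p_i)) / mean(w(p_i) / p_i), where w(p) = (1/a - log p)^(-2)
    is the (assumed) weight left after integrating theta out.
    """
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(m):
        p = rng.betavariate(y + 1, n - y + 1)
        w = (1.0 / a - math.log(p)) ** -2   # note: log p < 0, so this is finite
        num += w
        den += w / p
    return num / den
```

A quick sanity check is that the estimate should lie in (0, 1) and increase with y for fixed n and a.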

11.4.3. Reconsider the hierarchical Bayes model (11.4.17) of Exercise 11.4.2.

(a) Show that the conditional pdf g(p|y, θ) is the pdf of a beta distribution with
parameters y + θ and n − y + 1.

(b) Show that the conditional pdf g(θ|y, p) is the pdf of a gamma distribution
with parameters 2 and [1/a − log p]^(−1).
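The two conditional pdfs in this exercise are exactly the ingredients a Gibbs sampler needs. A minimal sketch, assuming the beta and gamma forms claimed in parts (a) and (b), alternately draws p and θ and averages the retained p draws to approximate the Bayes estimate E[p | y]:

```python
import math
import random

def gibbs_est(y, n, a, iters=6000, burn=1000, seed=0):
    """Gibbs sampler for model (11.4.17), using the conditionals of Exercise 11.4.3:
        p | y, theta ~ beta(y + theta, n - y + 1)
        theta | y, p ~ gamma(shape 2, scale [1/a - log p]^(-1))
    Returns the post-burn-in average of the p draws, an estimate of E[p | y].
    """
    rng = random.Random(seed)
    theta = 1.0                     # arbitrary starting value
    total = count = 0
    for t in range(iters):
        p = rng.betavariate(y + theta, n - y + 1)
        theta = rng.gammavariate(2.0, 1.0 / (1.0 / a - math.log(p)))
        if t >= burn:
            total += p
            count += 1
    return total / count
```

Since log p < 0, the gamma scale 1/(1/a − log p) is always positive, so every full conditional draw is well defined.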