11.4. Modern Bayesian Methods

(c) Using parts (a) and (b) and assuming squared-error loss, write the Gibbs
sampler algorithm to obtain the Bayes estimator of p.
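A minimal sketch of one such Gibbs sampler, assuming the Exercise 11.4.2 model Y | p ∼ b(n, p), p | θ ∼ θp^{θ−1}, θ ∼ Γ(1, a) (shape/scale); the full conditionals in the comments are derived here from that joint density, and the data value y = 14 and the starting value for θ are hypothetical.

import numpy as np

rng = np.random.default_rng(20250101)

def gibbs_bayes_estimate(y, n, a, m, n_star):
    """Gibbs sampler for Y|p ~ b(n,p), p|theta ~ theta*p^(theta-1),
    theta ~ Gamma(1, a) (shape/scale).  The full conditionals work out to
        p | theta, y  ~ Beta(y + theta, n - y + 1)
        theta | p, y  ~ Gamma(shape = 2, scale = 1 / (1/a - log p)).
    Under squared-error loss the Bayes estimate of p is the posterior mean,
    estimated by averaging the p-draws after a burn-in of length m."""
    theta = 1.0                      # arbitrary starting value
    p_draws = np.empty(n_star)
    for i in range(n_star):
        p = rng.beta(y + theta, n - y + 1)
        theta = rng.gamma(2.0, 1.0 / (1.0 / a - np.log(p)))
        p_draws[i] = p
    return p_draws[m:].mean(), p_draws

# hypothetical data: y = 14 successes in n = 50 trials, a = 2
estimate, _ = gibbs_bayes_estimate(y=14, n=50, a=2.0, m=3000, n_star=6000)
print(estimate)

Note that the conditional for θ is a gamma with shape 2 and rate 1/a − log p, which is proper because log p < 0 for 0 < p < 1.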

11.4.4. For the hierarchical Bayes model of Exercise 11.4.2, set n = 50 and a = 2.
Now, draw a θ at random from a Γ(1, 2) distribution and label it θ∗. Next, draw a
p at random from the distribution with pdf θ∗p^{θ∗−1} and label it p∗. Finally, draw
a y at random from a b(n, p∗) distribution.

(a) Setting m at 3000, obtain an estimate of p∗ using your Monte Carlo algorithm
of Exercise 11.4.2.

(b) Setting m at 3000 and n∗ at 6000, obtain an estimate of p∗ using your Gibbs
sampler algorithm of Exercise 11.4.3. Let p_{3001}, p_{3002}, ..., p_{6000} denote the
stream of values drawn. Recall that these values are (asymptotically) simulated
values from the posterior pdf g(p|y). Use this stream of values to obtain
a 95% credible interval for p.
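A simulation sketch for this exercise, assuming the Γ(1, 2) draw is in the shape/scale parameterization and using the fact that the pdf θ∗p^{θ∗−1} on (0, 1) is the Beta(θ∗, 1) density. Part (a) is shown as a plain prior-sampling ratio estimate of E[p | y], which may be organized differently from the algorithm you wrote for Exercise 11.4.2; part (b) reuses gibbs_bayes_estimate from the sketch above.

import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(114)
n, a, m, n_star = 50, 2.0, 3000, 6000

# generate the "true" values and the observed y
theta_star = rng.gamma(1.0, a)       # theta* ~ Gamma(1, 2), shape/scale
p_star = rng.beta(theta_star, 1.0)   # pdf theta* p^(theta* - 1) is Beta(theta*, 1)
y = rng.binomial(n, p_star)

# (a) Monte Carlo ratio estimate of E[p | y] from m prior draws
thetas = rng.gamma(1.0, a, size=m)
ps = rng.beta(thetas, 1.0)
w = binom.pmf(y, n, ps)              # likelihood weights f(y | p_i)
mc_estimate = np.sum(w * ps) / np.sum(w)

# (b) Gibbs estimate and 95% credible interval from p_3001, ..., p_6000
gibbs_estimate, p_draws = gibbs_bayes_estimate(y, n, a, m, n_star)
ci = np.percentile(p_draws[m:], [2.5, 97.5])

print(p_star, mc_estimate, gibbs_estimate, ci)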

11.4.5. Write the Bayes model of Exercise 11.4.2 as

Y ∼ b(n, p), 0 < p < 1,
p | θ ∼ h(p|θ) = θp^{θ−1}, θ > 0.

Set up the estimating equations for the mle of θ based on the marginal pdf g(y|θ); that is,
the first step to obtain the empirical Bayes estimator of p. Simplify as much as possible.
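One way to organize the numerical side of this exercise: integrating p out of f(y|p)h(p|θ) gives g(y|θ) proportional (in θ) to θB(y + θ, n − y + 1), so the score equation is 1/θ + ψ(y + θ) − ψ(n + θ + 1) = 0, where ψ is the digamma function. This simplification and the sketch below are derived here, not taken from the text, and the data values y = 14, n = 50 are hypothetical.

from scipy.optimize import brentq
from scipy.special import digamma

def eb_estimate(y, n):
    # score equation for theta from the marginal g(y | theta)
    score = lambda t: 1.0 / t + digamma(y + t) - digamma(n + t + 1)
    theta_hat = brentq(score, 1e-6, 1e6)           # mle of theta
    # posterior of p given theta_hat is Beta(y + theta_hat, n - y + 1);
    # its mean is the empirical Bayes estimate of p under squared-error loss
    return (y + theta_hat) / (n + theta_hat + 1)

print(eb_estimate(y=14, n=50))                     # hypothetical data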

11.4.6. Example 11.4.1 dealt with a hierarchical Bayes model for a conjugate family
of normal distributions. Express that model as

X | Θ ∼ N(θ, σ^2/n), σ^2 known,
Θ | τ^2 ∼ N(0, τ^2).

Obtain the empirical Bayes estimator of θ.
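A sketch of the calculation, derived here rather than taken from the text: the marginal of X is N(0, τ^2 + σ^2/n), its mle from the single observation x is τ̂^2 = max(0, x^2 − σ^2/n), and substituting τ̂^2 into the posterior mean of Θ gives the empirical Bayes estimate. The numerical values in the call are hypothetical.

def eb_normal(x, sigma2, n):
    # x | theta ~ N(theta, sigma2/n),  theta | tau2 ~ N(0, tau2)
    v = sigma2 / n                     # sampling variance of x
    tau2_hat = max(0.0, x ** 2 - v)    # mle of tau^2 from the N(0, tau^2 + v) marginal
    return tau2_hat / (tau2_hat + v) * x   # posterior mean of theta at tau2 = tau2_hat

print(eb_normal(x=1.3, sigma2=4.0, n=25))  # hypothetical values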
