
11.3.3. Consider Example 11.3.1.

(a) Show that E(X) = 1.5.

(b) Obtain the inverse of the cdf of X and use it to show how to generate X directly.
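
Part (b) asks for direct generation by the inverse-transform method. The R lines below are a minimal sketch of that method only; Finv is a hypothetical placeholder (the cdf of Example 11.3.1 is not reproduced here) and should be replaced by the inverse cdf derived in part (b).

## Inverse-transform sketch: X = F^{-1}(U) with U ~ uniform(0, 1).
## Finv is a placeholder (exponential quantile function); substitute
## the inverse cdf obtained in part (b).
Finv <- function(u) qexp(u, rate = 1)
gen_x <- function(nsim) {
  u <- runif(nsim)   # uniform(0, 1) variates
  Finv(u)            # X = F^{-1}(U) has cdf F
}
xs <- gen_x(10000)
mean(xs)             # Monte Carlo estimate of E(X)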

11.3.4. Obtain another 10,000 simulations similar to those discussed at the end of Example 11.3.1. Use your simulations to obtain a confidence interval for E(X).
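
A large-sample confidence interval for E(X) can be based on the sample mean plus or minus z times the standard error. The R sketch below assumes the simulated values are already stored in xs (for instance, generated as in Exercise 11.3.3); the 95% level is an arbitrary choice for illustration.

## Large-sample 95% confidence interval for E(X) from the simulations in xs.
xbar <- mean(xs)
se   <- sd(xs) / sqrt(length(xs))
xbar + c(-1, 1) * qnorm(0.975) * se   # approximate 95% CI for E(X)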


11.3.5. Consider Example 11.3.2.


(a) Show that the function given in expression (11.3.10) is a joint, mixed discrete-continuous pdf.

(b) Show that the random variable Y has a Γ(α, 1) distribution.

(c) Show that the random variable X has a negative binomial distribution with pmf
\[
p(x) =
\begin{cases}
\dfrac{(\alpha + x - 1)!}{x!\,(\alpha - 1)!}\, 2^{-(\alpha + x)}, & x = 0, 1, 2, \ldots \\[1ex]
0, & \text{elsewhere.}
\end{cases}
\]

(d) Show that E(X) = α.
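
For parts (c) and (d), a quick numerical check can be reassuring. The R sketch below evaluates the pmf of part (c) on the log scale for an arbitrarily chosen α = 10 and an arbitrary truncation point, confirming that it sums to approximately 1 with mean close to α; the comparison with dnbinom is an added observation, not part of the exercise.

## Numerical check of the pmf in part (c); alpha = 10 and the truncation
## at x = 200 are arbitrary choices for illustration.
alpha <- 10
x  <- 0:200
px <- exp(lgamma(alpha + x) - lfactorial(x) - lgamma(alpha) - (alpha + x) * log(2))
sum(px)                                               # should be close to 1
sum(x * px)                                           # should be close to alpha (part (d))
all.equal(px, dnbinom(x, size = alpha, prob = 0.5))   # same pmf as R's negative binomial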

11.3.6. Write an R function (or use gibbser2.s) for the Gibbs sampler discussed in Example 11.3.2. Run your function for α = 10, m = 3000, and n = 6000. Compare your results with those of the authors tabled in the example.
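
A minimal stand-alone R sketch of such a sampler is given below. It assumes the joint pdf (11.3.10) of Example 11.3.2 has conditionals X | Y = y distributed Poisson(y) and Y | X = x distributed Γ(α + x, rate 2), consistent with parts (b)-(d) of Exercise 11.3.5, and it treats m as the burn-in and n as the total chain length; verify both assumptions against the example before comparing results.

## Gibbs sampler sketch for Example 11.3.2 under the assumed conditionals
## X | y ~ Poisson(y) and Y | x ~ Gamma(alpha + x, rate = 2).
gibbs_ex1132 <- function(alpha, m, n) {
  x <- numeric(n); y <- numeric(n)
  y[1] <- alpha                      # arbitrary starting value
  x[1] <- rpois(1, y[1])
  for (i in 2:n) {
    y[i] <- rgamma(1, shape = alpha + x[i - 1], rate = 2)
    x[i] <- rpois(1, y[i])
  }
  keep <- (m + 1):n                  # discard the first m draws as burn-in
  list(x = x[keep], y = y[keep])
}
out <- gibbs_ex1132(alpha = 10, m = 3000, n = 6000)
mean(out$x)   # compare with E(X) = alpha (Exercise 11.3.5(d))
mean(out$y)   # compare with E(Y) = alpha, since Y ~ Gamma(alpha, 1)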


11.3.7. Consider the following mixed discrete-continuous pdf for a random vector (X, Y) (discussed in Casella and George, 1992):
\[
f(x, y) \propto
\begin{cases}
\dbinom{n}{x}\, y^{x+\alpha-1} (1 - y)^{n-x+\beta-1}, & x = 0, 1, \ldots, n,\; 0 < y < 1 \\[1ex]
0, & \text{elsewhere,}
\end{cases}
\]
for α > 0 and β > 0.

(a) Show that this function is indeed a joint, mixed discrete-continuous pdf by finding the proper constant of proportionality.

(b) Determine the conditional pdfs f(x|y) and f(y|x).

(c) Write the Gibbs sampler algorithm to generate random samples on X and Y.

(d) Determine the marginal distributions of X and Y.

11.3.8. Write an R function for the Gibbs sampler of Exercise 11.3.7. Run your program for α = 10, β = 4, m = 3000, and n = 6000. Obtain estimates (and confidence intervals) of E(X) and E(Y) and compare them with the true parameters.
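
The joint pdf of Exercise 11.3.7 yields the conditionals X | Y = y distributed b(n, y) and Y | X = x distributed beta(x + α, n − x + β), so the sampler alternates between them. The R sketch below is a minimal version under those conditionals; the trial count n_trials = 16 is an arbitrary illustration (the exercise leaves n unspecified), and m is again treated as the burn-in with n the total chain length.

## Gibbs sampler sketch for Exercises 11.3.7 and 11.3.8:
## X | y ~ binomial(n_trials, y), Y | x ~ beta(x + alpha, n_trials - x + beta).
gibbs_betabin <- function(alpha, beta, n_trials, m, n) {
  x <- numeric(n); y <- numeric(n)
  y[1] <- 0.5                                # arbitrary starting value
  x[1] <- rbinom(1, n_trials, y[1])
  for (i in 2:n) {
    y[i] <- rbeta(1, x[i - 1] + alpha, n_trials - x[i - 1] + beta)
    x[i] <- rbinom(1, n_trials, y[i])
  }
  keep <- (m + 1):n                          # discard the burn-in draws
  list(x = x[keep], y = y[keep])
}
out <- gibbs_betabin(alpha = 10, beta = 4, n_trials = 16, m = 3000, n = 6000)
mean(out$x)   # true E(X) = n_trials * alpha / (alpha + beta)
mean(out$y)   # true E(Y) = alpha / (alpha + beta)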
