Robert V. Hogg, Joseph W. McKean, Allen T. Craig

440 Sufficiency

(a) Show that Y1 = X1 + X2 + ··· + Xn is a complete sufficient statistic for θ.

(b) Find the function φ(Y1) that is the MVUE of θ.

(c) Let Y2 = (X1 + X2)/2 and compute E(Y2).

(d) Determine E(Y2 | Y1 = y1).

7.5.12. Let X1, X2, ..., Xn be a random sample from a distribution with pmf
p(x; θ) = θ^x(1 − θ), x = 0, 1, 2, ..., zero elsewhere, where 0 ≤ θ ≤ 1.

(a) Find the mle, θ̂, of θ.

(b) Show that ∑ⁿ₁ Xi is a complete sufficient statistic for θ.

(c) Determine the MVUE of θ.
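Part (a) of Exercise 7.5.12 can be sanity-checked numerically without doing the calculus. The sketch below is my own code, not part of the text (the seed, sample size, and grid resolution are arbitrary choices): it simulates data from the pmf p(x; θ) = θ^x(1 − θ) and maximizes the log-likelihood over a grid.

```python
import math
import random

random.seed(7)
theta_true = 0.4
n = 200

# p(x; theta) = theta^x (1 - theta) counts "failures" before the first
# "success", where each failure occurs with probability theta.
sample = []
for _ in range(n):
    x = 0
    while random.random() < theta_true:
        x += 1
    sample.append(x)

s = sum(sample)

def loglik(theta):
    # log of prod theta^{x_i} (1 - theta) = s*log(theta) + n*log(1 - theta)
    return s * math.log(theta) + n * math.log(1 - theta)

# crude grid search for the maximizing theta
grid = [k / 1000 for k in range(1, 1000)]
theta_hat = max(grid, key=loglik)
```

With a moderate sample size, theta_hat should land near theta_true; comparing it against the closed form derived in part (a) is a useful check on the algebra.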

7.6 Functions of a Parameter

Up to this point we have sought an MVUE of a parameter θ. Often, however, we
are interested not in θ itself but in a function of θ. There are several techniques
we can use to find the MVUE. One is by inspection of the expected value of a
sufficient statistic. This is how we found the MVUEs in Examples 7.5.2 and 7.5.3
of the last section. In this section and its exercises, we offer more examples of the
inspection technique. The second technique is based on the conditional expectation
of an unbiased estimate given a sufficient statistic. The third example illustrates
this technique.
Recall that in Chapter 6 under regularity conditions, we obtained the asymptotic
distribution theory for maximum likelihood estimators (mles). This allows certain
asymptotic inferences (confidence intervals and tests) for these estimators. Such
a straightforward theory is not available for MVUEs. As Theorem 7.3.2 shows,
though, sometimes we can determine the relationship between the mle and the
MVUE. In these situations, we can often obtain the asymptotic distribution for the
MVUE based on the asymptotic distribution of the mle. Also, as we discuss in
Section 7.6.1, we can usually make use of the bootstrap to obtain standard errors
for MVUE estimates. We illustrate this for some of the following examples.
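The bootstrap idea mentioned above can be sketched in a few lines. This is my own illustration, not the book's code (the Bernoulli setup anticipates Example 7.6.1, and all names and constants are mine): resample the data with replacement, recompute the estimate on each resample, and take the standard deviation of the resampled estimates as the standard error.

```python
import random

random.seed(1)
theta = 0.3
n = 50
sample = [1 if random.random() < theta else 0 for _ in range(n)]

def mvue(xs):
    # Y/n, the MVUE of theta for b(1, theta) observations
    return sum(xs) / len(xs)

B = 2000  # number of bootstrap resamples
boot = []
for _ in range(B):
    resample = [random.choice(sample) for _ in range(n)]
    boot.append(mvue(resample))

mean_b = sum(boot) / B
se_boot = (sum((e - mean_b) ** 2 for e in boot) / (B - 1)) ** 0.5

# For this simple estimator a closed-form standard error exists,
# sqrt(p(1 - p)/n), so we can compare:
p = mvue(sample)
se_plugin = (p * (1 - p) / n) ** 0.5
```

Here the two standard errors agree closely; the bootstrap earns its keep for MVUEs whose sampling distributions are less tractable.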

Example 7.6.1. Let X1, X2, ..., Xn denote the observations of a random sample
of size n > 1 from a distribution that is b(1, θ), 0 < θ < 1. We know that if
Y = ∑ⁿ₁ Xi, then Y/n is the unique minimum variance unbiased estimator of θ.
Now suppose we want to estimate the variance of Y/n, which is θ(1 − θ)/n. Let
δ = θ(1 − θ). Because Y is a sufficient statistic for θ, it is known that we can restrict
our search to functions of Y. The maximum likelihood estimate of δ, which is given
by δ̃ = (Y/n)(1 − Y/n), is a function of the sufficient statistic and seems to be a
reasonable starting point. The expectation of this statistic is given by


E[δ̃] = E[(Y/n)(1 − Y/n)] = (1/n)E(Y) − (1/n²)E(Y²).
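Since E(Y) = nθ and E(Y²) = nθ(1 − θ) + n²θ² when Y is b(n, θ), the right-hand side works out to θ − θ(1 − θ)/n − θ² = ((n − 1)/n) θ(1 − θ), so δ̃ is biased downward by the factor (n − 1)/n. A quick Monte Carlo check (my own sketch; the parameter values are arbitrary):

```python
import random

random.seed(2)
theta = 0.4
n = 10
delta = theta * (1 - theta)  # the target parameter delta = theta(1 - theta)

reps = 200_000
total = 0.0
for _ in range(reps):
    y = sum(1 for _ in range(n) if random.random() < theta)  # Y ~ b(n, theta)
    total += (y / n) * (1 - y / n)  # delta_tilde for this replicate

mc_mean = total / reps               # Monte Carlo estimate of E[delta_tilde]
expected = (n - 1) / n * delta       # the moment calculation above
```

The simulated mean of δ̃ matches ((n − 1)/n) θ(1 − θ) rather than θ(1 − θ), confirming the bias.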