Robert V. Hogg, Joseph W. McKean, Allen T. Craig

Sufficiency

Example 7.4.1. Consider the family of pdfs {h(z; θ) : 0 < θ < ∞}. Suppose Z has a pdf in this family given by
\[
h(z;\theta) =
\begin{cases}
\dfrac{1}{\theta}\, e^{-z/\theta} & 0 < z < \infty \\
0 & \text{elsewhere.}
\end{cases}
\]

Let us say that E[u(Z)] = 0 for every θ > 0. That is,
\[
\frac{1}{\theta} \int_0^{\infty} u(z)\, e^{-z/\theta}\, dz = 0, \qquad \theta > 0.
\]

Readers acquainted with the theory of transformations recognize the integral in the left-hand member as being essentially the Laplace transform of u(z). In that theory we learn that the only function u(z) transforming to a function of θ that is identically equal to zero is u(z) = 0, except (in our terminology) on a set of points that has probability zero for each h(z; θ), θ > 0. That is, the family {h(z; θ) : 0 < θ < ∞} is complete.
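As an illustrative numerical sketch (not from the text), the force of completeness comes from requiring E[u(Z)] = 0 for *every* θ > 0, not just one value. Taking the hypothetical choice u(z) = z − 1, the exact expectation is E[u(Z)] = θ − 1 (since E[Z] = θ here), which vanishes only at θ = 1; approximating the integral confirms this:

```python
import math

def expect_u(theta, u, upper=200.0, n=100_000):
    """Approximate E[u(Z)] = (1/theta) * integral_0^inf u(z) e^(-z/theta) dz
    for the exponential pdf h(z; theta), via a midpoint Riemann sum on [0, upper].
    The tail beyond `upper` is negligible for the thetas used below."""
    h = upper / n
    total = 0.0
    for i in range(n):
        z = (i + 0.5) * h
        total += u(z) * math.exp(-z / theta)
    return total * h / theta

# Hypothetical test function u(z) = z - 1; analytically E[u(Z)] = theta - 1.
u = lambda z: z - 1.0
for theta in (0.5, 1.0, 2.0):
    print(f"theta = {theta}:  E[u(Z)] ~ {expect_u(theta, u):.4f}")
```

Because no single nonzero u(z) can make this expectation vanish for all θ simultaneously, the only solution to E[u(Z)] = 0 for every θ is u(z) = 0 (up to probability-zero sets), which is exactly the completeness property.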


Let the parameter θ in the pdf or pmf f(x; θ), θ ∈ Ω, have a sufficient statistic Y_1 = u_1(X_1, X_2, ..., X_n), where X_1, X_2, ..., X_n is a random sample from this distribution. Let the pdf or pmf of Y_1 be f_{Y_1}(y_1; θ), θ ∈ Ω. It has been seen that if there is any unbiased estimator Y_2 (not a function of Y_1 alone) of θ, then there is at least one function of Y_1 that is an unbiased estimator of θ, and our search for a best estimator of θ may be restricted to functions of Y_1. Suppose it has been verified that a certain function φ(Y_1), not a function of θ, is such that E[φ(Y_1)] = θ for all values of θ, θ ∈ Ω. Let ψ(Y_1) be another function of the sufficient statistic Y_1 alone, so that we also have E[ψ(Y_1)] = θ for all values of θ, θ ∈ Ω. Hence
\[
E[\varphi(Y_1) - \psi(Y_1)] = 0, \qquad \theta \in \Omega.
\]
If the family {f_{Y_1}(y_1; θ) : θ ∈ Ω} is complete, then φ(y_1) − ψ(y_1) = 0, except on a set of points that has probability zero. That is, for every other unbiased estimator ψ(Y_1) of θ, we have
\[
\varphi(y_1) = \psi(y_1),
\]
except possibly at certain special points. Thus, in this sense [namely φ(y_1) = ψ(y_1), except on a set of points with probability zero], φ(Y_1) is the unique function of Y_1 which is an unbiased estimator of θ. In accordance with the Rao–Blackwell theorem, φ(Y_1) has a smaller variance than every other unbiased estimator of θ. That is, the statistic φ(Y_1) is the MVUE of θ. This fact is stated in the following theorem of Lehmann and Scheffé.


Theorem 7.4.1 (Lehmann and Scheffé). Let X_1, X_2, ..., X_n, n a fixed positive integer, denote a random sample from a distribution that has pdf or pmf f(x; θ), θ ∈ Ω; let Y_1 = u_1(X_1, X_2, ..., X_n) be a sufficient statistic for θ, and let the family {f_{Y_1}(y_1; θ) : θ ∈ Ω} be complete. If there is a function of Y_1 that is an unbiased estimator of θ, then this function of Y_1 is the unique MVUE of θ. Here "unique" is used in the sense described in the preceding paragraph.
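As a hedged simulation sketch (the estimators and parameter values below are my choices, not the text's), take the exponential family of Example 7.4.1 with a random sample X_1, ..., X_n. Then Y_1 = ΣX_i is a complete sufficient statistic and φ(Y_1) = Y_1/n = X̄ is unbiased for θ, so by Theorem 7.4.1 it is the MVUE. The statistic n·min(X_i) is also unbiased (min(X_i) is exponential with mean θ/n) but is not a function of Y_1, and a Monte Carlo comparison shows its much larger variance:

```python
import random
import statistics

def compare_estimators(theta=2.0, n=10, reps=20_000, seed=1):
    """Monte Carlo comparison of two unbiased estimators of theta for the
    exponential pdf h(z; theta) = (1/theta) e^(-z/theta):
      xbar = sample mean, a function of the complete sufficient statistic sum(X_i)
      nmin = n * min(X_i), unbiased because min(X_i) is exponential with mean theta/n.
    Returns (mean, variance) estimates for each."""
    rng = random.Random(seed)
    xbar, nmin = [], []
    for _ in range(reps):
        # expovariate takes the rate 1/theta, so each X_i has mean theta
        sample = [rng.expovariate(1.0 / theta) for _ in range(n)]
        xbar.append(sum(sample) / n)
        nmin.append(n * min(sample))
    return (statistics.mean(xbar), statistics.variance(xbar),
            statistics.mean(nmin), statistics.variance(nmin))

mx, vx, mm, vm = compare_estimators()
print(f"mean(xbar) = {mx:.3f}  var(xbar) = {vx:.3f}  # theory: 2.0, theta^2/n = 0.4")
print(f"mean(nmin) = {mm:.3f}  var(nmin) = {vm:.3f}  # theory: 2.0, theta^2   = 4.0")
```

Both estimators center on θ, but Var(X̄) = θ²/n while Var(n·min) = θ², a factor-of-n gap; this is the variance reduction the Rao–Blackwell argument guarantees for functions of the complete sufficient statistic.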
