Sufficiency
Example 7.3.1. Let $X_1,\ldots,X_n$ be iid with pdf
$$f(x;\theta)=\begin{cases}\theta e^{-\theta x} & 0<x<\infty,\ \theta>0\\ 0 & \text{elsewhere.}\end{cases}$$
Suppose we want an MVUE of $\theta$. The joint pdf (likelihood function) is
$$L(\theta;x_1,\ldots,x_n)=\theta^n e^{-\theta\sum_{i=1}^n x_i},\quad \text{for } x_i>0,\ i=1,\ldots,n.$$
Hence, by the factorization theorem, the statistic $Y_1=\sum_{i=1}^n X_i$ is sufficient. The
log of the likelihood function is
$$l(\theta)=n\log\theta-\theta\sum_{i=1}^n x_i.$$
Taking the partial derivative of $l(\theta)$ with respect to $\theta$ and setting it to 0 results in the mle of $\theta$, which is given by
$$Y_2=\frac{1}{\overline{X}}.$$
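The differentiation step can be made explicit: since $l(\theta)=n\log\theta-\theta\sum_{i=1}^n x_i$,
$$\frac{\partial l(\theta)}{\partial\theta}=\frac{n}{\theta}-\sum_{i=1}^n x_i=0\quad\Longrightarrow\quad\hat\theta=\frac{n}{\sum_{i=1}^n x_i}=\frac{1}{\overline{x}},$$
and the second derivative $-n/\theta^2<0$ confirms that this critical point is a maximum.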
Note that $Y_2=n/Y_1$ is a function of the sufficient statistic $Y_1$. Also, since $Y_2$ is the mle of $\theta$, it is asymptotically unbiased. Hence, as a first step, we shall determine its expectation. In this problem, the $X_i$ are iid $\Gamma(1,1/\theta)$ random variables; hence, $Y_1=\sum_{i=1}^n X_i$ is $\Gamma(n,1/\theta)$. Therefore,
$$E(Y_2)=E\left[\frac{1}{\overline{X}}\right]=nE\left[\frac{1}{\sum_{i=1}^n X_i}\right]=n\int_0^\infty \frac{\theta^n}{\Gamma(n)}\,t^{-1}t^{n-1}e^{-\theta t}\,dt;$$
making the change of variable $z=\theta t$ and simplifying results in
$$E(Y_2)=E\left[\frac{1}{\overline{X}}\right]=\theta\,\frac{n}{(n-1)!}\,\Gamma(n-1)=\theta\,\frac{n}{n-1}.$$
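The change of variable can be carried out in full: with $z=\theta t$, so that $t=z/\theta$ and $dt=dz/\theta$, the integral becomes (for $n>1$)
$$n\int_0^\infty\frac{\theta^n}{\Gamma(n)}\,t^{n-2}e^{-\theta t}\,dt=\frac{n\theta}{\Gamma(n)}\int_0^\infty z^{n-2}e^{-z}\,dz=\frac{n\theta\,\Gamma(n-1)}{\Gamma(n)}=\theta\,\frac{n}{n-1}.$$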
Thus the statistic $[(n-1)Y_2]/n=(n-1)/\sum_{i=1}^n X_i$ is an MVUE of $\theta$.
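As a quick numerical sanity check (not part of the original example), a small Monte Carlo simulation can compare the mle $Y_2=1/\overline{X}$, whose mean is $\theta\,n/(n-1)$, with the bias-corrected estimator $(n-1)/\sum X_i$. The function name, sample size, and replication count below are illustrative choices.

```python
import random

def simulate(theta=2.0, n=5, reps=200_000, seed=12345):
    """Monte Carlo means of Y2 = 1/Xbar and of (n-1)/sum(X)
    for iid Exponential(theta) samples (density theta*exp(-theta*x))."""
    rng = random.Random(seed)
    mle_sum = 0.0
    mvue_sum = 0.0
    for _ in range(reps):
        s = sum(rng.expovariate(theta) for _ in range(n))  # Y1 = sum of n exponentials
        mle_sum += n / s            # Y2 = 1/Xbar; biased, E(Y2) = theta*n/(n-1)
        mvue_sum += (n - 1) / s     # bias-corrected estimator; E = theta
    return mle_sum / reps, mvue_sum / reps

mle_mean, mvue_mean = simulate()
print(mle_mean)   # close to theta*n/(n-1) = 2.5
print(mvue_mean)  # close to theta = 2.0
```

With $\theta=2$ and $n=5$, the simulated mean of $Y_2$ sits near $2(5/4)=2.5$, while the corrected estimator averages near $2$, matching the expectation computed above.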
In the next two sections, we discover that, in most instances, if there is one function $\varphi(Y_1)$ that is unbiased, then $\varphi(Y_1)$ is the only unbiased estimator based on the sufficient statistic $Y_1$.
Remark 7.3.1. Since the unbiased estimator $\varphi(Y_1)$, where $\varphi(Y_1)=E(Y_2\mid y_1)$, has a variance smaller than that of the unbiased estimator $Y_2$ of $\theta$, students sometimes reason as follows. Let the function $\Upsilon(y_3)=E[\varphi(Y_1)\mid Y_3=y_3]$, where $Y_3$ is another statistic, which is not sufficient for $\theta$. By the Rao–Blackwell theorem, we have $E[\Upsilon(Y_3)]=\theta$ and $\Upsilon(Y_3)$ has a smaller variance than does $\varphi(Y_1)$. Accordingly, $\Upsilon(Y_3)$ must be better than $\varphi(Y_1)$ as an unbiased estimator of $\theta$. But this is not true, because $Y_3$ is not sufficient; thus, $\theta$ is present in the conditional distribution of $Y_1$, given $Y_3=y_3$, and hence in the conditional mean $\Upsilon(y_3)$. So although indeed $E[\Upsilon(Y_3)]=\theta$, $\Upsilon(Y_3)$ is not even a statistic, because it involves the unknown parameter $\theta$, and hence it cannot be used as an estimate.