
where 0 < x_i < ∞, i = 1, 2, ..., n. Since this ratio does not depend upon θ, the sum Y_1 is a sufficient statistic for θ.


Example 7.2.3. Let Y_1 < Y_2 < ··· < Y_n denote the order statistics of a random sample of size n from the distribution with pdf

f(x; θ) = e^{−(x−θ)} I_{(θ,∞)}(x).

Here we use the indicator function of a set A, defined by

I_A(x) = 1 if x ∈ A,  0 if x ∉ A.

This means, of course, that f(x; θ) = e^{−(x−θ)}, θ < x < ∞, zero elsewhere. The pdf of Y_1 = min(X_i) is

f_{Y_1}(y_1; θ) = n e^{−n(y_1−θ)} I_{(θ,∞)}(y_1).
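This pdf follows from a routine one-line derivation (not shown in the excerpt): since the X_i are independent, for y_1 > θ,

P(Y_1 > y_1) = ∏_{i=1}^{n} P(X_i > y_1) = ∏_{i=1}^{n} e^{−(y_1−θ)} = e^{−n(y_1−θ)},

and differentiating −P(Y_1 > y_1) with respect to y_1 gives f_{Y_1}(y_1; θ) = n e^{−n(y_1−θ)} on (θ, ∞).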


Note that θ < min{x_i} if and only if θ < x_i for all i = 1, ..., n. Notationally this can be expressed as I_{(θ,∞)}(min x_i) = ∏_{i=1}^{n} I_{(θ,∞)}(x_i). Thus we have that

∏_{i=1}^{n} e^{−(x_i−θ)} I_{(θ,∞)}(x_i) / [n e^{−n(min x_i−θ)} I_{(θ,∞)}(min x_i)] = e^{−x_1−x_2−···−x_n} / (n e^{−n min x_i}).

Since this ratio does not depend upon θ, the first order statistic Y_1 is a sufficient statistic for θ.
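As a small illustrative sketch (not part of the text), the θ-independence of this ratio can also be checked numerically; the sample values and function names below are hypothetical choices made only for the demonstration.

```python
import numpy as np

# Minimal sketch: for the shifted exponential f(x; theta) = exp(-(x - theta)),
# x > theta, check that the ratio of the joint pdf of the sample to the pdf of
# Y1 = min(X_i) comes out the same for several values of theta.

def joint_pdf(x, theta):
    x = np.asarray(x, dtype=float)
    if np.any(x <= theta):                  # product of indicators I_(theta,inf)(x_i)
        return 0.0
    return float(np.exp(-np.sum(x - theta)))

def min_pdf(y1, n, theta):
    if y1 <= theta:                         # indicator I_(theta,inf)(y_1)
        return 0.0
    return n * np.exp(-n * (y1 - theta))    # pdf of the first order statistic

x = [1.3, 2.7, 0.9, 4.1]                    # a fixed sample; min(x_i) = 0.9
n = len(x)
for theta in (0.1, 0.5, 0.8):               # any theta below min(x_i)
    ratio = joint_pdf(x, theta) / min_pdf(min(x), n, theta)
    print(f"theta = {theta}: ratio = {ratio:.6f}")
# Every theta prints the same ratio, matching the algebraic cancellation above.
```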


If we are to show by means of the definition that a certain statistic Y_1 is or is not a sufficient statistic for a parameter θ, we must first of all know the pdf of Y_1, say f_{Y_1}(y_1; θ). In many instances it may be quite difficult to find this pdf. Fortunately, this problem can be avoided if we prove the following factorization theorem of Neyman.

Theorem 7.2.1 (Neyman). Let X_1, X_2, ..., X_n denote a random sample from a distribution that has pdf or pmf f(x; θ), θ ∈ Ω. The statistic Y_1 = u_1(X_1, ..., X_n) is a sufficient statistic for θ if and only if we can find two nonnegative functions, k_1 and k_2, such that

f(x_1; θ) f(x_2; θ) ··· f(x_n; θ) = k_1[u_1(x_1, x_2, ..., x_n); θ] k_2(x_1, x_2, ..., x_n),   (7.2.1)

where k_2(x_1, x_2, ..., x_n) does not depend upon θ.
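Before turning to the proof, a quick illustration (a sketch using the pdf of Example 7.2.3, not part of the theorem's statement): by the indicator identity noted earlier, the joint pdf there factors as

∏_{i=1}^{n} e^{−(x_i−θ)} I_{(θ,∞)}(x_i) = [e^{nθ} I_{(θ,∞)}(min x_i)] · [e^{−x_1−x_2−···−x_n}],

so we may take u_1(x_1, ..., x_n) = min x_i, k_1[u_1; θ] = e^{nθ} I_{(θ,∞)}(min x_i), and k_2(x_1, ..., x_n) = e^{−x_1−···−x_n}, which again shows that Y_1 = min X_i is sufficient for θ.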

Proof. We shall prove the theorem when the random variables are of the continuous type. Assume that the factorization is as stated in the theorem. In our proof we shall make the one-to-one transformation y_1 = u_1(x_1, x_2, ..., x_n), y_2 = u_2(x_1, x_2, ..., x_n), ..., y_n = u_n(x_1, x_2, ..., x_n), having the inverse functions x_1 = w_1(y_1, y_2, ..., y_n), x_2 = w_2(y_1, y_2, ..., y_n), ..., x_n = w_n(y_1, y_2, ..., y_n) and Jacobian J; see the note after the proof. The joint pdf of the statistics Y_1, Y_2, ..., Y_n is then given by

g(y_1, y_2, ..., y_n; θ) = k_1(y_1; θ) k_2(w_1, w_2, ..., w_n) |J|,
