Robert V. Hogg, Joseph W. McKean, Allen T. Craig

7.2. A Sufficient Statistic for a Parameter 423

where $w_i = w_i(y_1, y_2, \ldots, y_n)$, $i = 1, 2, \ldots, n$. The pdf of $Y_1$, say $f_{Y_1}(y_1;\theta)$, is given by
$$
f_{Y_1}(y_1;\theta) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} g(y_1, y_2, \ldots, y_n;\theta)\, dy_2 \cdots dy_n
= k_1(y_1;\theta) \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} |J|\, k_2(w_1, w_2, \ldots, w_n)\, dy_2 \cdots dy_n.
$$

Now the function $k_2$ does not depend upon $\theta$, nor is $\theta$ involved in either the Jacobian $J$ or the limits of integration. Hence the $(n-1)$-fold integral in the right-hand member of the preceding equation is a function of $y_1$ alone, say $m(y_1)$. Thus
$$
f_{Y_1}(y_1;\theta) = k_1(y_1;\theta)\, m(y_1).
$$

If $m(y_1) = 0$, then $f_{Y_1}(y_1;\theta) = 0$. If $m(y_1) > 0$, we can write
$$
k_1[u_1(x_1, x_2, \ldots, x_n);\theta] = \frac{f_{Y_1}[u_1(x_1, \ldots, x_n);\theta]}{m[u_1(x_1, \ldots, x_n)]},
$$

and the assumed factorization becomes


$$
f(x_1;\theta) \cdots f(x_n;\theta) = f_{Y_1}[u_1(x_1, \ldots, x_n);\theta]\, \frac{k_2(x_1, \ldots, x_n)}{m[u_1(x_1, \ldots, x_n)]}.
$$

Since neither the function $k_2$ nor the function $m$ depends upon $\theta$, in accordance with the definition, $Y_1$ is a sufficient statistic for the parameter $\theta$. Conversely, if $Y_1$ is a sufficient statistic for $\theta$, the factorization can be realized by taking the function $k_1$ to be the pdf of $Y_1$, namely the function $f_{Y_1}$. This completes the proof of the theorem.


Note that the assumption of a one-to-one transformation made in the proof is not needed; see Lehmann (1986) for a more rigorous proof. This theorem characterizes sufficiency and, as the following examples show, is usually much easier to work with than the definition of sufficiency.
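The practical content of the factorization theorem is that once the joint pdf (or pmf) factors as $k_1(u_1(\mathbf{x});\theta)\,k_2(\mathbf{x})$, the likelihood ratio of any two samples with the same value of the statistic is free of $\theta$. The following minimal numerical sketch (not from the text) illustrates this for a hypothetical Bernoulli$(\theta)$ sample, where $Y_1 = \sum X_i$ is sufficient and the factorization holds with $k_2 \equiv 1$:

```python
import math

# For a Bernoulli(theta) sample, the joint pmf is
#   f(x_1; theta) ... f(x_n; theta) = theta^y (1 - theta)^(n - y),  y = sum x_i,
# which factors as k_1(y; theta) * k_2(x) with k_2 = 1.  Consequently the
# ratio of joint pmfs of two samples sharing the same y cannot involve theta.

def joint_pmf(x, theta):
    """Joint pmf of an i.i.d. Bernoulli(theta) sample x of 0s and 1s."""
    return math.prod(theta if xi == 1 else 1 - theta for xi in x)

x_a = (1, 0, 1, 0)   # two samples with the same value y = 2
x_b = (0, 1, 0, 1)

for theta in (0.2, 0.5, 0.9):
    ratio = joint_pmf(x_a, theta) / joint_pmf(x_b, theta)
    assert abs(ratio - 1.0) < 1e-12  # ratio is constant in theta
```

Knowing $y$, the conditional distribution of the sample carries no further information about $\theta$, which is exactly the definition of sufficiency that the theorem lets us bypass.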


Example 7.2.4. Let $X_1, X_2, \ldots, X_n$ denote a random sample from a distribution that is $N(\theta, \sigma^2)$, $-\infty < \theta < \infty$, where the variance $\sigma^2 > 0$ is known. If $\bar{x} = \sum_{i=1}^{n} x_i / n$, then


$$
\sum_{i=1}^{n}(x_i - \theta)^2 = \sum_{i=1}^{n}[(x_i - \bar{x}) + (\bar{x} - \theta)]^2 = \sum_{i=1}^{n}(x_i - \bar{x})^2 + n(\bar{x} - \theta)^2
$$
because
$$
2\sum_{i=1}^{n}(x_i - \bar{x})(\bar{x} - \theta) = 2(\bar{x} - \theta)\sum_{i=1}^{n}(x_i - \bar{x}) = 0.
$$