Robert V. Hogg, Joseph W. McKean, Allen T. Craig

7.8. Minimal Sufficiency and Ancillary Statistics 455

which we find k + k(k+1)/2 joint sufficient statistics for k + k(k+1)/2 parameters; or the multivariate normal distribution given in Example 7.7.4; or in the use of the order statistics of a random sample for some completely unknown distribution of the continuous type as in Example 7.7.5.
What we would like to do is to change from one set of joint sufficient statistics to another, always reducing the number of statistics involved until we cannot go any further without losing the sufficiency of the resulting statistics. Those statistics that are there at the end of this reduction are called minimal sufficient statistics. These are sufficient for the parameters and are functions of every other set of sufficient statistics for those same parameters. Often, if there are k parameters, we can find k joint sufficient statistics that are minimal. In particular, if there is one parameter, we can often find a single sufficient statistic that is minimal. Most of the earlier examples that we have considered illustrate this point, but this is not always the case, as shown by the following example.


Example 7.8.1. Let X_1, X_2, ..., X_n be a random sample from the uniform distribution over the interval (θ − 1, θ + 1) having pdf

f(x; θ) = (1/2) I_(θ−1, θ+1)(x),  where −∞ < θ < ∞.

The joint pdf of X_1, X_2, ..., X_n equals the product of (1/2)^n and certain indicator functions, namely,

(1/2)^n ∏_{i=1}^n I_(θ−1, θ+1)(x_i) = (1/2)^n {I_(θ−1, θ+1)[min(x_i)]} {I_(θ−1, θ+1)[max(x_i)]},

because θ − 1 < min(x_i) ≤ x_j ≤ max(x_i) < θ + 1, j = 1, 2, ..., n. Thus the order statistics Y_1 = min(X_i) and Y_n = max(X_i) are the sufficient statistics for θ. These two statistics actually are minimal for this one parameter, as we cannot reduce the number of them to less than two and still have sufficiency.
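The reduction above can be checked numerically. The sketch below (with hypothetical sample values) evaluates the joint pdf as a function of θ and shows that two different samples sharing the same minimum and maximum produce identical likelihood functions, which is the essence of the sufficiency of (Y_1, Y_n):

```python
import numpy as np

def likelihood(theta, x):
    """Joint pdf (1/2)^n times the product of the indicators,
    which reduces to indicators on min(x) and max(x)."""
    inside = (theta - 1 < x.min()) and (x.max() < theta + 1)
    return 0.5 ** len(x) if inside else 0.0

# Two different samples with the same min and max give identical
# likelihood functions in theta: the data enter only through (min, max).
x1 = np.array([0.2, 0.5, 0.9, 1.4])  # hypothetical sample
x2 = np.array([0.2, 0.7, 0.8, 1.4])  # same min and max as x1
for theta in np.linspace(-1.0, 2.0, 13):
    assert likelihood(theta, x1) == likelihood(theta, x2)
```

The likelihood is flat at (1/2)^n on the interval (max(x_i) − 1, min(x_i) + 1) and zero elsewhere, so no single statistic can recover it; both endpoints are needed.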


There is an observation that helps us see that almost all the sufficient statistics that we have studied thus far are minimal. We have noted that the mle θ̂ of θ is a function of one or more sufficient statistics, when the latter exist. Suppose that this mle θ̂ is also sufficient. Since this sufficient statistic θ̂ is a function of the other sufficient statistics, by Theorem 7.3.2, it must be minimal. For example, we have



  1. The mle θ̂ = X̄ of θ in N(θ, σ²), σ² known, is a minimal sufficient statistic for θ.

  2. The mle θ̂ = X̄ of θ in a Poisson distribution with mean θ is a minimal sufficient statistic for θ.

  3. The mle θ̂ = Y_n = max(X_i) of θ in the uniform distribution over (0, θ) is a minimal sufficient statistic for θ.

  4. The maximum likelihood estimators θ̂_1 = X̄ and θ̂_2 = [(n − 1)/n]S² of θ_1 and θ_2 in N(θ_1, θ_2) are joint minimal sufficient statistics for θ_1 and θ_2.
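These statistics are all directly computable. As a quick numerical illustration of items 1, 3, and 4 (with a hypothetical sample), note in particular that the mle of θ_2 in item 4 uses the divisor n rather than n − 1:

```python
import numpy as np

x = np.array([2.1, 3.4, 1.8, 2.9, 3.0])  # hypothetical sample
n = len(x)

# Items 1 and 2: the mle of theta is the sample mean X-bar.
theta_hat = x.mean()

# Item 3: the mle of theta in the uniform(0, theta) case is max(X_i).
y_n = x.max()

# Item 4: the mles in N(theta_1, theta_2) are X-bar and [(n-1)/n]S^2,
# i.e., the sample variance computed with divisor n.
theta1_hat = x.mean()
theta2_hat = ((n - 1) / n) * x.var(ddof=1)  # equals x.var(ddof=0)
```

Since [(n − 1)/n]S² is a one-to-one function of S² for fixed n, either form serves as the second component of the joint minimal sufficient statistic.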
