Robert V. Hogg, Joseph W. McKean, Allen T. Craig

454 Sufficiency

7.7.8. In the notation of Example 7.7.3, show that the mle of $p_jp_l$ is $n^{-2}Y_jY_l$.
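Not part of the exercise, but a quick Monte Carlo sketch of the claim. Assuming the setup of Example 7.7.3 (the $Y_j$ are multinomial category counts), the invariance of mles suggests estimating $p_jp_l$ by $(Y_j/n)(Y_l/n) = n^{-2}Y_jY_l$; the simulation below, with arbitrary illustrative probabilities, shows that estimator tracking the true product.

```python
import random

# Hypothetical illustration: the Y_j are multinomial category counts, so
# Y_j / n is the mle of p_j and n^{-2} Y_j Y_l is the mle of p_j p_l.
random.seed(1)
p = [0.2, 0.5, 0.3]          # true category probabilities (arbitrary choice)
n = 100_000                  # sample size

counts = [0, 0, 0]
for _ in range(n):
    u = random.random()
    if u < p[0]:
        counts[0] += 1
    elif u < p[0] + p[1]:
        counts[1] += 1
    else:
        counts[2] += 1

mle = counts[0] * counts[1] / n**2   # n^{-2} Y_1 Y_2
print(mle, p[0] * p[1])              # mle should be close to p_1 p_2 = 0.1
```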
7.7.9. Refer to Example 7.7.4 on sufficiency for the multivariate normal model.

(a) Determine the MVUE of the covariance parameters $\sigma_{ij}$.

(b) Let $g = \sum_{i=1}^{k} a_i\mu_i$, where $a_1,\ldots,a_k$ are specified constants. Find the MVUE for $g$.
7.7.10. In a personal communication, LeRoy Folks noted that the inverse Gaussian pdf
$$
f(x;\theta_1,\theta_2) = \left(\frac{\theta_2}{2\pi x^3}\right)^{1/2} \exp\left[\frac{-\theta_2(x-\theta_1)^2}{2\theta_1^2 x}\right], \qquad 0 < x < \infty, \tag{7.7.9}
$$
where $\theta_1 > 0$ and $\theta_2 > 0$, is often used to model lifetimes. Find the complete sufficient statistics for $(\theta_1, \theta_2)$ if $X_1, X_2, \ldots, X_n$ is a random sample from the distribution having this pdf.
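Before working the exercise, one can sanity-check that (7.7.9) really is a density. The sketch below integrates the pdf numerically by a composite Simpson's rule, with the arbitrary parameter values $\theta_1 = 1$, $\theta_2 = 2$; the integral should come out essentially equal to 1.

```python
import math

def invgauss_pdf(x, t1, t2):
    """Inverse Gaussian density from (7.7.9)."""
    return math.sqrt(t2 / (2 * math.pi * x**3)) * math.exp(
        -t2 * (x - t1) ** 2 / (2 * t1**2 * x))

# Composite Simpson's rule on (0, 50]; the density vanishes rapidly in
# both tails, so this truncation costs essentially nothing.
t1, t2 = 1.0, 2.0
a, b, m = 1e-8, 50.0, 100_000          # m subintervals (even)
h = (b - a) / m
total = invgauss_pdf(a, t1, t2) + invgauss_pdf(b, t1, t2)
for i in range(1, m):
    total += (4 if i % 2 else 2) * invgauss_pdf(a + i * h, t1, t2)
integral = total * h / 3
print(integral)                        # should be very close to 1
```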


7.7.11. Let $X_1, X_2, \ldots, X_n$ be a random sample from a $N(\theta_1, \theta_2)$ distribution.

(a) Show that $E[(X_1 - \theta_1)^4] = 3\theta_2^2$.

(b) Find the MVUE of $3\theta_2^2$.
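A Monte Carlo check of the identity in part (a) (it does not give away part (b)). In the text's notation $\theta_2$ is the variance, so with the arbitrary choices $\theta_1 = 2$, $\theta_2 = 4$ the fourth central moment should be near $3\theta_2^2 = 48$.

```python
import random

# Monte Carlo check: for X ~ N(theta1, theta2), theta2 the variance,
# E[(X - theta1)^4] = 3 * theta2^2.
random.seed(7)
theta1, theta2 = 2.0, 4.0            # arbitrary mean and variance
sd = theta2 ** 0.5
n = 500_000

fourth = sum((random.gauss(theta1, sd) - theta1) ** 4 for _ in range(n)) / n
print(fourth, 3 * theta2 ** 2)       # sample fourth moment vs 3*theta2^2
```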

7.7.12. Let $X_1, \ldots, X_n$ be a random sample from a distribution of the continuous type with cdf $F(x)$. Suppose the mean, $\mu = E(X_1)$, exists. Using Example 7.7.5, show that the sample mean, $\overline{X} = n^{-1}\sum_{i=1}^{n} X_i$, is the MVUE of $\mu$.
7.7.13. Let $X_1, \ldots, X_n$ be a random sample from a distribution of the continuous type with cdf $F(x)$. Let $\theta = P(X_1 \le a) = F(a)$, where $a$ is known. Show that the proportion $n^{-1}\#\{X_i \le a\}$ is the MVUE of $\theta$.
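As an illustration of the estimator in 7.7.13 (not a proof), the sketch below draws an exponential(1) sample, an arbitrary choice for $F$, and compares the proportion $n^{-1}\#\{X_i \le a\}$ with $\theta = F(a) = 1 - e^{-a}$ at $a = 1$.

```python
import math
import random

# Sample proportion at or below a, versus theta = F(a) = 1 - e^{-a}
# for an exponential(1) population (illustrative choice of F).
random.seed(3)
n, a = 200_000, 1.0
sample = [random.expovariate(1.0) for _ in range(n)]

theta_hat = sum(x <= a for x in sample) / n
theta = 1 - math.exp(-a)
print(theta_hat, theta)              # the proportion should be near theta
```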


7.8 Minimal Sufficiency and Ancillary Statistics

In the study of statistics, it is clear that we want to reduce the data contained in
the entire sample as much as possible without losing relevant information about the
important characteristics of the underlying distribution. That is, a large collection
of numbers in the sample is not as meaningful as a few good summary statistics of
those data. Sufficient statistics, if they exist, are valuable because we know that
a statistician with those summary measures has as much information as a
statistician with the entire sample. Sometimes, however, there are several sets of
joint sufficient statistics, and thus we would like to find the simplest of these sets.
For illustration, in a sense, the observations $X_1, X_2, \ldots, X_n$, $n > 2$, of a random
sample from $N(\theta_1, \theta_2)$ could be thought of as joint sufficient statistics for $\theta_1$ and $\theta_2$.
We know, however, that we can use $\overline{X}$ and $S^2$ as joint sufficient statistics for those
parameters, which is a great simplification over using $X_1, X_2, \ldots, X_n$, particularly
if $n$ is large.
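The reduction to $(\overline{X}, S^2)$ can be made concrete: for a normal sample the log likelihood depends on the data only through these two statistics, via the algebraic identity $\sum_i (x_i - \theta_1)^2 = (n-1)s^2 + n(\overline{x} - \theta_1)^2$. A minimal sketch, with arbitrary trial parameter values:

```python
import math
import random

# The normal log likelihood computed from the full sample agrees with the
# one computed from (xbar, s2) alone, using
#   sum (x_i - theta1)^2 = (n-1) s^2 + n (xbar - theta1)^2.
random.seed(5)
n = 50
x = [random.gauss(0.0, 1.0) for _ in range(n)]
theta1, theta2 = 0.3, 1.2                      # arbitrary trial parameters

xbar = sum(x) / n
s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)

def loglik_full(data):
    return (-n / 2 * math.log(2 * math.pi * theta2)
            - sum((xi - theta1) ** 2 for xi in data) / (2 * theta2))

def loglik_summary(xbar, s2):
    return (-n / 2 * math.log(2 * math.pi * theta2)
            - ((n - 1) * s2 + n * (xbar - theta1) ** 2) / (2 * theta2))

print(loglik_full(x))
print(loglik_summary(xbar, s2))     # identical up to rounding error
```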
In most instances in this chapter, we have been able to find a single sufficient
statistic for one parameter or two joint sufficient statistics for two parameters.
Possibly the most complicated cases considered so far are given in Example 7.7.3, in
