Robert V. Hogg, Joseph W. McKean, Allen T. Craig


7.6.11. Consider the situation of the last exercise, but suppose we have the following two independent random samples: (1) $X_1, X_2, \ldots, X_n$ is a random sample with the common pdf $f_X(x) = \theta^{-1} e^{-x/\theta}$, for $x > 0$, zero elsewhere, and (2) $Y_1, Y_2, \ldots, Y_n$ is a random sample with common pdf $f_Y(y) = \theta e^{-\theta y}$, for $y > 0$, zero elsewhere. The last exercise suggests that, for some constant $c$, $Z = c\overline{X}/\overline{Y}$ might be an unbiased estimator of $\theta^2$. Find this constant $c$ and the variance of $Z$.
Hint: Show that $\overline{X}/(\theta^2 \overline{Y})$ has an $F$-distribution.
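The hint can be unpacked with standard chi-square facts. The following outline is a sketch of one route, not the book's own solution, assuming the usual relationship between sums of exponentials and chi-square variables:

```latex
% X_i has mean theta, so 2X_i/theta ~ chi^2(2); summing over the sample,
\frac{2n\overline{X}}{\theta} = \frac{2}{\theta}\sum_{i=1}^{n} X_i \sim \chi^2(2n),
\qquad
2n\theta\,\overline{Y} = 2\theta\sum_{i=1}^{n} Y_i \sim \chi^2(2n).
% The two samples are independent, so the ratio of these chi-square
% variables, each divided by its degrees of freedom 2n, is F-distributed:
\frac{\overline{X}}{\theta^2\,\overline{Y}}
  = \frac{\left(2n\overline{X}/\theta\right)/(2n)}{\left(2n\theta\,\overline{Y}\right)/(2n)}
  \sim F(2n,\,2n).
```

The constant $c$ and $\operatorname{Var}(Z)$ then follow from the moments of the $F(2n, 2n)$ distribution.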
7.6.12. Obtain the asymptotic distribution of the MVUE in Example 7.6.1 for the case $\theta = 1/2$.


7.7 The Case of Several Parameters

In many of the interesting problems we encounter, the pdf or pmf may not depend upon a single parameter $\theta$, but perhaps upon two (or more) parameters. In general, our parameter space $\Omega$ is a subset of $\mathbb{R}^p$, but in many of our examples $p$ is 2.
Definition 7.7.1. Let $X_1, X_2, \ldots, X_n$ denote a random sample from a distribution that has pdf or pmf $f(x; \boldsymbol{\theta})$, where $\boldsymbol{\theta} \in \Omega \subset \mathbb{R}^p$. Let $S$ denote the support of $X$. Let $\mathbf{Y}$ be an $m$-dimensional random vector of statistics $\mathbf{Y} = (Y_1, \ldots, Y_m)'$, where $Y_i = u_i(X_1, X_2, \ldots, X_n)$, for $i = 1, \ldots, m$. Denote the pdf or pmf of $\mathbf{Y}$ by $f_{\mathbf{Y}}(\mathbf{y}; \boldsymbol{\theta})$ for $\mathbf{y} \in \mathbb{R}^m$. The random vector of statistics $\mathbf{Y}$ is jointly sufficient for $\boldsymbol{\theta}$ if and only if

$$\frac{\prod_{i=1}^{n} f(x_i; \boldsymbol{\theta})}{f_{\mathbf{Y}}(\mathbf{y}; \boldsymbol{\theta})} = H(x_1, x_2, \ldots, x_n), \quad \text{for all } x_i \in S,$$

where $H(x_1, x_2, \ldots, x_n)$ does not depend upon $\boldsymbol{\theta}$.


In general, $m \neq p$ is possible; i.e., the number of sufficient statistics does not have to be the same as the number of parameters, but in most of our examples this is the case.

As may be anticipated, the factorization theorem can be extended. In our notation it can be stated in the following manner. The vector of statistics $\mathbf{Y}$ is jointly sufficient for the parameter $\boldsymbol{\theta} \in \Omega$ if and only if we can find two nonnegative functions $k_1$ and $k_2$ such that


$$\prod_{i=1}^{n} f(x_i; \boldsymbol{\theta}) = k_1(\mathbf{y}; \boldsymbol{\theta})\, k_2(x_1, \ldots, x_n), \quad \text{for all } x_i \in S, \tag{7.7.1}$$

where the function $k_2(x_1, x_2, \ldots, x_n)$ does not depend upon $\boldsymbol{\theta}$.
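As an illustration of how (7.7.1) works when $p = 2$, consider the standard two-parameter normal case, sketched here under the convention that $\theta_2$ denotes the variance (this particular worked case is an illustration, not the text's own example):

```latex
\prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\theta_2}}
   \exp\!\left\{-\frac{(x_i-\theta_1)^2}{2\theta_2}\right\}
 = (2\pi\theta_2)^{-n/2}
   \exp\!\left\{-\frac{1}{2\theta_2}
   \left(\sum_{i=1}^{n} x_i^2 - 2\theta_1\sum_{i=1}^{n} x_i + n\theta_1^2\right)\right\}.
% Taking k_1 to be the entire right-hand side, which depends on the data only
% through (sum x_i, sum x_i^2), and taking k_2 = 1, the factorization (7.7.1)
% shows that Y = (sum X_i, sum X_i^2)' is jointly sufficient for (theta_1, theta_2).
```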

Example 7.7.1. Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution having pdf

$$f(x; \theta_1, \theta_2) = \begin{cases} \frac{1}{2\theta_2} & \theta_1 - \theta_2 < x < \theta_1 + \theta_2 \\ 0 & \text{elsewhere,} \end{cases}$$

where $-\infty < \theta_1 < \infty$, $0 < \theta_2 < \infty$. Let $Y_1 < Y_2 < \cdots < Y_n$ be the order statistics. The joint pdf of $Y_1$ and $Y_n$ is given by

$$f_{Y_1, Y_n}(y_1, y_n; \theta_1, \theta_2) = \frac{n(n-1)}{(2\theta_2)^n}\,(y_n - y_1)^{n-2}, \quad \theta_1 - \theta_2 < y_1 < y_n < \theta_1 + \theta_2,$$
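From here, Definition 7.7.1 can be applied directly. A sketch of the remaining computation, filling in the step this display sets up:

```latex
% The joint pdf of the sample equals (2 theta_2)^{-n} on the region where
% theta_1 - theta_2 < y_1 and y_n < theta_1 + theta_2, with y_1 = min x_i
% and y_n = max x_i, so the ratio in Definition 7.7.1 is
\frac{\prod_{i=1}^{n} f(x_i; \theta_1, \theta_2)}
     {f_{Y_1, Y_n}(y_1, y_n; \theta_1, \theta_2)}
 = \frac{(2\theta_2)^{-n}}{n(n-1)\,(2\theta_2)^{-n}\,(y_n - y_1)^{n-2}}
 = \frac{1}{n(n-1)\,(y_n - y_1)^{n-2}},
% which is free of (theta_1, theta_2); hence (Y_1, Y_n) is jointly
% sufficient for (theta_1, theta_2).
```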