Robert_V._Hogg,_Joseph_W._McKean,_Allen_T._Craig

442 Sufficiency

Solving for $u(\theta)$, we obtain
$$u(\theta) = g(\theta) + \frac{\theta g'(\theta)}{n}.$$

Therefore, the MVUE of $g(\theta)$ is
$$u(Y_n) = g(Y_n) + \frac{Y_n}{n}\, g'(Y_n). \qquad (7.6.2)$$

For example, if $g(\theta) = \theta$, then
$$u(Y_n) = Y_n + \frac{Y_n}{n} = \frac{n+1}{n}\, Y_n,$$

which agrees with the result obtained in Example 7.4.2. Other examples are given
in Exercise 7.6.5.
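As a quick numerical check (a sketch, not part of the text), the following simulation uses the setting of Example 7.4.2, where $X_1, \ldots, X_n$ are uniform on $(0, \theta)$ and $Y_n$ is the largest order statistic; the average of $u(Y_n) = \frac{n+1}{n} Y_n$ over many samples should come out close to $\theta$. The uniform assumption is taken from that earlier example.

```python
import random

# Monte Carlo sketch: check that u(Y_n) = ((n + 1)/n) * Y_n is unbiased
# for theta, assuming (as in Example 7.4.2) a uniform(0, theta) sample
# with Y_n the largest order statistic.
random.seed(0)
theta, n, reps = 3.0, 10, 200_000

total = 0.0
for _ in range(reps):
    y_n = max(random.uniform(0.0, theta) for _ in range(n))
    total += (n + 1) / n * y_n

estimate = total / reps  # should be near theta
print(round(estimate, 3))
```

Note that $Y_n$ alone has expectation $n\theta/(n+1)$, so the simulated mean of the uncorrected maximum would fall noticeably below $\theta$; the factor $(n+1)/n$ is exactly what removes that bias.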

A somewhat different but also very important problem in point estimation is considered in the next example. In the example the distribution of a random variable $X$ is described by a pdf $f(x;\theta)$ that depends upon $\theta \in \Omega$. The problem is to estimate the fractional part of the probability for this distribution, which is at, or to the left of, a fixed point $c$. Thus we seek an MVUE of $F(c;\theta)$, where $F(x;\theta)$ is the cdf of $X$.


Example 7.6.3. Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n > 1$ from a distribution that is $N(\theta, 1)$. Suppose that we wish to find an MVUE of the function of $\theta$ defined by
$$P(X \le c) = \int_{-\infty}^{c} \frac{1}{\sqrt{2\pi}}\, e^{-(x-\theta)^2/2}\, dx = \Phi(c - \theta),$$
where $c$ is a fixed constant. There are many unbiased estimators of $\Phi(c - \theta)$. We first exhibit one of these, say $u(X_1)$, a function of $X_1$ alone. We shall then compute the conditional expectation, $E[u(X_1) \mid \overline{X} = \overline{x}] = \varphi(\overline{x})$, of this unbiased statistic, given the sufficient statistic $\overline{X}$, the mean of the sample. In accordance with the theorems of Rao–Blackwell and Lehmann–Scheffé, $\varphi(\overline{X})$ is the unique MVUE of $\Phi(c - \theta)$.
Consider the function $u(x_1)$, where
$$u(x_1) = \begin{cases} 1 & x_1 \le c \\ 0 & x_1 > c. \end{cases}$$
The expected value of the random variable $u(X_1)$ is given by
$$E[u(X_1)] = 1 \cdot P[X_1 - \theta \le c - \theta] = \Phi(c - \theta).$$
That is, $u(X_1)$ is an unbiased estimator of $\Phi(c - \theta)$.
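A small simulation (a sketch, not from the text) illustrates both facts at once: the indicator $u(X_1)$ is unbiased for $\Phi(c - \theta)$, and conditioning on the sufficient statistic $\overline{X}$ reduces its variance. The sketch uses the standard fact that $X_1$ given $\overline{X} = \overline{x}$ is $N(\overline{x}, (n-1)/n)$, so the conditioned estimator is $\varphi(\overline{x}) = \Phi\!\left((c - \overline{x})\sqrt{n/(n-1)}\right)$; that closed form anticipates the conditional-expectation computation described next.

```python
import random
import statistics

# Simulation sketch (not from the text): u(X_1) = 1{X_1 <= c} is unbiased
# for Phi(c - theta), and Rao-Blackwellizing on X-bar lowers the variance.
# Uses the fact that X_1 | X-bar = xbar is N(xbar, (n - 1)/n), hence
# phi(xbar) = Phi((c - xbar) * sqrt(n / (n - 1))).
random.seed(1)
theta, c, n, reps = 0.0, 1.0, 5, 100_000
Phi = statistics.NormalDist().cdf  # standard normal cdf

u_vals, phi_vals = [], []
for _ in range(reps):
    xs = [random.gauss(theta, 1.0) for _ in range(n)]
    xbar = sum(xs) / n
    u_vals.append(1.0 if xs[0] <= c else 0.0)
    phi_vals.append(Phi((c - xbar) * (n / (n - 1)) ** 0.5))

target = Phi(c - theta)  # the estimand, Phi(1) here
print(statistics.mean(u_vals), statistics.mean(phi_vals), target)
print(statistics.variance(u_vals), statistics.variance(phi_vals))
```

Both sample means land near $\Phi(c - \theta)$, while the variance of $\varphi(\overline{X})$ comes out well below that of the crude indicator, which is the Rao–Blackwell guarantee in numerical form.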
We shall next discuss the joint distribution of $X_1$ and $\overline{X}$ and the conditional distribution of $X_1$, given $\overline{X} = \overline{x}$. This conditional distribution enables us to compute the conditional expectation $E[u(X_1) \mid \overline{X} = \overline{x}] = \varphi(\overline{x})$. In accordance with Exercise
