
the sample mean $\overline{X}$ is asymptotically normal with mean $\theta$ and variance $\sigma^2/n$, where $\sigma^2 = \mathrm{Var}(X_i) = \mathrm{Var}(e_i + \theta) = \mathrm{Var}(e_i) = E(e_i^2)$. But
$$E(e_i^2) = \int_{-\infty}^{\infty} z^2\, 2^{-1} \exp\{-|z|\}\, dz = \int_0^{\infty} z^{3-1} \exp\{-z\}\, dz = \Gamma(3) = 2.$$
Therefore, the $\mathrm{ARE}(Q_2, \overline{X}) = \frac{2}{1} = 2$. Thus, if the sample comes from a Laplace distribution, then asymptotically the sample median is twice as efficient as the sample mean.
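To see where this ratio comes from, recall from Theorem 10.2.3 that the sample median $Q_2$ is asymptotically normal with mean $\theta$ and variance $1/[4 f^2(\theta) n]$, where $f$ is the pdf of $X_i$. A brief sketch for the Laplace case, where $f(\theta) = 1/2$:
$$\mathrm{Var}(Q_2) \approx \frac{1}{4 f^2(\theta)\, n} = \frac{1}{4 (1/2)^2\, n} = \frac{1}{n}, \qquad \mathrm{ARE}(Q_2, \overline{X}) = \frac{\mathrm{Var}(\overline{X})}{\mathrm{Var}(Q_2)} = \frac{2/n}{1/n} = 2.$$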
Next suppose the location model (6.2.28) holds, except now the pdf of $e_i$ is $N(0,1)$. Under this model, by Theorem 10.2.3, $Q_2$ is asymptotically normal with mean $\theta$ and variance $(\pi/2)/n$. Because the variance of $\overline{X}$ is $1/n$, in this case the $\mathrm{ARE}(Q_2, \overline{X}) = \frac{1}{\pi/2} = 2/\pi = 0.636$. Since $\pi/2 = 1.57$, asymptotically, $\overline{X}$ is $1.57$ times more efficient than $Q_2$ if the sample arises from the normal distribution.
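Both ARE values are easy to check by simulation. The following is a minimal sketch (not from the text) using NumPy; the sample size, number of replications, and seed are arbitrary illustrative choices:

import numpy as np

rng = np.random.default_rng(101)
n, reps = 200, 20000

# Laplace errors: ARE(Q2, Xbar) should be close to 2
lap = rng.laplace(loc=0.0, scale=1.0, size=(reps, n))
print(lap.mean(axis=1).var() / np.median(lap, axis=1).var())

# Normal errors: ARE(Q2, Xbar) should be close to 2/pi (about 0.64)
nor = rng.normal(loc=0.0, scale=1.0, size=(reps, n))
print(nor.mean(axis=1).var() / np.median(nor, axis=1).var())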


Theorem 6.2.2 is also a practical result in that it gives us a way of doing inference. The asymptotic standard deviation of the mle $\hat\theta$ is $[nI(\theta_0)]^{-1/2}$. Because $I(\theta)$ is a continuous function of $\theta$, it follows from Theorems 5.1.4 and 6.1.2 that
$$I(\hat\theta_n) \xrightarrow{P} I(\theta_0).$$

Thus we have a consistent estimate of the asymptotic standard deviation of the mle. Based on this result and the discussion of confidence intervals in Chapter 4, for a specified $0 < \alpha < 1$, the following interval is an approximate $(1-\alpha)100\%$ confidence interval for $\theta$:
$$\left(\hat\theta_n - z_{\alpha/2}\frac{1}{\sqrt{nI(\hat\theta_n)}},\ \hat\theta_n + z_{\alpha/2}\frac{1}{\sqrt{nI(\hat\theta_n)}}\right). \qquad (6.2.29)$$
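As a concrete illustration of interval (6.2.29), consider a Poisson($\theta$) sample, for which the mle is $\hat\theta_n = \overline{X}$ and $I(\theta) = 1/\theta$, so the interval becomes $\overline{X} \pm z_{\alpha/2}\sqrt{\overline{X}/n}$. The Python sketch below computes it; the helper name mle_ci and the simulated data are illustrative, not from the text:

import numpy as np
from scipy.stats import norm

def mle_ci(theta_hat, info_at_mle, n, alpha=0.05):
    # Interval (6.2.29): theta_hat +/- z_{alpha/2} / sqrt(n * I(theta_hat))
    half = norm.ppf(1 - alpha / 2) / np.sqrt(n * info_at_mle)
    return theta_hat - half, theta_hat + half

# Poisson(theta) example: the mle is the sample mean and I(theta) = 1/theta
rng = np.random.default_rng(101)
x = rng.poisson(lam=3.0, size=50)
theta_hat = x.mean()
print(mle_ci(theta_hat, 1.0 / theta_hat, x.size))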

Remark 6.2.2. If we use the asymptotic distributions to construct confidence intervals for $\theta$, the fact that $\mathrm{ARE}(Q_2, \overline{X}) = 2$ when the underlying distribution is the Laplace means that $n$ would need to be twice as large for $\overline{X}$ to get the same length confidence interval as we would if we used $Q_2$.
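A quick check of this sample-size claim, using the Laplace asymptotic variances from the example above: setting the two interval lengths equal gives
$$2 z_{\alpha/2}\sqrt{\frac{2}{n_{\overline{X}}}} = 2 z_{\alpha/2}\sqrt{\frac{1}{n_{Q_2}}} \quad\Longleftrightarrow\quad n_{\overline{X}} = 2\, n_{Q_2}.$$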


A simple corollary to Theorem 6.2.2 yields the asymptotic distribution of a function $g(\hat\theta_n)$ of the mle.


Corollary 6.2.2. Under the assumptions of Theorem 6.2.2, suppose $g(x)$ is a continuous function of $x$ that is differentiable at $\theta_0$ such that $g'(\theta_0) \neq 0$. Then
$$\sqrt{n}\,\bigl(g(\hat\theta_n) - g(\theta_0)\bigr) \xrightarrow{D} N\!\left(0,\ \frac{g'(\theta_0)^2}{I(\theta_0)}\right). \qquad (6.2.30)$$


The proof of this corollary follows immediately from the Δ-method, Theorem
5.2.9, and Theorem 6.2.2.
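As an illustration of the corollary (a sketch, not an example from the text): for a Bernoulli($\theta_0$) sample, the mle is $\hat\theta_n = \overline{X}$ and $I(\theta_0) = [\theta_0(1-\theta_0)]^{-1}$. Taking $g(\theta) = \log\{\theta/(1-\theta)\}$, so that $g'(\theta) = [\theta(1-\theta)]^{-1}$, the corollary yields
$$\sqrt{n}\left(\log\frac{\hat\theta_n}{1-\hat\theta_n} - \log\frac{\theta_0}{1-\theta_0}\right) \xrightarrow{D} N\!\left(0,\ \frac{1}{\theta_0(1-\theta_0)}\right).$$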
The proof of Theorem 6.2.2 contains an asymptotic representation of $\hat\theta$ which proves useful; hence, we state it as another corollary.
