360 Maximum Likelihood Methods
see Remark 5.2.3 for a discussion of $\lim$. It follows that for the sequence of solutions $\hat{\theta}_n$, $P[|\hat{\theta}_n - \theta_0| < a] \to 1$.
The only contentious point in the proof is that the sequence of solutions might
depend on $a$. But we can always choose a solution ``closest'' to $\theta_0$ in the following
way. For each $n$, the set of all solutions in the interval is bounded; hence, the
infimum over solutions closest to $\theta_0$ exists.
Note that this theorem is vague in that it discusses solutions of the equation.
If, however, we know that the mle is the unique solution of the equation $l'(\theta) = 0$,
then it is consistent. We state this as a corollary:
Corollary 6.1.1. Assume that $X_1, \ldots, X_n$ satisfy the regularity conditions (R0)
through (R2), where $\theta_0$ is the true parameter, and that $f(x;\theta)$ is differentiable with
respect to $\theta$ in $\Omega$. Suppose the likelihood equation has the unique solution $\hat{\theta}_n$. Then
$\hat{\theta}_n$ is a consistent estimator of $\theta_0$.
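As a sketch of how the corollary applies, consider the standard exponential example (not worked in the text): let $X_1, \ldots, X_n$ be a random sample with pdf $f(x;\theta) = \theta^{-1} e^{-x/\theta}$, $0 < x < \infty$, $0 < \theta < \infty$. The log-likelihood and its derivative are
$$l(\theta) = -n \log \theta - \frac{1}{\theta} \sum_{i=1}^{n} x_i, \qquad l'(\theta) = -\frac{n}{\theta} + \frac{1}{\theta^2} \sum_{i=1}^{n} x_i.$$
The likelihood equation $l'(\theta) = 0$ has the unique solution $\hat{\theta}_n = \bar{x}$. Since this family satisfies the regularity conditions, Corollary 6.1.1 yields that $\hat{\theta}_n$ is a consistent estimator of $\theta_0$.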
EXERCISES
6.1.1. Let $X_1, X_2, \ldots, X_n$ be a random sample on $X$ that has a $\Gamma(\alpha = 4, \beta = \theta)$
distribution, $0 < \theta < \infty$.
(a) Determine the mle of $\theta$.
(b) Suppose the following data are a realization (rounded) of a random sample on
$X$. Obtain a histogram with the argument pr=T (the data are in ex6111.rda).
9 39 38 23 8 47 21 22 18 10 17 22 14
9 5 26 11 31 15 25 9 29 28 19 8
(c) For this sample, obtain $\hat{\theta}$, the realized value of the mle, and locate $4\hat{\theta}$ on the
histogram. Overlay the $\Gamma(\alpha = 4, \beta = \hat{\theta})$ pdf on the histogram. Does the data
agree with this pdf? Code for the overlay:
xs = sort(x); y = dgamma(xs, 4, 1/betahat); hist(x, pr = T); lines(y ~ xs).
6.1.2. Let $X_1, X_2, \ldots, X_n$ represent a random sample from each of the distributions
having the following pdfs:
(a) $f(x;\theta) = \theta x^{\theta - 1}$, $0 < x < 1$, $0 < \theta < \infty$, zero elsewhere.
(b) $f(x;\theta) = e^{-(x-\theta)}$, $\theta \le x < \infty$, $-\infty < \theta < \infty$, zero elsewhere. Note that this
is a nonregular case.
In each case find the mle $\hat{\theta}$ of $\theta$.
6.1.3. Let $Y_1 < Y_2 < \cdots < Y_n$ be the order statistics of a random sample from a
distribution with pdf $f(x;\theta) = 1$, $\theta - \frac{1}{2} \le x \le \theta + \frac{1}{2}$, $-\infty < \theta < \infty$, zero elsewhere.
This is a nonregular case. Show that every statistic $u(X_1, X_2, \ldots, X_n)$ such that
$$Y_n - \tfrac{1}{2} \le u(X_1, X_2, \ldots, X_n) \le Y_1 + \tfrac{1}{2}$$