
Theorem 5.3.1 (Central Limit Theorem). Let $X_1, X_2, \ldots, X_n$ denote the observations of a random sample from a distribution that has mean $\mu$ and positive variance $\sigma^2$. Then the random variable
$$Y_n = \frac{\sum_{i=1}^n X_i - n\mu}{\sqrt{n}\,\sigma} = \frac{\sqrt{n}\,(\overline{X}_n - \mu)}{\sigma}$$
converges in distribution to a random variable that has a normal distribution with mean zero and variance 1.
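Before turning to the proof, the statement of the theorem can be illustrated numerically. The following sketch is not part of the proof; the Exp(1) population, the sample size $n = 50$, and the number of replicates are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: X ~ Exp(1), so mu = 1 and sigma = 1;
# the sample size n and the number of replicates are arbitrary.
mu, sigma = 1.0, 1.0
n, reps = 50, 100_000

# Draw `reps` samples of size n and form Y_n = sqrt(n) * (Xbar_n - mu) / sigma.
samples = rng.exponential(scale=1.0, size=(reps, n))
y = np.sqrt(n) * (samples.mean(axis=1) - mu) / sigma

# If the theorem holds, Y_n should be approximately N(0, 1).
print("mean of Y_n:    ", y.mean())            # close to 0
print("variance of Y_n:", y.var())             # close to 1
print("P(Y_n <= 1.96): ", np.mean(y <= 1.96))  # close to 0.975
```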


Proof: For this proof, additionally assume that the mgf $M(t) = E(e^{tX})$ exists for $-h < t < h$. If one replaces the mgf by the characteristic function $\varphi(t) = E(e^{itX})$, which always exists, then our proof is essentially the same as the proof in a more advanced course which uses characteristic functions.
The function
$$m(t) = E[e^{t(X-\mu)}] = e^{-\mu t} M(t)$$
also exists for $-h < t < h$. Since $m(t)$ is the mgf for $X - \mu$, it must follow that $m(0) = 1$, $m'(0) = E(X - \mu) = 0$, and $m''(0) = E[(X - \mu)^2] = \sigma^2$. By Taylor's formula there exists a number $\xi$ between $0$ and $t$ such that


$$m(t) = m(0) + m'(0)t + \frac{m''(\xi)t^2}{2} = 1 + \frac{m''(\xi)t^2}{2}.$$

If $\sigma^2 t^2/2$ is added and subtracted, then
$$m(t) = 1 + \frac{\sigma^2 t^2}{2} + \frac{[m''(\xi) - \sigma^2]t^2}{2}. \tag{5.3.1}$$
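The values $m(0) = 1$, $m'(0) = 0$, and $m''(0) = \sigma^2$ that underlie (5.3.1) can be checked symbolically for a concrete distribution. The sketch below assumes $X \sim \mathrm{Exp}(1)$, so that $\mu = \sigma^2 = 1$ and $M(t) = 1/(1-t)$ for $t < 1$; the distribution is chosen only for illustration.

```python
import sympy as sp

t = sp.symbols('t')

# Illustrative assumption: X ~ Exp(1), so mu = 1, sigma^2 = 1, and
# M(t) = 1/(1 - t) for t < 1; then m(t) = e^{-mu t} M(t).
mu = 1
m = sp.exp(-mu * t) / (1 - t)

# m(0) = 1, m'(0) = 0, m''(0) = sigma^2 = 1, so near t = 0 the expansion
# in (5.3.1) reads m(t) = 1 + sigma^2 t^2 / 2 + o(t^2).
print(m.subs(t, 0))                   # 1
print(sp.diff(m, t).subs(t, 0))       # 0
print(sp.diff(m, t, 2).subs(t, 0))    # 1
print(sp.series(m, t, 0, 3))          # 1 + t**2/2 + O(t**3)
```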

Next consider $M(t;n)$, where
$$\begin{aligned}
M(t;n) &= E\left[\exp\left(t\,\frac{\sum X_i - n\mu}{\sigma\sqrt{n}}\right)\right] \\
&= E\left[\exp\left(t\,\frac{X_1-\mu}{\sigma\sqrt{n}}\right)\exp\left(t\,\frac{X_2-\mu}{\sigma\sqrt{n}}\right)\cdots\exp\left(t\,\frac{X_n-\mu}{\sigma\sqrt{n}}\right)\right] \\
&= E\left[\exp\left(t\,\frac{X_1-\mu}{\sigma\sqrt{n}}\right)\right]\cdots E\left[\exp\left(t\,\frac{X_n-\mu}{\sigma\sqrt{n}}\right)\right] \\
&= \left\{E\left[\exp\left(t\,\frac{X-\mu}{\sigma\sqrt{n}}\right)\right]\right\}^n \\
&= \left[m\!\left(\frac{t}{\sigma\sqrt{n}}\right)\right]^n, \qquad -h < \frac{t}{\sigma\sqrt{n}} < h,
\end{aligned}$$
where the third equality uses the mutual independence of $X_1, X_2, \ldots, X_n$ and the fourth uses the fact that they are identically distributed, both properties of a random sample.
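The identity $M(t;n) = [m(t/(\sigma\sqrt{n}))]^n$ can also be checked by simulation in a concrete case. The sketch below again assumes Exp(1) observations, purely for illustration, and compares a Monte Carlo estimate of $E[\exp(tY_n)]$ with the closed-form power of $m$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumption: X ~ Exp(1), so mu = sigma = 1 and
# m(t) = E[exp(t(X - mu))] = exp(-t) / (1 - t) for t < 1.
mu, sigma = 1.0, 1.0
n, reps, t = 30, 200_000, 0.5

# Monte Carlo estimate of M(t; n) = E[exp(t * (sum_i X_i - n*mu) / (sigma*sqrt(n)))].
samples = rng.exponential(scale=1.0, size=(reps, n))
y = (samples.sum(axis=1) - n * mu) / (sigma * np.sqrt(n))
mc_estimate = np.exp(t * y).mean()

# Closed form from the factorization: [m(t / (sigma * sqrt(n)))]^n.
s = t / (sigma * np.sqrt(n))
closed_form = (np.exp(-s) / (1.0 - s)) ** n

print(mc_estimate, closed_form)  # the two values should agree closely
```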

In equation (5.3.1), replace $t$ by $t/(\sigma\sqrt{n})$ to obtain
$$m\!\left(\frac{t}{\sigma\sqrt{n}}\right) = 1 + \frac{t^2}{2n} + \frac{[m''(\xi) - \sigma^2]t^2}{2n\sigma^2},$$

where now $\xi$ is between $0$ and $t/(\sigma\sqrt{n})$ with $-h\sigma\sqrt{n} < t < h\sigma\sqrt{n}$. Accordingly,
$$M(t;n) = \left\{1 + \frac{t^2}{2n} + \frac{[m''(\xi) - \sigma^2]t^2}{2n\sigma^2}\right\}^n.$$
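Since the theorem asserts convergence in distribution to $N(0,1)$, whose mgf is $e^{t^2/2}$, one expects $M(t;n) \to e^{t^2/2}$ as $n \to \infty$. The sketch below observes this limit numerically, again under the illustrative assumption that $X \sim \mathrm{Exp}(1)$, so $m(t) = e^{-t}/(1-t)$.

```python
import numpy as np

# Illustrative assumption: X ~ Exp(1), so mu = sigma = 1 and the mgf of
# X - mu is m(t) = exp(-t) / (1 - t), valid for t < 1.
def m(t):
    return np.exp(-t) / (1.0 - t)

t = 1.0
target = np.exp(t**2 / 2)  # mgf of N(0, 1) evaluated at t

# M(t; n) = [m(t / (sigma * sqrt(n)))]^n should approach exp(t^2 / 2) as n grows.
for n in [10, 100, 1_000, 10_000, 100_000]:
    print(n, m(t / np.sqrt(n)) ** n, target)
```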