
the asymptotics of certain situations. Moreover, for illustration, instead of saying $X_n \xrightarrow{D} X$, where $X$ has a standard normal distribution, we may write
$$X_n \xrightarrow{D} N(0,1)$$
as an abbreviated way of saying the same thing. Clearly, the right-hand member of this last expression is a distribution and not a random variable as it should be, but we will make use of this convention. In addition, we may say that $X_n$ has a limiting standard normal distribution to mean that $X_n \xrightarrow{D} X$, where $X$ has a standard normal distribution, or equivalently $X_n \xrightarrow{D} N(0,1)$.


Motivation for considering only points of continuity of $F_X$ is given by the following simple example. Let $X_n$ be a random variable with all its mass at $n^{-1}$ and let $X$ be a random variable with all its mass at 0. Then, as Figure 5.2.1 shows, all the mass of $X_n$ is converging to 0, i.e., the distribution of $X$. At the point of discontinuity of $F_X$, $\lim F_{X_n}(0) = 0 \neq 1 = F_X(0)$, while at continuity points $x$ of $F_X$ (i.e., $x \neq 0$), $\lim F_{X_n}(x) = F_X(x)$. Hence, according to the definition, $X_n \xrightarrow{D} X$.
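For concreteness, the two cdfs in this example can be written out explicitly; this short verification is added here and follows directly from the two point-mass definitions above:
$$F_{X_n}(x) = \begin{cases} 0, & x < n^{-1}, \\ 1, & x \ge n^{-1}, \end{cases} \qquad F_X(x) = \begin{cases} 0, & x < 0, \\ 1, & x \ge 0. \end{cases}$$
For $x < 0$ we have $F_{X_n}(x) = 0 = F_X(x)$ for every $n$; for $x > 0$ we have $F_{X_n}(x) = 1 = F_X(x)$ as soon as $n^{-1} \le x$; only at $x = 0$ does the limit fail, since $F_{X_n}(0) = 0$ for all $n$ while $F_X(0) = 1$.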


Figure 5.2.1: Cdf of $X_n$, which has all its mass at $n^{-1}$.

Convergence in probability is a way of saying that a sequence of random variables $X_n$ is getting close to another random variable $X$. On the other hand, convergence in distribution is only concerned with the cdfs $F_{X_n}$ and $F_X$. A simple example illustrates this. Let $X$ be a continuous random variable with a pdf $f_X(x)$ that is symmetric about 0; i.e., $f_X(-x) = f_X(x)$. Then it is easy to show that the density of the random variable $-X$ is also $f_X(x)$. Thus, $X$ and $-X$ have the same distributions. Define the sequence of random variables $X_n$ as
$$X_n = \begin{cases} X & \text{if } n \text{ is odd} \\ -X & \text{if } n \text{ is even.} \end{cases} \tag{5.2.1}$$
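The claim that $-X$ has density $f_X$ can be verified with a one-line cdf calculation; this brief check is added here, using only the stated symmetry $f_X(-x) = f_X(x)$ and the continuity of $X$ (so that $P(X = -x) = 0$):
$$F_{-X}(x) = P(-X \le x) = P(X \ge -x) = 1 - F_X(-x), \qquad f_{-X}(x) = \frac{d}{dx}\left[1 - F_X(-x)\right] = f_X(-x) = f_X(x).$$
Hence every term of the sequence (5.2.1) has the same cdf $F_X$.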