Advanced High-School Mathematics


324 CHAPTER 6 Inferential Statistics


From what we’ve proved about the mean, we see already that $E(\overline{X}) = \mu$. In case the random variables $X_1, X_2, \ldots$ have the same distribution, the Weak Law of Large Numbers says a bit more:


Lemma. (The Weak Law of Large Numbers) Assume that $X_1, X_2, \ldots, X_n, \ldots$ is an infinite sequence of identically distributed random variables with mean $\mu$ (and having finite variance $\sigma^2$). Then for each $\epsilon > 0$,

$$\lim_{n\to\infty} P\left(\left|\frac{X_1 + X_2 + \cdots + X_n}{n} - \mu\right| > \epsilon\right) = 0.$$

Proof. We set $S_n = X_1 + X_2 + \cdots + X_n$, and so $A_n = S_n/n$ has mean $\mu$ and variance $\sigma^2/n$. By Chebyshev’s Inequality we have

$$P\left(\left|A_n - \mu\right| \ge \epsilon\right) \le \frac{\sigma^2}{n\epsilon^2}.$$

Since $\epsilon > 0$ is fixed, the right-hand side tends to $0$ as $n \to \infty$, and the result follows.
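The Chebyshev bound in the proof can be checked empirically. The sketch below is our own illustration, not from the text: it simulates sample means $A_n$ of fair die rolls, for which $\mu = 3.5$ and $\sigma^2 = 35/12$, and compares the observed deviation frequency with the bound $\sigma^2/(n\epsilon^2)$.

```python
import random

# Illustration (not from the text): fair die rolls, mu = 3.5, sigma^2 = 35/12.
mu, sigma2 = 3.5, 35 / 12
eps = 0.25          # the fixed epsilon > 0
trials = 2_000      # Monte Carlo repetitions per sample size
random.seed(0)

results = {}
for n in (10, 100, 1000):
    # Count how often the sample mean A_n lands at distance >= eps from mu.
    hits = sum(
        abs(sum(random.randint(1, 6) for _ in range(n)) / n - mu) >= eps
        for _ in range(trials)
    )
    empirical = hits / trials                 # estimate of P(|A_n - mu| >= eps)
    bound = sigma2 / (n * eps ** 2)           # Chebyshev bound from the proof
    results[n] = (empirical, bound)
    print(f"n={n:5d}  empirical={empirical:.4f}  Chebyshev bound={bound:.4f}")
```

As $n$ grows, the observed frequency shrinks toward $0$, and (for the sample sizes where the bound is below $1$) it stays under the Chebyshev bound, as the proof requires.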


Notice that an equivalent formulation of the Weak Law of Large Numbers is the statement that for all $\epsilon > 0$ we have

$$\lim_{n\to\infty} P\left(\left|\frac{X_1 + X_2 + \cdots + X_n}{n} - \mu\right| \le \epsilon\right) = 1.$$

As you might expect, there is also a Strong Law of Large Numbers, which is naively obtained by interchanging the limit and the probability $P$; see the footnote.^6


Exercises



  1. Prove that if $X$ and $Y$ are discrete independent random variables,
     then $E(XY) = E(X)E(Y)$. Is this result still true if $X$ and $Y$ are
     not independent?
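As a sanity check for the first part (our own sketch, not part of the text), one can compute both sides exactly for two independent fair dice, and probe the dependent case with $Y = X$:

```python
from fractions import Fraction
from itertools import product

# Illustration (not from the text): X and Y are independent fair dice.
die = {k: Fraction(1, 6) for k in range(1, 7)}   # pmf of a fair die

E_X = sum(x * p for x, p in die.items())         # = 7/2
# Independence means the joint pmf factors as px * py.
E_XY = sum(x * y * px * py
           for (x, px), (y, py) in product(die.items(), die.items()))
print(E_XY == E_X * E_X)    # E(XY) = E(X)E(Y) holds here

# Dependent case Y = X: then E(XY) = E(X^2), which exceeds E(X)^2
# whenever Var(X) > 0.
E_X2 = sum(x * x * p for x, p in die.items())    # = 91/6
print(E_X2, E_X * E_X)
```

The dependent case hints at the answer to the second question: equality can fail, and it fails exactly when $E(XY) - E(X)E(Y) \neq 0$, i.e. when $X$ and $Y$ have nonzero covariance.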


(^6) That is to say, if $X_1, X_2, \ldots, X_n, \ldots$ is an infinite sequence of identically distributed random variables with mean $\mu$, then

$$P\left(\frac{X_1 + X_2 + \cdots + X_n}{n} \to \mu\right) = 1.$$

There is no requirement of finiteness of the variances.
