
SECTION 6.1 Discrete Random Variables


Lemma. (Markov’s Inequality) Let X be a non-negative discrete random variable. Then for any number d > 0, we have

    P(X ≥ d) ≤ (1/d) E(X).

Proof. We define a new random variable Y by setting

    Y = d if X ≥ d, and Y = 0 otherwise.

Since Y ≤ X, it follows that E(X) ≥ E(Y). Also note that Y has two possible values: 0 and d; furthermore,

    E(Y) = d P(Y = d) = d P(X ≥ d).

Since E(X) ≥ E(Y) = d P(X ≥ d), the result follows immediately.
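
It may help to see the inequality numerically. The following Python sketch (an added illustration; the distribution used is arbitrary and not taken from the text) compares P(X ≥ d) with the bound (1/d) E(X) for a small non-negative discrete random variable.

# Illustration (hypothetical distribution): compare P(X >= d) with the
# Markov bound E(X)/d for a small non-negative discrete random variable.
dist = {0: 0.5, 1: 0.25, 2: 0.15, 10: 0.10}    # value -> probability

E_X = sum(x * p for x, p in dist.items())      # E(X) = 1.55
for d in (1, 2, 5, 10):
    prob = sum(p for x, p in dist.items() if x >= d)   # P(X >= d)
    bound = E_X / d                                    # Markov bound
    print(f"d = {d}: P(X >= d) = {prob:.3f} <= {bound:.3f}")

In each case the exact probability sits below the bound, as the lemma guarantees.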


Lemma. (Chebyshev’s Inequality) Let X be a discrete random variable with mean μ and variance σ^2. Then for any d > 0 we have

    P(|X − μ| ≥ d) ≤ σ^2/d^2.

Proof. Define the random variable Y = (X − μ)^2; it follows that E(Y) = σ^2. Since |X − μ| ≥ d precisely when Y ≥ d^2, applying Markov’s inequality to Y we have

    P(|X − μ| ≥ d) = P(Y ≥ d^2) ≤ (1/d^2) E(Y) = σ^2/d^2,

as required.
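
A similar check can be run for Chebyshev’s inequality. The Python sketch below (again an added illustration, using a fair six-sided die as a hypothetical random variable) compares P(|X − μ| ≥ d) with the bound σ^2/d^2; for the die, μ = 3.5 and σ^2 = 35/12.

# Illustration (hypothetical example: a fair six-sided die): compare
# P(|X - mu| >= d) with the Chebyshev bound sigma^2 / d^2.
values = [1, 2, 3, 4, 5, 6]
p = 1 / 6                                       # each face is equally likely

mu = sum(v * p for v in values)                 # mu = 3.5
var = sum((v - mu) ** 2 * p for v in values)    # sigma^2 = 35/12

for d in (1, 2, 2.5):
    prob = sum(p for v in values if abs(v - mu) >= d)  # P(|X - mu| >= d)
    bound = var / d ** 2                               # Chebyshev bound
    print(f"d = {d}: P(|X - mu| >= d) = {prob:.3f} <= {bound:.3f}")

Note that the bound can exceed 1 (as it does here for d = 1), in which case it is true but uninformative.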


We now assume that X_1, X_2, ..., X_n are random variables with the same mean μ; we denote the average of these random variables thus:

    X̄ = (X_1 + X_2 + ··· + X_n)/n.
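
To get a feel for how this average behaves, the sketch below (an added illustration; the choice of a fair die is hypothetical) simulates X̄ for increasing values of n. The simulated averages settle near the common mean μ = 3.5 as n grows, which is the kind of behavior that inequalities such as Chebyshev’s can be used to quantify.

# Illustration (hypothetical example): simulate the average X-bar of n
# independent rolls of a fair six-sided die; its mean is mu = 3.5.
import random

random.seed(0)                                  # fixed seed for repeatability

def sample_mean(n):
    """Average of n simulated rolls of a fair six-sided die."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (10, 100, 1000, 10000):
    print(f"n = {n:>6}: X-bar = {sample_mean(n):.3f}")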