1.10. Important Inequalities

which is the desired result.

Theorem 1.10.2 (Markov’s Inequality). Let $u(X)$ be a nonnegative function of the random variable $X$. If $E[u(X)]$ exists, then for every positive constant $c$,

$$P[u(X) \ge c] \le \frac{E[u(X)]}{c}.$$

Proof. The proof is given when the random variable $X$ is of the continuous type; but the proof can be adapted to the discrete case if we replace integrals by sums. Let $A = \{x : u(x) \ge c\}$ and let $f(x)$ denote the pdf of $X$. Then


$$E[u(X)] = \int_{-\infty}^{\infty} u(x) f(x) \, dx = \int_A u(x) f(x) \, dx + \int_{A^c} u(x) f(x) \, dx.$$

Since each of the integrals in the extreme right-hand member of the preceding
equation is nonnegative, the left-hand member is greater than or equal to either of
them. In particular,


$$E[u(X)] \ge \int_A u(x) f(x) \, dx.$$

However, if $x \in A$, then $u(x) \ge c$; accordingly, the right-hand member of the preceding inequality is not increased if we replace $u(x)$ by $c$. Thus


$$E[u(X)] \ge c \int_A f(x) \, dx.$$

Since

$$\int_A f(x) \, dx = P(X \in A) = P[u(X) \ge c],$$

it follows that


$$E[u(X)] \ge c \, P[u(X) \ge c],$$

which is the desired result.
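
As a quick numerical illustration of Markov’s inequality (a minimal sketch, not part of the text; the exponential distribution and the constants below are illustrative choices), take $u(X) = X$ with $X$ exponential with mean 2, and compare the empirical tail probability with the bound $E(X)/c$:

    # Empirical check of Markov's inequality: P(X >= c) <= E(X)/c for X >= 0.
    # The exponential distribution with mean 2 is an illustrative choice.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=100_000)  # nonnegative samples, E(X) = 2

    for c in (1.0, 2.0, 4.0, 8.0):
        tail = np.mean(x >= c)     # empirical P(X >= c)
        bound = x.mean() / c       # sample estimate of the Markov bound E(X)/c
        print(f"c = {c}: P(X >= c) = {tail:.4f} <= bound {bound:.4f}")

Every empirical tail probability should fall below its bound; for light-tailed distributions the bound is typically quite loose.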


The preceding theorem is a generalization of an inequality that is often called Chebyshev’s inequality, which we now establish.


Theorem 1.10.3 (Chebyshev’s Inequality). Let $X$ be a random variable with finite variance $\sigma^2$ (by Theorem 1.10.1, this implies that the mean $\mu = E(X)$ exists). Then for every $k > 0$,

$$P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}, \qquad (1.10.2)$$

or, equivalently,

$$P(|X - \mu| < k\sigma) \ge 1 - \frac{1}{k^2}.$$
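
To see how this follows from Markov’s inequality (a sketch of the standard argument, consistent with the generalization noted above), take $u(X) = (X - \mu)^2$, which is nonnegative, and $c = k^2\sigma^2$ in Theorem 1.10.2:

$$P[(X - \mu)^2 \ge k^2\sigma^2] \le \frac{E[(X - \mu)^2]}{k^2\sigma^2} = \frac{\sigma^2}{k^2\sigma^2} = \frac{1}{k^2}.$$

Since $(X - \mu)^2 \ge k^2\sigma^2$ holds if and only if $|X - \mu| \ge k\sigma$, this is inequality (1.10.2); the equivalent form follows by taking complements.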