18.3 Chebyshev's Theorem


Rephrasing (18.3.1) in terms of the random variable, |R − Ex[R]|, that measures R's deviation from its mean, we get

$$
\Pr\bigl[\,|R - \mathrm{Ex}[R]| \ge x\,\bigr] \;\le\; \frac{\mathrm{Ex}\bigl[(R - \mathrm{Ex}[R])^{\alpha}\bigr]}{x^{\alpha}}. \tag{18.4}
$$


The case when α = 2 turns out to be so important that the numerator of the right-hand side of (18.4) has been given a name:


Definition 18.3.2. The variance, Var[R], of a random variable, R, is:

$$
\operatorname{Var}[R] ::= \mathrm{Ex}\bigl[(R - \mathrm{Ex}[R])^{2}\bigr].
$$


The restatement of (18.4) for α = 2 is known as Chebyshev's Theorem.

Theorem 18.3.3 (Chebyshev). Let R be a random variable and x ∈ ℝ⁺. Then

$$
\Pr\bigl[\,|R - \mathrm{Ex}[R]| \ge x\,\bigr] \;\le\; \frac{\operatorname{Var}[R]}{x^{2}}.
$$
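To make the bound concrete, here is a minimal Python sketch (added here, not part of the text) that compares the exact tail probability Pr[|R − Ex[R]| ≥ x] with the Chebyshev bound Var[R]/x² for a fair six-sided die; the helper names `expectation`, `variance`, and `tail_probability` are illustrative choices, not anything defined in the book.

```python
from fractions import Fraction

# A fair six-sided die: each outcome 1..6 has probability 1/6.
die = {k: Fraction(1, 6) for k in range(1, 7)}

def expectation(dist):
    """Ex[R]: sum of value * probability over all outcomes."""
    return sum(v * p for v, p in dist.items())

def variance(dist):
    """Var[R] = Ex[(R - Ex[R])^2], straight from Definition 18.3.2."""
    mu = expectation(dist)
    return sum((v - mu) ** 2 * p for v, p in dist.items())

def tail_probability(dist, x):
    """Exact Pr[|R - Ex[R]| >= x]."""
    mu = expectation(dist)
    return sum(p for v, p in dist.items() if abs(v - mu) >= x)

var = variance(die)
for x in (1, 2, 3):
    print(f"x={x}: Pr = {tail_probability(die, x)}  bound = {var / x**2}")
```

For each x the exact tail probability stays below Var[R]/x², though the bound can be loose; for x = 1 it is 35/12, which exceeds 1 and so says nothing at all.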


The expression Ex[(R − Ex[R])^2] for variance is a bit cryptic; the best approach is to work through it from the inside out. The innermost expression, R − Ex[R], is precisely the deviation of R above its mean. Squaring this, we obtain (R − Ex[R])^2. This is a random variable that is near 0 when R is close to the mean and is a large positive number when R deviates far above or below the mean. So if R is always close to the mean, then the variance will be small. If R is often far from the mean, then the variance will be large.
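The same inside-out reading can be mirrored step by step in code. The following Python sketch (an illustration added here, not from the text) builds the deviation, squares it, and then takes the expectation, for a toy random variable that is 0 or 10 with equal probability.

```python
from fractions import Fraction

# A toy random variable R: value 0 with probability 1/2, value 10 with probability 1/2.
R = {0: Fraction(1, 2), 10: Fraction(1, 2)}

# Innermost expression: the mean Ex[R].
mean = sum(v * p for v, p in R.items())                   # 5

# The deviation R - Ex[R], again a random variable (value -> probability).
deviation = {v - mean: p for v, p in R.items()}           # {-5: 1/2, 5: 1/2}

# Square the deviation; outcomes with the same square pool their probability.
squared = {}
for d, p in deviation.items():
    squared[d * d] = squared.get(d * d, Fraction(0)) + p  # {25: 1}

# Outermost step: the expectation of the squared deviation is Var[R].
variance = sum(v * p for v, p in squared.items())
print(variance)                                           # 25
```

Since R always lands exactly 5 away from its mean of 5, the squared deviation is constantly 25, and so Var[R] = 25.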


18.3.1 Variance in Two Gambling Games


The relevance of variance is apparent when we compare the following two gambling games.

Game A: We win $2 with probability 2/3 and lose $1 with probability 1/3.

Game B: We win $1002 with probability 2/3 and lose $2001 with probability 1/3.

Which game is better financially? We have the same probability, 2/3, of winning each game, but that does not tell the whole story. What about the expected return for each game? Let random variables A and B be the payoffs for the two games. For example, A is 2 with probability 2/3 and −1 with probability 1/3. We can compute the expected payoff for each game as follows:


$$
\mathrm{Ex}[A] = 2 \cdot \frac{2}{3} + (-1) \cdot \frac{1}{3} = 1,
\qquad
\mathrm{Ex}[B] = 1002 \cdot \frac{2}{3} + (-2001) \cdot \frac{1}{3} = 1.
$$
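As a quick check (added here, not part of the text), the same arithmetic can be carried out in Python; the dictionary encoding of the payoff distributions is just one convenient choice.

```python
from fractions import Fraction

# Payoff distributions for the two games: payoff -> probability.
game_A = {2: Fraction(2, 3), -1: Fraction(1, 3)}
game_B = {1002: Fraction(2, 3), -2001: Fraction(1, 3)}

def expectation(dist):
    """Ex[R]: sum of payoff * probability over all outcomes."""
    return sum(v * p for v, p in dist.items())

print(expectation(game_A))  # 1
print(expectation(game_B))  # 1
```

Both games have expected payoff 1, so expectation alone cannot distinguish them; the difference shows up in their variances.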
