Mathematics for Computer Science


19.6. Sums of Random Variables


Chebyshev’s Theorem gives us the stronger result that
\[
\Pr\bigl[\,|T - \text{Ex}[T]| \ge c\,\sigma_T\,\bigr] \le \frac{1}{c^2}.
\]


The Chernoff Bound gives us an even stronger result, namely, that for any $c > 0$,
\[
\Pr\bigl[T - \text{Ex}[T] \ge c\,\text{Ex}[T]\bigr] \le e^{-(c\ln c - c + 1)\,\text{Ex}[T]}.
\]
In this case, the probability of exceeding the mean by $c\,\text{Ex}[T]$ decreases as an
exponentially small function of the deviation.
By considering the random variable $n - T$, we can also use the Chernoff Bound
to prove that the probability that $T$ is much *lower* than $\text{Ex}[T]$ is also
exponentially small.
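To see how the three bounds compare concretely, here is a small numerical sketch on a hypothetical instance (not from the text): $T$ is a sum of $n = 100$ independent indicator variables, each 1 with probability $1/2$, and we bound the probability that $T$ exceeds its mean by $c\,\text{Ex}[T]$ with $c = 1/2$. The parameter choices are illustrative assumptions, not the book's example.

```python
import math

# Hypothetical instance: T = sum of n = 100 independent Bernoulli(1/2)
# indicators, so Ex[T] = 50 and sigma_T = sqrt(n * p * (1 - p)) = 5.
n, p = 100, 0.5
ex_T = n * p
sigma_T = math.sqrt(n * p * (1 - p))

# We bound Pr[T - Ex[T] >= c * Ex[T]], i.e. Pr[T >= (1 + c) * Ex[T]].
c = 0.5  # a deviation of 0.5 * Ex[T] = 25 above the mean

# Chebyshev: Pr[|T - Ex[T]| >= k * sigma_T] <= 1/k^2, with k chosen so
# that k * sigma_T equals the deviation c * Ex[T].
k = c * ex_T / sigma_T
chebyshev = 1 / k**2

# Chernoff, in the form stated above:
#   Pr[T - Ex[T] >= c * Ex[T]] <= e^{-(c ln c - c + 1) Ex[T]}.
chernoff = math.exp(-(c * math.log(c) - c + 1) * ex_T)

# Exact binomial tail Pr[T >= (1 + c) * Ex[T]], for comparison.
threshold = math.ceil((1 + c) * ex_T)
exact = sum(math.comb(n, t) * p**t * (1 - p)**(n - t)
            for t in range(threshold, n + 1))
```

On this instance, Chebyshev gives $1/k^2 = 0.04$, while the Chernoff bound is below $10^{-3}$, and the exact tail probability is smaller still, illustrating the exponential improvement.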


19.6.8 Murphy’s Law


If the expectation of a random variable is much less than 1, then Markov’s Theorem
implies that there is only a small probability that the variable has a value of 1 or
more. On the other hand, a result that we call *Murphy’s Law*^8 says that if a random
variable is an independent sum of 0–1-valued variables and has a large expectation,
then there is a huge probability of getting a value of at least 1.


Theorem 19.6.4 (Murphy’s Law). Let $A_1, A_2, \ldots, A_n$ be mutually independent
events. Let $T_i$ be the indicator random variable for $A_i$ and define
\[
T ::= T_1 + T_2 + \cdots + T_n
\]

to be the number of events that occur. Then


\[
\Pr[T = 0] \le e^{-\text{Ex}[T]}.
\]
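The bound can be checked numerically. The sketch below uses a hypothetical instance (not from the text): 20 independent events, each with probability $0.2$, so $\text{Ex}[T] = 4$. By independence, $\Pr[T = 0]$ is the product $\prod_i (1 - \Pr[A_i])$, and the theorem follows from applying $1 - x \le e^{-x}$ to each factor.

```python
import math

# Hypothetical instance: n = 20 mutually independent events A_i,
# each occurring with probability 0.2, so Ex[T] = sum of the p_i = 4.
probs = [0.2] * 20
ex_T = sum(probs)

# By independence, T = 0 exactly when no A_i occurs.
pr_T_zero = math.prod(1 - p for p in probs)

# Murphy's Law bound: Pr[T = 0] <= e^{-Ex[T]}, since 1 - p <= e^{-p}
# for every factor.
bound = math.exp(-ex_T)
```

Here $\Pr[T = 0] = 0.8^{20} \approx 0.0115$, comfortably below the bound $e^{-4} \approx 0.0183$; even though each event is individually unlikely, at least one occurs with probability about $0.99$.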

^8 This is in reference and deference to the famous saying that “If something can go wrong, it
probably will.”
