Mathematics for Computer Science


18.7. Sums of Random Variables 643


In this case, the probability of exceeding the mean by $c \cdot \mathrm{Ex}[T]$ decreases as an
exponentially small function of the deviation.

By considering the random variable $n - T$, we can also use the Chernoff Bound
to prove that the probability that $T$ is much lower than $\mathrm{Ex}[T]$ is also exponentially
small.


18.7.7 Murphy’s Law


If the expectation of a random variable is much less than 1, then Markov's Theorem
implies that there is only a small probability that the variable has a value of 1 or
more. On the other hand, a result that we call Murphy's Law⁷ says that if a random
variable is an independent sum of 0-1-valued variables and has a large expectation,
then there is a huge probability of getting a value of at least 1.


Theorem 18.7.4 (Murphy's Law). Let $A_1, A_2, \ldots, A_n$ be mutually independent
events. Let $T_i$ be the indicator random variable for $A_i$ and define
$$T ::= T_1 + T_2 + \cdots + T_n$$
to be the number of events that occur. Then
$$\Pr[T = 0] \le e^{-\mathrm{Ex}[T]}.$$

Proof. Since $T = 0$ means that none of the events occur,
$$
\begin{aligned}
\Pr[T = 0] &= \Pr[\overline{A}_1 \wedge \overline{A}_2 \wedge \cdots \wedge \overline{A}_n] \\
&= \prod_{i=1}^{n} \Pr[\overline{A}_i] && \text{(by independence of the $A_i$)} \\
&= \prod_{i=1}^{n} \bigl(1 - \Pr[A_i]\bigr) \\
&\le \prod_{i=1}^{n} e^{-\Pr[A_i]} && \text{(since $1 - x \le e^{-x}$)} \\
&= e^{-\sum_{i=1}^{n} \Pr[A_i]} \\
&= e^{-\sum_{i=1}^{n} \mathrm{Ex}[T_i]} && \text{(since $T_i$ is an indicator for $A_i$)} \\
&= e^{-\mathrm{Ex}[T]} && \text{(linearity of expectation)} \qquad \blacksquare
\end{aligned}
$$
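As a quick numerical sanity check of the bound, the sketch below (with arbitrarily
chosen example probabilities $p_i$, not taken from the text) computes the exact value
$\Pr[T = 0] = \prod_i (1 - p_i)$ for independent events and compares it with the
Murphy's Law bound $e^{-\mathrm{Ex}[T]}$:

```python
import math

# Hypothetical probabilities for n = 10 mutually independent events A_i.
p = [0.1, 0.25, 0.05, 0.3, 0.15, 0.2, 0.1, 0.05, 0.35, 0.12]

# Exact Pr[T = 0]: every event must fail, so multiply the (1 - p_i).
pr_t_zero = math.prod(1 - pi for pi in p)

# Murphy's Law bound: e^{-Ex[T]}, where Ex[T] = sum of the p_i
# by linearity of expectation over the indicator variables T_i.
ex_t = sum(p)
bound = math.exp(-ex_t)

print(f"Pr[T = 0]  = {pr_t_zero:.5f}")
print(f"e^(-Ex[T]) = {bound:.5f}")
assert pr_t_zero <= bound  # the theorem's guarantee
```

For these probabilities $\mathrm{Ex}[T] = 1.67$, so the bound already forces
$\Pr[T = 0]$ below $e^{-1.67} \approx 0.19$; as $\mathrm{Ex}[T]$ grows, the bound
shrinks exponentially.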

⁷ This is in reference and deference to the famous saying that "If something can go wrong, it will
go wrong."
