18.7. Sums of Random Variables 643
In this case, the probability of exceeding the mean by $c \cdot \mathrm{Ex}[T]$ decreases as an exponentially small function of the deviation.
By considering the random variable $n - T$, we can also use the Chernoff Bound to prove that the probability that $T$ is much lower than $\mathrm{Ex}[T]$ is also exponentially small.
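As a quick numerical illustration (not part of the text), the exponential decay of the upper tail can be seen by computing exact tail probabilities for a sum of $n$ independent fair 0-1 variables, i.e. $T \sim \mathrm{Binomial}(n, 1/2)$. The parameters below ($n = 100$, deviations of 10--30\% of the mean) are chosen only for demonstration:

```python
from math import comb

# T is a sum of n independent fair 0-1 variables, so T ~ Binomial(n, 1/2)
# and Ex[T] = n/2. We compute exact upper-tail probabilities.
n = 100
mean = n // 2  # Ex[T] = 50

def upper_tail(k):
    """Exact Pr[T >= k] for T ~ Binomial(n, 1/2)."""
    return sum(comb(n, j) for j in range(k, n + 1)) / 2**n

# Probability of exceeding the mean by 10%, 20%, 30% of Ex[T]:
for frac in (0.1, 0.2, 0.3):
    k = mean + int(frac * mean)
    print(f"Pr[T >= {k}] = {upper_tail(k):.3e}")
```

Each additional 10\% of deviation shrinks the tail probability by roughly an order of magnitude, consistent with the exponentially small bound.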
18.7.7 Murphy’s Law
If the expectation of a random variable is much less than 1, then Markov’s Theorem
implies that there is only a small probability that the variable has a value of 1 or
more. On the other hand, a result that we call Murphy's Law^7 says that if a random
variable is an independent sum of 0-1-valued variables and has a large expectation,
then there is a huge probability of getting a value of at least 1.
Theorem 18.7.4 (Murphy's Law). Let $A_1, A_2, \ldots, A_n$ be mutually independent events. Let $T_i$ be the indicator random variable for $A_i$ and define
$$T ::= T_1 + T_2 + \cdots + T_n$$
to be the number of events that occur. Then
$$\Pr[T = 0] \le e^{-\mathrm{Ex}[T]}.$$
Proof.
\begin{align*}
\Pr[T = 0] &= \Pr[\overline{A_1} \wedge \overline{A_2} \wedge \cdots \wedge \overline{A_n}] \\
&= \prod_{i=1}^{n} \Pr[\overline{A_i}] && \text{(by independence of the $A_i$)} \\
&= \prod_{i=1}^{n} \left(1 - \Pr[A_i]\right) \\
&\le \prod_{i=1}^{n} e^{-\Pr[A_i]} && \text{(since $1 - x \le e^{-x}$)} \\
&= e^{-\sum_{i=1}^{n} \Pr[A_i]} \\
&= e^{-\sum_{i=1}^{n} \mathrm{Ex}[T_i]} && \text{(since $T_i$ is an indicator for $A_i$)} \\
&= e^{-\mathrm{Ex}[T]} && \text{(linearity of expectation)}
\end{align*}
$\blacksquare$
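As a quick numerical check (not from the text), the bound $\Pr[T = 0] \le e^{-\mathrm{Ex}[T]}$ can be verified directly for independent events, since then $\Pr[T = 0] = \prod_i (1 - \Pr[A_i])$ exactly. The example probabilities below (20 events, each with probability $0.1$, so $\mathrm{Ex}[T] = 2$) are chosen only for illustration:

```python
import math

def murphy_bound_check(probs):
    """For mutually independent events with Pr[A_i] = probs[i], return
    the exact Pr[T = 0] = prod(1 - p_i) and the Murphy's Law bound
    e^{-Ex[T]}, where Ex[T] = sum(p_i)."""
    pr_none = math.prod(1 - p for p in probs)
    bound = math.exp(-sum(probs))
    return pr_none, bound

# Example: 20 independent events, each with probability 0.1, so Ex[T] = 2.
pr_none, bound = murphy_bound_check([0.1] * 20)
print(f"Pr[T = 0] = {pr_none:.4f} <= e^(-Ex[T]) = {bound:.4f}")
```

Here $\Pr[T = 0] = 0.9^{20} \approx 0.12$, comfortably below $e^{-2} \approx 0.135$; the inequality in the proof is tight only when every $\Pr[A_i]$ is close to $0$.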
^7 This is in reference and deference to the famous saying that "If something can go wrong, it will go wrong."