Mathematics for Computer Science


18.5. Linearity of Expectation 765


*Proof.* Define $R_i$ to be the indicator random variable for $A_i$, where $R_i(\omega) = 1$ if $\omega \in A_i$ and $R_i(\omega) = 0$ if $\omega \notin A_i$. Let $R = R_1 + R_2 + \cdots + R_n$. Then
\[
\begin{aligned}
\operatorname{Ex}[R] &= \sum_{i=1}^{n} \operatorname{Ex}[R_i] && \text{(by Linearity of Expectation)}\\
&= \sum_{i=1}^{n} \Pr[R_i = 1] && \text{(by Lemma 18.4.2)}\\
&= \sum_{i=1}^{n} \Pr[A_i]. && \text{(def of indicator variable)}
\end{aligned}
\]

So whenever you are asked for the expected number of events that occur, all you
have to do is sum the probabilities that each event occurs. Independence is not
needed.
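As an illustrative sketch (not from the text), consider the classic fixed-point example: in a uniformly random permutation of $n$ items, let $A_i$ be the event that item $i$ lands in position $i$. Each $\Pr[A_i] = 1/n$, so the expected number of fixed points is $n \cdot (1/n) = 1$, even though the events $A_i$ are far from independent. The function names below are hypothetical.

```python
import random

def expected_fixed_points_formula(n):
    # Sum of Pr[A_i] over the n events, each with probability 1/n.
    # Linearity of expectation needs no independence assumption.
    return sum(1.0 / n for _ in range(n))

def simulate_fixed_points(n, trials=100_000, seed=0):
    # Monte Carlo estimate of the expected number of fixed points.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)
        total += sum(1 for i, v in enumerate(perm) if i == v)
    return total / trials

print(expected_fixed_points_formula(10))  # approximately 1.0
print(simulate_fixed_points(10))          # also close to 1.0
```

The simulation should agree with the sum-of-probabilities answer despite the dependence among the events.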


18.5.3 Expectation of a Binomial Distribution


Suppose that we independently flip $n$ biased coins, each with probability $p$ of coming up heads. What is the expected number of heads?

Let $J$ be the random variable denoting the number of heads. Then $J$ has a binomial distribution with parameters $n$, $p$, and


\[
\Pr[J = k] = \binom{n}{k} p^k (1-p)^{n-k}.
\]
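As a quick numerical sanity check (not part of the text), the probabilities $\Pr[J = k]$ for $k = 0, \ldots, n$ should sum to 1, by the Binomial Theorem. The function name below is hypothetical.

```python
from math import comb

def binomial_pmf(n, p, k):
    # Pr[J = k] = C(n, k) * p^k * (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probabilities over k = 0..n sum to 1 (up to floating-point rounding).
total = sum(binomial_pmf(10, 0.3, k) for k in range(11))
print(total)
```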

Applying equation (18.3), this means that


\[
\operatorname{Ex}[J] = \sum_{k=0}^{n} k \Pr[J = k] = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k}. \tag{18.11}
\]

This sum looks a tad nasty, but linearity of expectation leads to an easy derivation of a simple closed form. We just express $J$ as a sum of indicator random variables, which is easy. Namely, let $J_i$ be the indicator random variable for the $i$th coin coming up heads, that is,


\[
J_i ::= \begin{cases}
1 & \text{if the $i$th coin is heads}\\
0 & \text{if the $i$th coin is tails.}
\end{cases}
\]

Then the number of heads is simply
\[
J = J_1 + J_2 + \cdots + J_n.
\]
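Since each indicator satisfies $\operatorname{Ex}[J_i] = \Pr[J_i = 1] = p$, linearity of expectation gives $\operatorname{Ex}[J] = np$. As a sketch (function names hypothetical), we can check that this shortcut agrees with the direct sum in equation (18.11):

```python
from math import comb

def expected_heads_direct(n, p):
    # Equation (18.11): Ex[J] = sum over k of k * Pr[J = k].
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

def expected_heads_linearity(n, p):
    # Linearity of expectation: Ex[J] = Ex[J_1] + ... + Ex[J_n] = n * p.
    return n * p

print(expected_heads_direct(20, 0.3))     # approximately 6.0
print(expected_heads_linearity(20, 0.3))  # 6.0
```

The two computations agree, but the linearity route avoids evaluating the binomial sum entirely.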