Mathematics for Computer Science


Chapter 17 Random Variables


For example, $A_i$ could be the event that the $i$th man gets the right hat back. But
in general, it could be any subset of the sample space, and we are asking for the
expected number of events that will contain a random sample point.


Proof. Define $R_i$ to be the indicator random variable for $A_i$, where $R_i(\omega) = 1$ if
$\omega \in A_i$ and $R_i(\omega) = 0$ if $\omega \notin A_i$. Let $R = R_1 + R_2 + \cdots + R_n$. Then


$$
\begin{aligned}
\operatorname{Ex}[R] &= \sum_{i=1}^{n} \operatorname{Ex}[R_i] && \text{(by Linearity of Expectation)} \\
&= \sum_{i=1}^{n} \Pr[R_i = 1] && \text{(by Lemma 17.4.2)} \\
&= \sum_{i=1}^{n} \Pr[A_i]. && \text{(def of indicator variable)}
\end{aligned}
$$

So whenever you are asked for the expected number of events that occur, all you
have to do is sum the probabilities that each event occurs. Independence is not
needed.
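To see the theorem in action on the hat-check example above, a short simulation can estimate the expected number of men who get their own hat back. Since $\Pr[A_i] = 1/n$ for each of the $n$ men, the theorem predicts $\operatorname{Ex}[R] = n \cdot (1/n) = 1$ regardless of $n$. This is a minimal Python sketch (the function name and trial count are illustrative, not from the text):

```python
import random

def expected_fixed_points(n, trials=100_000):
    """Estimate the expected number of men who get their own hat back
    when n hats are handed back in a uniformly random order."""
    total = 0
    for _ in range(trials):
        # A random permutation models the random return of hats:
        # position i holding value i means man i got his own hat.
        hats = list(range(n))
        random.shuffle(hats)
        total += sum(1 for i, h in enumerate(hats) if i == h)
    return total / trials
```

For any $n$, the estimate should come out close to 1, matching the sum of the (non-independent!) event probabilities.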


17.5.3 Expectation of a Binomial Distribution


Suppose that we independently flip $n$ biased coins, each with probability $p$ of coming
up heads. What is the expected number of heads?

Let $J$ be the random variable denoting the number of heads. Then $J$ has a
binomial distribution with parameters $n$, $p$, and


$$
\Pr[J = k] = \binom{n}{k} p^k (1-p)^{n-k}.
$$

Applying equation (17.2), this means that


$$
\operatorname{Ex}[J] = \sum_{k=0}^{n} k \Pr[J = k] = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k}. \tag{17.11}
$$

Ouch! This is one nasty looking sum. Let’s try another approach.
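Nasty as it looks, the sum (17.11) is easy to evaluate numerically for particular $n$ and $p$, which gives a useful sanity check on whatever closed form we derive. A brute-force Python sketch (the function name is an illustration, not from the text):

```python
from math import comb

def binomial_expectation_sum(n, p):
    """Evaluate Ex[J] by brute force: sum of k * C(n,k) * p^k * (1-p)^(n-k)
    over k = 0, 1, ..., n, exactly as in equation (17.11)."""
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
```

Trying a few values of $n$ and $p$, the sum always comes out to $np$, the well-known expectation of a binomial distribution, and the indicator-variable approach below explains why.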
Since we have just learned about linearity of expectation for sums of indicator
random variables, maybe Theorem 17.5.4 will be helpful. But how do we express $J$
as a sum of indicator random variables? It turns out to be easy. Let $J_i$ be the
