Mathematics for Computer Science

17.5. Linearity of Expectation


indicator random variable for the ith coin coming up heads, that is,

\[
J_i ::= \begin{cases} 1 & \text{if the $i$th coin is heads,} \\ 0 & \text{if the $i$th coin is tails.} \end{cases}
\]

Then the number of heads is simply

\[
J = J_1 + J_2 + \cdots + J_n.
\]

By Theorem 17.5.4,

\[
\mathrm{Ex}[J] = \sum_{i=1}^{n} \Pr[J_i = 1] = pn. \tag{17.12}
\]

That really was easy. If we flip n mutually independent coins, we expect to get
pn heads. Hence the expected value of a binomial distribution with parameters n
and p is simply pn.
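As a sanity check on Ex[J] = pn, the expectation can be computed exactly by brute force, enumerating all 2^n outcomes of n independent p-biased flips (an illustrative script of ours, not part of the text; the function name `expected_heads` is our own):

```python
from fractions import Fraction
from itertools import product

def expected_heads(n, p):
    """Exact Ex[J] for n independent coins with heads-probability p,
    computed by enumerating all 2^n outcomes."""
    total = Fraction(0)
    for flips in product([0, 1], repeat=n):
        # Probability of this particular sequence of flips.
        prob = Fraction(1)
        for f in flips:
            prob *= p if f else (1 - p)
        # Weight the number of heads in this outcome by its probability.
        total += prob * sum(flips)
    return total

p = Fraction(1, 3)
n = 5
assert expected_heads(n, p) == p * n  # Ex[J] = pn
```

The brute-force sum matches pn exactly because we use rational arithmetic throughout, avoiding floating-point roundoff.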
But what if the coins are not mutually independent? It doesn't matter: the answer is still pn, because Linearity of Expectation and Theorem 17.5.4 do not assume any independence.
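A quick illustration of this point (our own example, not from the text): take the extreme case where all n "coins" are copies of a single flip, so they are as dependent as possible. The distribution of the number of heads changes drastically, yet its expectation is still pn:

```python
from fractions import Fraction

# n perfectly correlated coins: one flip, copied n times.
n = 7
p = Fraction(2, 5)

# The number of heads is n with probability p (all heads),
# and 0 with probability 1 - p (all tails).
ex_heads = p * n + (1 - p) * 0

# Linearity of Expectation still gives pn despite total dependence.
assert ex_heads == p * n
```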
If you are not yet convinced that Linearity of Expectation and Theorem 17.5.4
are powerful tools, consider this: without even trying, we have used them to prove
a complicated-looking identity, namely,


\[
\sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = pn, \tag{17.13}
\]

which follows by combining equations (17.11) and (17.12).^4
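Identity (17.13) can also be checked directly with exact arithmetic (a small verification script of ours; the helper name `binomial_mean` is our own):

```python
from fractions import Fraction
from math import comb

def binomial_mean(n, p):
    """Left-hand side of (17.13): sum over k of k * C(n,k) * p^k * (1-p)^(n-k)."""
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

# The sum equals pn for every n, as the identity claims.
p = Fraction(3, 7)
for n in range(10):
    assert binomial_mean(n, p) == p * n
```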
The next section has an even more convincing illustration of the power of linearity to solve a challenging problem.


(^4) Equation (17.13) may look daunting initially, but it is, after all, pretty similar to the binomial
identity, and that connection leads to a simple derivation by algebra. Namely, starting with the binomial
identity

\[
(x+y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k},
\]

we can differentiate with respect to x (as in Section 14.1.6) to get

\[
n(x+y)^{n-1} = \sum_{k=0}^{n} k \binom{n}{k} x^{k-1} y^{n-k}.
\]

Multiplying both sides by x gives

\[
x n (x+y)^{n-1} = \sum_{k=0}^{n} k \binom{n}{k} x^k y^{n-k}. \tag{17.14}
\]

Plugging p for x and 1 - p for y in (17.14) then yields (17.13).
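The differentiation identity (17.14) holds for arbitrary x and y, not just when they sum to 1, and that can be confirmed exactly at rational points (a verification sketch of ours, not from the text):

```python
from fractions import Fraction
from math import comb

# Check (17.14): x * n * (x+y)^(n-1) == sum_k k * C(n,k) * x^k * y^(n-k),
# at a point where x + y is deliberately not 1.
x, y = Fraction(2, 3), Fraction(5, 4)
for n in range(1, 12):
    lhs = x * n * (x + y) ** (n - 1)
    rhs = sum(k * comb(n, k) * x**k * y**(n - k) for k in range(n + 1))
    assert lhs == rhs
```

Setting x = p and y = 1 - p collapses the factor (x+y)^(n-1) to 1, which is exactly how (17.13) falls out of (17.14).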
