Next,
\[
\begin{aligned}
\sum_{n=1}^{\infty} n^2(1-p)^{n-1}
&= \sum_{n=1}^{\infty} n(n-1)(1-p)^{n-1} + \sum_{n=1}^{\infty} n(1-p)^{n-1} \\
&= (1-p)\sum_{n=1}^{\infty} n(n-1)(1-p)^{n-2} + \sum_{n=1}^{\infty} n(1-p)^{n-1} \\
&= (1-p)\,\frac{d^2}{dx^2}\bigl(1 + x + x^2 + \cdots\bigr)\Big|_{x=1-p} + \frac{1}{p^2} \\
&= (1-p)\,\frac{d^2}{dx^2}\left(\frac{1}{1-x}\right)\Big|_{x=1-p} + \frac{1}{p^2} \\
&= \frac{2(1-p)}{p^3} + \frac{1}{p^2} \\
&= \frac{2-p}{p^3}.
\end{aligned}
\]
Multiplying by p, we obtain
\[
E(X^2) = p\sum_{n=1}^{\infty} n^2(1-p)^{n-1} = \frac{2-p}{p^2}.
\]
Therefore,
\[
\operatorname{Var}(X) = E(X^2) - \bigl(E(X)\bigr)^2 = \frac{2-p}{p^2} - \frac{1}{p^2} = \frac{1-p}{p^2}.
\]
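The series evaluation and the resulting variance can be checked numerically. The following is a minimal sketch (not part of the text), assuming an arbitrary value of p and a finite truncation of the infinite sums:

```python
# Numerical sanity check of the series evaluation above and of
# Var(X) = (1 - p)/p^2 for the geometric distribution P(X = n) = p(1 - p)^(n-1).
# Illustrative only; the truncation point N and the value of p are arbitrary choices.

N = 10_000          # truncation point for the infinite sums
p = 0.3             # an arbitrary success probability in (0, 1)

# Partial sum of the series evaluated in the derivation.
s2 = sum(n**2 * (1 - p)**(n - 1) for n in range(1, N + 1))
assert abs(s2 - (2 - p) / p**3) < 1e-6

# Moments of X computed directly from the pmf.
EX  = sum(n * p * (1 - p)**(n - 1) for n in range(1, N + 1))
EX2 = sum(n**2 * p * (1 - p)**(n - 1) for n in range(1, N + 1))
var = EX2 - EX**2
assert abs(var - (1 - p) / p**2) < 1e-6
print(var, (1 - p) / p**2)   # both approximately 7.777...
```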
6.1.5 The binomial distribution
In this situation we perform n independent trials, where each trial has two outcomes—call them success and failure. We shall let p be the probability of success on any trial, so that the probability of failure on any trial is 1 − p. The random variable X is the total number of successes out of the n trials. This implies, of course, that the distribution of X is summarized by writing
\[
P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}.
\]
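This pmf is easy to evaluate directly. The sketch below is an illustrative addition (the values of n and p are arbitrary choices); it computes the probabilities and checks that they sum to 1:

```python
# Evaluate the binomial pmf P(X = k) = C(n, k) p^k (1 - p)^(n - k)
# and confirm that the probabilities sum to 1.
from math import comb

n, p = 10, 0.4

pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
assert abs(sum(pmf) - 1) < 1e-12
print(pmf[3])   # P(X = 3) = C(10, 3) * 0.4^3 * 0.6^7, about 0.215
```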
The mean and variance of X are very easily computed once we realize that X can be expressed as a sum of n independent Bernoulli random variables. The Bernoulli random variable B is what models the tossing of a coin: it has outcomes 0 and 1 with probabilities 1 − p and p, respectively. Very simple calculations show that
\[
E(B) = p \quad\text{and}\quad \operatorname{Var}(B) = p(1-p).
\]
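These two facts, together with the representation of X as a sum of n independent Bernoulli variables, can be checked with a short simulation. The sketch below is an illustrative addition, assuming arbitrary values of n, p, and the sample count; by linearity and independence one expects the sample mean and variance of X to be close to np and np(1 − p):

```python
# Check E(B) = p and Var(B) = p(1 - p) from the two-point distribution,
# then simulate X = B_1 + ... + B_n as a sum of n independent Bernoulli trials
# and compare its sample moments with n*p and n*p*(1 - p).
# Illustrative only; n, p, and the number of samples are arbitrary choices.
import random

p, n, samples = 0.4, 10, 200_000

# Exact moments of a single Bernoulli(p) variable B.
EB   = 0 * (1 - p) + 1 * p
VarB = (0 - EB)**2 * (1 - p) + (1 - EB)**2 * p
assert abs(EB - p) < 1e-12 and abs(VarB - p * (1 - p)) < 1e-12

# Simulated sums of n independent Bernoulli trials.
xs = [sum(1 if random.random() < p else 0 for _ in range(n)) for _ in range(samples)]
mean = sum(xs) / samples
var  = sum((x - mean)**2 for x in xs) / samples
print(mean, n * p)            # both roughly 4.0
print(var,  n * p * (1 - p))  # both roughly 2.4
```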