30.8 IMPORTANT DISCRETE DISTRIBUTIONS
Thus
\[
M_i(t) = E\left[e^{tX_i}\right]
       = e^{0\cdot t}\,\Pr(X_i=0) + e^{1\cdot t}\,\Pr(X_i=1)
       = 1\times q + e^t\times p
       = pe^t + q.
\]
From (30.89), it follows that the MGF for the binomial distribution is given by
\[
M(t) = \prod_{i=1}^{n} M_i(t) = (pe^t + q)^n. \tag{30.96}
\]
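As an informal numerical cross-check of (30.96), the sketch below evaluates $E[e^{tX}]$ directly from the binomial PMF and compares it with the closed form $(pe^t+q)^n$. The parameter values $n=10$, $p=0.3$ and the use of `scipy.stats.binom` are illustrative assumptions, not part of the text.

```python
# Minimal sketch: numerical check of the binomial MGF (30.96).
# Assumed illustrative values: n = 10, p = 0.3 (not from the text).
import numpy as np
from scipy.stats import binom

n, p = 10, 0.3
q = 1 - p

k = np.arange(n + 1)              # possible values of X ~ Bin(n, p)
pmf = binom.pmf(k, n, p)          # Pr(X = k)

for t in np.linspace(-1.0, 1.0, 5):
    mgf_direct = np.sum(np.exp(t * k) * pmf)   # E[e^{tX}] summed over the PMF
    mgf_closed = (p * np.exp(t) + q) ** n      # closed form (p e^t + q)^n
    assert np.isclose(mgf_direct, mgf_closed)
```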
We can now use the moment generating function to derive the mean and
variance of the binomial distribution. From (30.96)
\[
M'(t) = npe^t(pe^t + q)^{n-1},
\]
and from (30.86)
\[
E[X] = M'(0) = np(p+q)^{n-1} = np,
\]
where the last equality follows from $p+q=1$.
Differentiating with respect to $t$ once more gives
\[
M''(t) = e^{2t}n(n-1)p^2(pe^t + q)^{n-2} + e^t np(pe^t + q)^{n-1},
\]
and from (30.86)
\[
E[X^2] = M''(0) = n^2p^2 - np^2 + np.
\]
Thus, using (30.87)
\[
V[X] = M''(0) - \left[M'(0)\right]^2 = n^2p^2 - np^2 + np - n^2p^2 = np(1-p) = npq.
\]
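These moments can also be checked numerically: the sketch below approximates $M'(0)$ and $M''(0)$ by central finite differences of the closed-form MGF and compares the resulting mean and variance with $np$ and $npq$. The step size $h$ and the values $n=10$, $p=0.3$ are illustrative assumptions.

```python
# Minimal sketch: E[X] = M'(0) = np and V[X] = M''(0) - [M'(0)]^2 = npq.
# Assumed illustrative values: n = 10, p = 0.3; derivatives approximated by finite differences.
import numpy as np

n, p = 10, 0.3
q = 1 - p
h = 1e-4                                   # finite-difference step (assumption)

def M(t):
    """Binomial MGF from (30.96)."""
    return (p * np.exp(t) + q) ** n

M1 = (M(h) - M(-h)) / (2 * h)              # central difference for M'(0)
M2 = (M(h) - 2 * M(0) + M(-h)) / h**2      # central difference for M''(0)

assert np.isclose(M1, n * p, rtol=1e-6)               # E[X] = np
assert np.isclose(M2 - M1**2, n * p * q, rtol=1e-3)   # V[X] = npq
```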
Multiple binomial distributions
Suppose $X$ and $Y$ are two independent random variables, both of which are
described by binomial distributions with a common probability of success $p$, but
with (in general) different numbers of trials $n_1$ and $n_2$, so that $X\sim \mathrm{Bin}(n_1,p)$
and $Y\sim \mathrm{Bin}(n_2,p)$. Now consider the random variable $Z=X+Y$. We could
calculate the probability distribution of $Z$ directly using (30.60), but it is much
easier to use the MGF (30.96).
Since $X$ and $Y$ are independent random variables, the MGF $M_Z(t)$ of the new
variable $Z=X+Y$ is given simply by the product of the individual MGFs
$M_X(t)$ and $M_Y(t)$. Thus, we obtain
\[
M_Z(t) = M_X(t)M_Y(t) = (pe^t + q)^{n_1}(pe^t + q)^{n_2} = (pe^t + q)^{n_1+n_2},
\]
which we recognise as the MGF of $Z\sim \mathrm{Bin}(n_1+n_2,p)$. Hence $Z$ is also described
by a binomial distribution.
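A quick numerical illustration of this closure property, under the assumed values $n_1=4$, $n_2=6$, $p=0.3$: the PMF of $Z=X+Y$ is obtained by convolving the two binomial PMFs (valid because $X$ and $Y$ are independent) and compared with the PMF of $\mathrm{Bin}(n_1+n_2,p)$.

```python
# Minimal sketch: the sum of independent Bin(n1, p) and Bin(n2, p) variables is Bin(n1+n2, p).
# Assumed illustrative values: n1 = 4, n2 = 6, p = 0.3 (not from the text).
import numpy as np
from scipy.stats import binom

n1, n2, p = 4, 6, 0.3

pmf_x = binom.pmf(np.arange(n1 + 1), n1, p)      # Pr(X = k) for X ~ Bin(n1, p)
pmf_y = binom.pmf(np.arange(n2 + 1), n2, p)      # Pr(Y = k) for Y ~ Bin(n2, p)

pmf_z = np.convolve(pmf_x, pmf_y)                # PMF of Z = X + Y by discrete convolution
pmf_target = binom.pmf(np.arange(n1 + n2 + 1), n1 + n2, p)

assert np.allclose(pmf_z, pmf_target)            # Z ~ Bin(n1 + n2, p)
```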
This result may be extended to any number of binomial distributions. If $X_i$,