30.9 IMPORTANT CONTINUOUS DISTRIBUTIONS
Now using $\Phi(-z) = 1 - \Phi(z)$ gives
\[
\Phi\left(\frac{\mu - 140}{\sigma}\right) = 1 - 0.030 = 0.970.
\]
Using table 30.3 again, we find
\[
\frac{\mu - 140}{\sigma} = 1.88. \tag{30.113}
\]
Solving the simultaneous equations (30.112) and (30.113) gives $\mu = 173.5$, $\sigma = 17.8$.
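As a quick numerical check (a minimal sketch using SciPy, with the quoted solution values; the choice of package is ours, not the text's), the fitted parameters reproduce both the standardised value in (30.113) and the tail probability of 0.030 used above:

```python
from scipy.stats import norm

mu, sigma = 173.5, 17.8        # solution quoted above
z = (mu - 140) / sigma         # standardised value appearing in (30.113)
print(round(z, 2))             # approximately 1.88
print(round(norm.cdf(-z), 3))  # approximately 0.030, the tail probability used above
```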
The moment generating function for the Gaussian distribution
Using the definition of the MGF (30.85),
\[
M_X(t) = E\left[e^{tX}\right]
= \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}
\exp\left[tx - \frac{(x-\mu)^2}{2\sigma^2}\right] dx
= c \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right),
\]
where the final equality is established by completing the square in the argument
of the exponential and writing
\[
c = \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}
\exp\left\{-\frac{[x - (\mu + \sigma^2 t)]^2}{2\sigma^2}\right\} dx.
\]
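Explicitly, the completing-the-square step rewrites the argument of the exponential as
\[
tx - \frac{(x-\mu)^2}{2\sigma^2}
= -\frac{[x - (\mu + \sigma^2 t)]^2}{2\sigma^2} + \mu t + \tfrac{1}{2}\sigma^2 t^2,
\]
as may be verified by expanding the square on the right-hand side.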
However, the final integral is simply the normalisation integral for the Gaussian
distribution, and so $c = 1$ and the MGF is given by
\[
M_X(t) = \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right). \tag{30.114}
\]
We showed in subsection 30.7.2 that this MGF leads to $E[X] = \mu$ and $V[X] = \sigma^2$,
as required.
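As an illustration, (30.114) can be checked numerically by comparing a Monte Carlo estimate of $E[e^{tX}]$ with the closed form. This is only a sketch; the parameter values below are illustrative choices rather than values from the text.

```python
import numpy as np

# Monte Carlo check of (30.114): estimate E[exp(t*X)] for X ~ N(mu, sigma^2)
# and compare with exp(mu*t + 0.5*sigma^2*t^2). Parameter values are illustrative.
rng = np.random.default_rng(0)
mu, sigma, t = 1.0, 2.0, 0.3
x = rng.normal(mu, sigma, size=1_000_000)
print(np.exp(t * x).mean())                    # Monte Carlo estimate of the MGF at t
print(np.exp(mu * t + 0.5 * sigma**2 * t**2))  # closed form from (30.114)
```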
Gaussian approximation to the binomial distribution
We may consider the Gaussian distribution as the limit of the binomial distribution
when the number of trials $n \to \infty$ but the probability of a success $p$ remains
finite, so that $np \to \infty$ also. (This contrasts with the Poisson distribution, which
corresponds to the limit $n \to \infty$ and $p \to 0$ with $np = \lambda$ remaining finite.) In
other words, a Gaussian distribution results when an experiment with a finite
probability of success is repeated a large number of times. We now show how
this Gaussian limit arises.
The binomial probability function gives the probability of $x$ successes in $n$ trials as
\[
f(x) = \frac{n!}{x!(n-x)!}\, p^x (1-p)^{n-x}.
\]
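Before carrying out the limit analytically, it is instructive to compare the two distributions numerically. The sketch below (our illustration, with arbitrarily chosen $n$ and $p$) sets the Gaussian mean and variance to the binomial values $np$ and $np(1-p)$ and tabulates both functions; the agreement improves as $n$ grows.

```python
from scipy.stats import binom, norm

# Compare binomial probabilities with a Gaussian of matching mean and variance.
# n and p are illustrative choices.
n, p = 100, 0.4
mu, sigma = n * p, (n * p * (1 - p)) ** 0.5
for x in range(30, 51, 5):
    print(x, binom.pmf(x, n, p), norm.pdf(x, mu, sigma))
```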
Taking the limit as $n \to \infty$ (and $x \to \infty$) we may approximate the factorials by
Stirling's approximation
\[
n! \sim \sqrt{2\pi n}\left(\frac{n}{e}\right)^n
\]
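As a rough numerical illustration of Stirling's approximation (a sketch of our own; the sample values of $n$ are arbitrary), the ratio $n!/[\sqrt{2\pi n}\,(n/e)^n]$ approaches 1 as $n$ increases:

```python
from math import factorial, pi, e, sqrt

# Ratio of n! to Stirling's approximation; it tends to 1 as n grows.
for n in (5, 10, 20, 50):
    stirling = sqrt(2 * pi * n) * (n / e) ** n
    print(n, factorial(n) / stirling)
```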