PROBABILITY
30.5.3 Variance and standard deviation
The variance of a distribution, $V[X]$, also written $\sigma^2$, is defined by

$$
V[X] = E\left[(X-\mu)^2\right] =
\begin{cases}
\sum_j (x_j-\mu)^2\,f(x_j) & \text{for a discrete distribution,}\\[4pt]
\int (x-\mu)^2 f(x)\,dx & \text{for a continuous distribution.}
\end{cases}
\tag{30.48}
$$
Here $\mu$ has been written for the expectation value $E[X]$ of $X$. As in the case of the mean, unless the series and the integral in (30.48) converge the distribution does not have a variance. From the definition (30.48) we may easily derive the following useful properties of $V[X]$. If $a$ and $b$ are constants then

(i) $V[a] = 0$,
(ii) $V[aX+b] = a^2\,V[X]$.

The variance of a distribution is always non-negative; its non-negative square root is known as the standard deviation of the distribution and is often denoted by $\sigma$. Roughly speaking, $\sigma$ measures the spread (about $x = \mu$) of the values that $X$ can assume.
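As an illustrative check (a hypothetical example, not from the text), the discrete form of definition (30.48) and property (ii) can be verified numerically for a fair six-sided die:

```python
# Variance of a discrete distribution via definition (30.48),
# illustrated with a fair six-sided die (hypothetical example).

xs = [1, 2, 3, 4, 5, 6]
f = {x: 1 / 6 for x in xs}                  # probability mass function f(x_j)

mu = sum(x * f[x] for x in xs)              # mean mu = E[X] = 3.5
var = sum((x - mu) ** 2 * f[x] for x in xs) # V[X] = E[(X - mu)^2] = 35/12
sigma = var ** 0.5                          # standard deviation

# Property (ii): V[aX + b] = a^2 V[X], checked here with a = 2, b = 7
a, b = 2, 7
ys = [a * x + b for x in xs]
mu_y = sum(y * (1 / 6) for y in ys)
var_y = sum((y - mu_y) ** 2 * (1 / 6) for y in ys)
assert abs(var_y - a ** 2 * var) < 1e-9
```

Property (i) follows the same way: a "distribution" concentrated on a single value $a$ has $\mu = a$, so every term $(x_j - \mu)^2$ vanishes.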
Find the standard deviation of the PDF for the distance from the origin of the electron whose wavefunction was discussed in the previous two examples.
Inserting the expression (30.47) for the PDF $f(r)$ into (30.48), the variance of the random variable $R$ is given by

$$
V[R] = \int_0^\infty (r-\mu)^2\,\frac{4r^2}{a_0^3}\,e^{-2r/a_0}\,dr
     = \frac{4}{a_0^3}\int_0^\infty \left(r^4 - 2r^3\mu + r^2\mu^2\right) e^{-2r/a_0}\,dr,
$$

where the mean $\mu = E[R] = 3a_0/2$. Integrating each term in the integrand by parts we obtain

$$
V[R] = 3a_0^2 - 3\mu a_0 + \mu^2 = \frac{3a_0^2}{4}.
$$

Thus the standard deviation of the distribution is $\sigma = \sqrt{3}\,a_0/2$.
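Each integral above is of the standard form $\int_0^\infty r^n e^{-2r/a_0}\,dr = n!\,(a_0/2)^{n+1}$, so the result can be checked exactly in a few lines. The sketch below (setting $a_0 = 1$ for convenience) uses the equivalent form $V[R] = E[R^2] - \mu^2$, which follows by expanding $(R-\mu)^2$ in (30.48):

```python
# Exact check of the worked example: the hydrogen ground-state radial PDF
# is f(r) = (4 r^2 / a0^3) e^{-2r/a0}, and every moment reduces to the
# standard integral  int_0^inf r^n e^{-2r/a0} dr = n! (a0/2)^(n+1).

from math import factorial, sqrt, isclose

a0 = 1.0  # Bohr radius, set to 1 for convenience

def moment(n):
    """E[R^n] = (4/a0^3) * int_0^inf r^(n+2) e^(-2r/a0) dr."""
    return (4 / a0 ** 3) * factorial(n + 2) * (a0 / 2) ** (n + 3)

mu = moment(1)              # E[R]   = 3 a0 / 2
var = moment(2) - mu ** 2   # V[R]   = E[R^2] - mu^2 = 3 a0^2 / 4
sigma = sqrt(var)           # sigma  = sqrt(3) a0 / 2

assert isclose(mu, 1.5 * a0)
assert isclose(var, 0.75 * a0 ** 2)
assert isclose(sigma, sqrt(3) * a0 / 2)
```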
We may also use the definition (30.48) to derive the Bienaymé–Chebyshev inequality, which provides a useful upper limit on the probability that the random variable $X$ takes values outside a given range centred on the mean. Let us consider the case of a continuous random variable, for which

$$
\Pr(|X-\mu| \ge c) = \int_{|x-\mu|\ge c} f(x)\,dx,
$$

where the integral on the RHS extends over all values of $x$ satisfying the inequality