3.8 Measures of Dispersion

(i.e., are independent and identically distributed normal random variables), the unbiasedness of S² actually holds more generally. We shall now discuss some properties of the mean and standard deviation to explain why these two measures are sometimes preferred. This is again illustrated nicely by a few cartoons (Figs. 3.8 and 3.9).
We call this an empirical rule because it was discovered by looking at mound-shaped data: roughly 68%, 95%, and 99.7% of the observations fall within one, two, and three standard deviations of the mean, respectively. It works because mound-shaped data look approximately like samples from the normal distribution, and the normal distribution has exactly those percentages given in the rule. If a distribution has a variance,* the Chebyshev inequality gives a lower bound on the percentage of cases within k standard deviations of the mean, namely 100(1 − 1/k²)%.
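
To make the comparison concrete, here is a minimal sketch (not from the text, and assuming SciPy is available) that computes the exact normal-distribution percentages behind the empirical rule alongside Chebyshev's guaranteed lower bound of 1 − 1/k².

    # Compare the empirical rule (exact normal percentages) with the
    # Chebyshev lower bound 1 - 1/k^2; assumes SciPy is installed.
    from scipy.stats import norm

    for k in (1, 2, 3):
        normal_pct = norm.cdf(k) - norm.cdf(-k)  # probability within k SDs for a normal distribution
        chebyshev_lb = 1 - 1 / k**2              # Chebyshev bound (trivial, i.e., 0, at k = 1)
        print(f"k = {k}: normal {normal_pct:.1%}, Chebyshev lower bound {chebyshev_lb:.1%}")

This prints roughly 68.3%, 95.4%, and 99.7% for the normal distribution, versus guaranteed minimums of 0%, 75%, and 88.9% from Chebyshev's inequality, which holds for any distribution with a finite variance.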



  * A variance is defined for any finite population or finite sample. However, if a distribution
    has an infinite range, the distribution (or infinite population) does not necessarily have a
    finite variance. We require μ = ∫ x f(x) dx < ∞ and σ² = ∫ (x − μ)² f(x) dx < ∞ for the
    distribution with density f to have a finite variance.


Figure 3.9. The empirical rule for mound-shaped distributions (taken from
the Cartoon Guide to Statistics with permission).