Advanced High-School Mathematics


376 CHAPTER 6 Inferential Statistics


the sum is divided by n instead of n−1. While the resulting statistic is a biased estimate of the population variance, it does enjoy the property of being what's called a maximum-likelihood estimate of the population variance. A fuller treatment of this can be found in any reasonably advanced statistics textbook.)


Naturally, if we take a sample of size n from a population having mean μ and variance σ^2, we would expect that the sample mean and variance would at least approximate μ and σ^2, respectively. In practice, however, given a population we rarely know the population mean and variance; we use the statistics x̄ and s_x^2 in order to estimate them (or to make hypotheses about them).
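As a concrete illustration, the following short Python sketch computes x̄ and s_x^2 for a small sample; the data values are entirely hypothetical, chosen only to show the arithmetic:

```python
# Sample statistics as estimates of the unknown population mean and variance.
# The data values below are hypothetical, for illustration only.
data = [4.1, 5.3, 6.0, 4.8, 5.6, 5.1, 4.9, 5.4]
n = len(data)

xbar = sum(data) / n                               # sample mean, estimates mu
s2 = sum((x - xbar) ** 2 for x in data) / (n - 1)  # sample variance, estimates sigma^2

print(xbar, s2)
```

Note the divisor n−1 rather than n; the derivation below shows why this choice makes s_x^2 an unbiased estimate of σ^2.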


$$
\begin{aligned}
\sum_{i=1}^{n}(X_i-\mu)^2 &= \sum_{i=1}^{n}\bigl[(X_i-\overline{X})+(\overline{X}-\mu)\bigr]^2\\
&= \sum_{i=1}^{n}\bigl[(X_i-\overline{X})^2 + 2(X_i-\overline{X})(\overline{X}-\mu) + (\overline{X}-\mu)^2\bigr]\\
&= \sum_{i=1}^{n}(X_i-\overline{X})^2 + n(\overline{X}-\mu)^2
\qquad\Bigl(\text{since } \sum_{i=1}^{n}(X_i-\overline{X}) = 0\Bigr).
\end{aligned}
$$
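This algebraic identity can be checked numerically; in the sketch below the data and the value of μ are hypothetical, and the two sides of the identity agree exactly:

```python
# Numerical check of the identity
#   sum (X_i - mu)^2 = sum (X_i - Xbar)^2 + n*(Xbar - mu)^2.
# Both the data and mu are hypothetical values chosen for illustration.
data = [2.0, 3.5, 1.0, 4.5, 3.0]
mu = 2.5                      # a hypothetical population mean
n = len(data)
xbar = sum(data) / n

lhs = sum((x - mu) ** 2 for x in data)
rhs = sum((x - xbar) ** 2 for x in data) + n * (xbar - mu) ** 2
print(lhs, rhs)               # the two sides coincide
```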

Next, since E(X̄) = μ, we have E(n(X̄−μ)^2) = nE((X̄−μ)^2) = nVar(X̄) = n · (σ^2/n) = σ^2. We now take the expectation of both sides of the identity above:


$$
\begin{aligned}
n\sigma^2 &= E\Bigl(\sum_{i=1}^{n}(X_i-\mu)^2\Bigr)\\
&= E\Bigl(\sum_{i=1}^{n}(X_i-\overline{X})^2 + n(\overline{X}-\mu)^2\Bigr)\\
&= E\Bigl(\sum_{i=1}^{n}(X_i-\overline{X})^2\Bigr) + E\bigl(n(\overline{X}-\mu)^2\bigr)\\
&= E\Bigl(\sum_{i=1}^{n}(X_i-\overline{X})^2\Bigr) + \sigma^2,
\end{aligned}
$$

from which we see that


$$
E\Bigl(\sum_{i=1}^{n}(X_i-\overline{X})^2\Bigr) = (n-1)\sigma^2.
$$

Therefore, we finally arrive at the desired conclusion:


$$
E\Bigl(\frac{1}{n-1}\sum_{i=1}^{n}(X_i-\overline{X})^2\Bigr) = \sigma^2.
$$
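This conclusion lends itself to a Monte Carlo check. The sketch below (with an arbitrarily chosen population: normal, μ = 0, σ = 2, so σ^2 = 4) averages both estimators over many samples; dividing by n−1 recovers σ^2 on average, while dividing by n systematically underestimates it by the factor (n−1)/n:

```python
import random

# Monte Carlo check that dividing by n-1 gives an (approximately) unbiased
# estimate of sigma^2, while dividing by n underestimates it.
# The population (normal, mu = 0, sigma = 2) is chosen only for illustration.
random.seed(0)
n, trials = 5, 200_000
sum_unbiased = sum_biased = 0.0
for _ in range(trials):
    sample = [random.gauss(0, 2) for _ in range(n)]
    xbar = sum(sample) / n
    ss = sum((x - xbar) ** 2 for x in sample)
    sum_unbiased += ss / (n - 1)   # divide by n-1
    sum_biased += ss / n           # divide by n

print(sum_unbiased / trials)  # close to sigma^2 = 4
print(sum_biased / trials)    # close to (n-1)/n * 4 = 3.2
```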