Introduction to Probability and Statistics for Engineers and Scientists


Chapter 10: Analysis of Variance


To obtain a second estimator of \(\sigma^2\), consider the row averages \(X_{i\cdot}\), \(i = 1, \ldots, m\). Note that, when \(H_0\) is true, each \(\alpha_i\) is equal to 0, and so

\[
E[X_{i\cdot}] = \mu + \alpha_i = \mu
\]

Because each \(X_{i\cdot}\) is the average of \(n\) random variables, each having variance \(\sigma^2\), it follows that

\[
\mathrm{Var}(X_{i\cdot}) = \sigma^2/n
\]

Thus, we see that when \(H_0\) is true,

\[
\sum_{i=1}^{m} \left(X_{i\cdot} - E[X_{i\cdot}]\right)^2 / \mathrm{Var}(X_{i\cdot}) = n \sum_{i=1}^{m} \left(X_{i\cdot} - \mu\right)^2 / \sigma^2
\]

will be chi-square with \(m\) degrees of freedom. If we now substitute \(X_{\cdot\cdot}\) (the estimator of \(\mu\)) for \(\mu\) in the above, then the resulting expression will remain chi-square, but with 1 less degree of freedom. We thus have the following:


When \(H_0\) is true,

\[
n \sum_{i=1}^{m} \left(X_{i\cdot} - X_{\cdot\cdot}\right)^2 / \sigma^2
\]

is chi-square with \(m - 1\) degrees of freedom.
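The loss of one degree of freedom from estimating \(\mu\) can be checked numerically. The following simulation (my sketch, not from the text; the constants \(m\), \(n\), \(\mu\), \(\sigma\) are arbitrary choices) averages both scaled sums of squares under \(H_0\): with \(\mu\) known the average is near \(m\), and with \(X_{\cdot\cdot}\) substituted for \(\mu\) it drops to near \(m - 1\):

```python
# Monte Carlo check of the degrees-of-freedom claim under H0 (all alpha_i = 0).
# A chi-square random variable with k degrees of freedom has mean k.
import numpy as np

rng = np.random.default_rng(2)
m, n, mu, sigma, reps = 5, 8, 10.0, 2.0, 20000  # arbitrary illustrative values
with_mu = with_xbar = 0.0
for _ in range(reps):
    X = mu + rng.normal(0.0, sigma, size=(m, n))   # H0: no row effects
    row_means = X.mean(axis=1)                     # X_i.
    # n * sum_i (X_i. - mu)^2 / sigma^2 : chi-square with m df
    with_mu += n * np.sum((row_means - mu) ** 2) / sigma**2 / reps
    # n * sum_i (X_i. - X..)^2 / sigma^2 : chi-square with m - 1 df
    with_xbar += n * np.sum((row_means - X.mean()) ** 2) / sigma**2 / reps

print(with_mu, with_xbar)   # near m = 5 and m - 1 = 4, respectively
```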


Definition

The statistic \(SS_r\) is defined by

\[
SS_r = n \sum_{i=1}^{m} \left(X_{i\cdot} - X_{\cdot\cdot}\right)^2
\]

and is called the *row sum of squares*.
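As a small illustration of the definition (my sketch, not from the text; the data-generating constants are arbitrary), the function below computes \(SS_r\) directly from an \(m \times n\) data matrix, and a simulation under \(H_0\) confirms that \(SS_r/(m-1)\) averages to \(\sigma^2\):

```python
# SS_r computed exactly as defined: n times the sum of squared deviations
# of the row means from the grand mean.
import numpy as np

def row_sum_of_squares(X):
    """SS_r = n * sum_i (X_i. - X..)^2 for an m-by-n data matrix X."""
    m, n = X.shape
    row_means = X.mean(axis=1)      # X_i.
    grand_mean = X.mean()           # X..
    return n * np.sum((row_means - grand_mean) ** 2)

rng = np.random.default_rng(0)
m, n, mu, sigma, reps = 5, 8, 10.0, 2.0, 20000  # arbitrary illustrative values
vals = np.empty(reps)
for r in range(reps):
    X = mu + rng.normal(0.0, sigma, size=(m, n))   # H0: all alpha_i = 0
    vals[r] = row_sum_of_squares(X) / (m - 1)

print(vals.mean())   # near sigma^2 = 4.0 under H0
```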


We saw earlier that when \(H_0\) is true, \(SS_r/\sigma^2\) is chi-square with \(m - 1\) degrees of freedom. As a result, when \(H_0\) is true,

\[
E[SS_r/\sigma^2] = m - 1
\]

or, equivalently,

\[
E[SS_r/(m-1)] = \sigma^2
\]

In addition, it can be shown that \(SS_r/(m-1)\) will tend to be larger than \(\sigma^2\) when \(H_0\) is not true. Thus, once again we have obtained two estimators of \(\sigma^2\). The first estimator, \(SS_e/[(n-1)(m-1)]\), is a valid estimator whether or not the null hypothesis is true, whereas the second estimator, \(SS_r/(m-1)\), is only a valid estimator of \(\sigma^2\) when \(H_0\) is true and tends to be larger than \(\sigma^2\) when \(H_0\) is not true.
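The contrast between the two estimators can be seen in a simulation (my sketch, not from the text; the row effects \(\alpha_i\) and other constants are arbitrary choices). In the two-factor model with one observation per cell, \(SS_e = \sum_i \sum_j (X_{ij} - X_{i\cdot} - X_{\cdot j} + X_{\cdot\cdot})^2\), and \(SS_e/[(n-1)(m-1)]\) stays near \(\sigma^2\) whether or not \(H_0\) holds, while \(SS_r/(m-1)\) inflates once the \(\alpha_i\) are nonzero:

```python
# Compare SS_e/((n-1)(m-1)) and SS_r/(m-1) as estimators of sigma^2,
# with H0 true (alpha_i = 0) and with H0 false (nonzero row effects).
import numpy as np

def anova_estimates(X):
    """Return (SS_e/((n-1)(m-1)), SS_r/(m-1)) for an m-by-n matrix X."""
    m, n = X.shape
    row = X.mean(axis=1, keepdims=True)     # X_i.
    col = X.mean(axis=0, keepdims=True)     # X_.j
    grand = X.mean()                        # X..
    ss_e = np.sum((X - row - col + grand) ** 2)
    ss_r = n * np.sum((row.ravel() - grand) ** 2)
    return ss_e / ((n - 1) * (m - 1)), ss_r / (m - 1)

rng = np.random.default_rng(1)
m, n, sigma, reps = 5, 8, 2.0, 10000        # arbitrary illustrative values
alphas = {"H0 true": np.zeros(m),
          "H0 false": np.array([-2.0, -1.0, 0.0, 1.0, 2.0])}  # sum to 0
results = {}
for label, alpha in alphas.items():
    e_est = r_est = 0.0
    for _ in range(reps):
        X = alpha[:, None] + rng.normal(0.0, sigma, size=(m, n))
        a, b = anova_estimates(X)
        e_est += a / reps
        r_est += b / reps
    results[label] = (e_est, r_est)
    print(label, round(e_est, 2), round(r_est, 2))
```

Under "H0 true" both averages settle near \(\sigma^2 = 4\); under "H0 false" the \(SS_e\)-based estimate stays near 4 while the \(SS_r\)-based one is much larger, which is exactly why their ratio can serve as a test statistic for \(H_0\).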
