Definition

The statistic


SSW = \sum_{i=1}^{m} \sum_{j=1}^{n} (X_{ij} - X_{i.})^2

is called the within samples sum of squares because it is obtained by substituting the
sample means X_i. for the population means in expression 10.3. The statistic


SSW/(nm - m)

is an estimator of σ^2.
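
As a concrete illustration, here is a minimal Python sketch (the array name data, the use of NumPy, and the simulated values are assumptions made purely for illustration) that arranges the m samples as the rows of an m x n array and computes SSW together with the within-samples estimate SSW/(nm - m) of σ^2.

import numpy as np

# Hypothetical data: m = 3 samples (rows), each of size n = 5 (columns).
rng = np.random.default_rng(0)
data = rng.normal(loc=[[10.0], [12.0], [11.0]], scale=2.0, size=(3, 5))

m, n = data.shape
sample_means = data.mean(axis=1, keepdims=True)   # the sample means X_i.

# Within samples sum of squares: sum over i and j of (X_ij - X_i.)^2.
SSW = ((data - sample_means) ** 2).sum()

# SSW/(nm - m) is the within-samples estimate of sigma^2; it is valid
# whether or not the null hypothesis holds.
print(SSW, SSW / (n * m - m))

Note that this estimate does not require H_0 to be true, since each squared deviation is taken about its own sample mean X_i..
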
Our second estimator of σ^2 will only be a valid estimator when the null hypothesis is
true. So let us assume that H_0 is true and so all the population means μ_i are equal, say,
μ_i = μ for all i. Under this condition it follows that the m sample means X_1., X_2., ..., X_m.
will all be normally distributed with the same mean μ and the same variance σ^2/n. Hence,
the sum of squares of the m standardized variables


\frac{X_{i.} - \mu}{\sqrt{\sigma^2/n}} = \sqrt{n}\,(X_{i.} - \mu)/\sigma

will be a chi-square random variable with m degrees of freedom. That is, when H_0 is true,


n \sum_{i=1}^{m} (X_{i.} - \mu)^2 / \sigma^2 \sim \chi^2_m \qquad (10.3.3)
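
A quick simulation can make this distributional claim concrete. The sketch below is a minimal check under assumed values of m, n, μ, and σ (all chosen arbitrarily for illustration): it repeatedly draws m samples of size n with a common mean, forms the statistic in (10.3.3), and compares its empirical mean and variance with the values m and 2m of a chi-square random variable with m degrees of freedom.

import numpy as np

m, n = 4, 10                   # number of samples and common sample size (arbitrary)
mu, sigma = 5.0, 2.0           # common mean and standard deviation under H_0
rng = np.random.default_rng(1)

stats = []
for _ in range(20_000):
    x = rng.normal(mu, sigma, size=(m, n))      # H_0 true: every mean equals mu
    sample_means = x.mean(axis=1)               # X_1., ..., X_m.
    stats.append(n * np.sum((sample_means - mu) ** 2) / sigma**2)

# A chi-square random variable with m degrees of freedom has mean m and variance 2m.
print(np.mean(stats), np.var(stats))            # expect values near 4 and 8 here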

Now, when all the population means are equal to μ, then the estimator of μ is the average
of all the nm data values. That is, the estimator of μ is X.., given by


X_{..} = \frac{\sum_{i=1}^{m} \sum_{j=1}^{n} X_{ij}}{nm} = \frac{\sum_{i=1}^{m} X_{i.}}{m}
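
For instance, a small self-contained sketch (using a hypothetical 3 x 5 data array whose values are purely illustrative) confirms that the two forms of X.. agree, which they do because every sample here has the same size n.

import numpy as np

# Hypothetical 3 x 5 data array (rows are the samples); values are illustrative only.
data = np.arange(15.0).reshape(3, 5)
m, n = data.shape

grand_mean_all = data.sum() / (n * m)            # average of all nm observations
grand_mean_rows = data.mean(axis=1).sum() / m    # average of the m sample means X_i.

print(grand_mean_all, grand_mean_rows)           # both equal 7.0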

If we now substitute X.. for the unknown parameter μ in expression 10.3.3, it follows,
when H_0 is true, that the resulting quantity


n \sum_{i=1}^{m} (X_{i.} - X_{..})^2 / \sigma^2
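
To tie the two pieces together, the following minimal sketch (again with arbitrarily chosen simulated data under H_0; all names and values are assumptions for illustration) evaluates both the statistic in (10.3.3), which uses the true mean μ, and the corresponding quantity obtained by substituting X.. for μ.

import numpy as np

m, n = 4, 10
mu, sigma = 5.0, 2.0
rng = np.random.default_rng(2)

x = rng.normal(mu, sigma, size=(m, n))    # H_0 true: every population mean equals mu
sample_means = x.mean(axis=1)             # X_i.
grand_mean = sample_means.mean()          # X.., since every sample has size n

# The quantity in (10.3.3), which uses the true mean mu ...
with_mu = n * np.sum((sample_means - mu) ** 2) / sigma**2
# ... and the quantity obtained by substituting X.. for mu.
with_grand_mean = n * np.sum((sample_means - grand_mean) ** 2) / sigma**2

print(with_mu, with_grand_mean)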