Introduction to Probability and Statistics for Engineers and Scientists

10.5 Two-Factor Analysis of Variance: Testing Hypotheses


$$\sum_{i=1}^{m}\alpha_i=\sum_{j=1}^{n}\beta_j=0$$

Since the sum of all the $\alpha_i$ is equal to 0, it follows that once we have estimated $m-1$ of the $\alpha_i$, then we have also estimated the final one. Hence, only $m-1$ parameters are to be estimated in order to determine all of the estimators $\hat\alpha_i$. For the same reason, only $n-1$ of the $\beta_j$ need be estimated to determine estimators for all $n$ of them. Because $\mu$ also must be estimated, we see that the number of parameters that need to be estimated is $1+(m-1)+(n-1)=n+m-1$. As a result, it follows that


$$\sum_{i=1}^{m}\sum_{j=1}^{n}(X_{ij}-\hat\mu-\hat\alpha_i-\hat\beta_j)^2/\sigma^2$$

is a chi-square random variable with $nm-(n+m-1)=(n-1)(m-1)$ degrees of freedom.
Since $\hat\mu=X_{..}$, $\hat\alpha_i=X_{i.}-X_{..}$, and $\hat\beta_j=X_{.j}-X_{..}$, it follows that $\hat\mu+\hat\alpha_i+\hat\beta_j=X_{i.}+X_{.j}-X_{..}$; thus,

$$\sum_{i=1}^{m}\sum_{j=1}^{n}(X_{ij}-X_{i.}-X_{.j}+X_{..})^2/\sigma^2 \tag{10.5.1}$$

is a chi-square random variable with $(n-1)(m-1)$ degrees of freedom.
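The estimators above can be checked numerically. The following is a minimal sketch using NumPy and a small 3 × 4 data matrix made up for illustration; it computes $\hat\mu$, $\hat\alpha_i$, and $\hat\beta_j$ from row, column, and grand means, and verifies the side conditions $\sum_i\hat\alpha_i=\sum_j\hat\beta_j=0$ that underlie the parameter count $n+m-1$.

```python
# Sketch of the two-factor ANOVA estimators on a hypothetical data matrix
# (m = 3 row levels, n = 4 column levels; the values are made up).
import numpy as np

X = np.array([[4.0, 6.0, 5.0, 7.0],
              [3.0, 5.0, 4.0, 6.0],
              [8.0, 9.0, 7.0, 10.0]])
m, n = X.shape

mu_hat = X.mean()              # X.. , the grand mean
row_means = X.mean(axis=1)     # X_i.
col_means = X.mean(axis=0)     # X_.j

alpha_hat = row_means - mu_hat # estimated row effects, X_i. - X..
beta_hat = col_means - mu_hat  # estimated column effects, X_.j - X..

# Side conditions: the estimated effects sum to zero, so only m - 1 of the
# alpha_i and n - 1 of the beta_j are free parameters.
assert abs(alpha_hat.sum()) < 1e-12
assert abs(beta_hat.sum()) < 1e-12

# Fitted cell mean: mu_hat + alpha_hat_i + beta_hat_j = X_i. + X_.j - X..
fitted = mu_hat + alpha_hat[:, None] + beta_hat[None, :]
```

The residuals `X - fitted` are exactly the terms $X_{ij}-X_{i.}-X_{.j}+X_{..}$ squared and summed in Equation 10.5.1.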


Definition

The statistic $SS_e$ defined by

$$SS_e=\sum_{i=1}^{m}\sum_{j=1}^{n}(X_{ij}-X_{i.}-X_{.j}+X_{..})^2$$

is called the *error sum of squares*.


If we think of the difference between a value and its estimated mean as being an "error," then $SS_e$ is equal to the sum of the squares of the errors. Since $SS_e/\sigma^2$ is just the expression in Equation 10.5.1, we see that $SS_e/\sigma^2$ is chi-square with $(n-1)(m-1)$ degrees of freedom. Because the expected value of a chi-square random variable is equal to its number of degrees of freedom, we have that

$$E[SS_e/\sigma^2]=(n-1)(m-1)$$

or

$$E\left[\frac{SS_e}{(n-1)(m-1)}\right]=\sigma^2$$

That is,

$$\frac{SS_e}{(n-1)(m-1)}$$

is an unbiased estimator of $\sigma^2$.
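The unbiasedness claim can be illustrated by simulation. The sketch below (NumPy; the model parameters $\mu$, $\alpha_i$, $\beta_j$, and $\sigma=2$ are made-up values for illustration) generates many data sets from the two-factor model $X_{ij}=\mu+\alpha_i+\beta_j+e_{ij}$ and averages $SS_e/[(n-1)(m-1)]$ across them; the average should land near the true $\sigma^2$.

```python
# Monte Carlo check that SSe / ((n-1)(m-1)) is unbiased for sigma^2,
# using hypothetical model parameters chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 5
sigma = 2.0                                  # true sigma^2 = 4
mu = 10.0
alpha = np.array([1.0, -1.0, 0.5, -0.5])     # row effects, sum to 0
beta = np.array([2.0, -2.0, 1.0, -1.0, 0.0]) # column effects, sum to 0

def sse(X):
    """Error sum of squares: sum of (X_ij - X_i. - X_.j + X..)^2."""
    row = X.mean(axis=1, keepdims=True)      # X_i.
    col = X.mean(axis=0, keepdims=True)      # X_.j
    grand = X.mean()                         # X..
    return ((X - row - col + grand) ** 2).sum()

reps = 2000
est = np.empty(reps)
for k in range(reps):
    X = mu + alpha[:, None] + beta[None, :] + rng.normal(0.0, sigma, size=(m, n))
    est[k] = sse(X) / ((n - 1) * (m - 1))

# The average estimate should be close to sigma^2 = 4.
print(est.mean())
```

Each replication's estimate is $\sigma^2/[(n-1)(m-1)]$ times a chi-square variable with $(n-1)(m-1)=12$ degrees of freedom, so the averages concentrate tightly around 4 as `reps` grows.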
Suppose now that we want to test the null hypothesis that there is no row effect; that is, we want to test

$$H_0:\ \text{all the }\alpha_i\text{ are equal to }0$$

against

$$H_1:\ \text{not all the }\alpha_i\text{ are equal to }0$$