10.5 Two-Factor Analysis of Variance: Testing Hypotheses


Consider the two-factor model in which one has data $X_{ij}$, $i = 1, \ldots, m$ and $j = 1, \ldots, n$. These data are assumed to be independent normal random variables with a common variance $\sigma^2$ and with mean values satisfying

$$E[X_{ij}] = \mu + \alpha_i + \beta_j$$

where
$$\sum_{i=1}^{m} \alpha_i = \sum_{j=1}^{n} \beta_j = 0$$
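
To make the model concrete, here is a minimal simulation sketch in Python/NumPy. All numerical values below (the choices of $m$, $n$, $\mu$, the $\alpha_i$, the $\beta_j$, and $\sigma$) are illustrative and do not come from the text; the only features taken from the model are the zero-sum constraints and the common variance $\sigma^2$.

```python
import numpy as np

# A minimal simulation sketch of the model E[X_ij] = mu + alpha_i + beta_j.
# All numerical values below are illustrative choices, not taken from the text.
rng = np.random.default_rng(0)

m, n = 4, 5                                      # row and column factor levels
mu = 10.0                                        # overall mean
alpha = np.array([1.5, -0.5, 0.0, -1.0])         # row effects, constrained to sum to 0
beta = np.array([2.0, 1.0, 0.0, -1.0, -2.0])     # column effects, constrained to sum to 0
sigma = 2.0                                      # common standard deviation

assert np.isclose(alpha.sum(), 0.0) and np.isclose(beta.sum(), 0.0)

# X_ij = mu + alpha_i + beta_j + independent N(0, sigma^2) noise
X = mu + alpha[:, None] + beta[None, :] + rng.normal(0.0, sigma, size=(m, n))
print(X.round(2))
```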

In this section, we will be concerned with testing the hypothesis

$$H_0: \text{all } \alpha_i = 0$$

against

$$H_1: \text{not all the } \alpha_i \text{ are equal to } 0$$

This null hypothesis states that there is no row effect, in that the value of a datum is not affected by its row factor level.

We will also be interested in testing the analogous hypothesis for columns, that is,

$$H_0: \text{all } \beta_j \text{ are equal to } 0$$

against

$$H_1: \text{not all } \beta_j \text{ are equal to } 0$$

To obtain tests for the above null hypotheses, we will apply the analysis of variance approach, in which two different estimators are derived for the variance $\sigma^2$. The first will always be a valid estimator, whereas the second will only be a valid estimator when the null hypothesis is true. In addition, the second estimator will tend to overestimate $\sigma^2$ when the null hypothesis is not true.
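
For readers who want to see where this leads in practice, the following sketch computes the row and column $F$ ratios of the standard two-factor ANOVA table directly in NumPy. The quantities used (the error mean square as the always-valid estimator of $\sigma^2$, and the row and column mean squares as the estimators that are valid only under the corresponding null hypothesis) are the usual textbook ones, stated here without the derivation that this section goes on to develop.

```python
import numpy as np
from scipy.stats import f


def two_factor_f_tests(X):
    """Row and column F tests for a two-factor layout with one observation per cell."""
    m, n = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)          # X-bar_i.
    col_means = X.mean(axis=0)          # X-bar_.j

    # Sums of squares for rows, columns, and error (residual).
    ss_row = n * np.sum((row_means - grand) ** 2)
    ss_col = m * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((X - row_means[:, None] - col_means[None, :] + grand) ** 2)

    # Mean squares: ms_err is always a valid estimator of sigma^2;
    # ms_row (ms_col) is valid only when the row (column) null hypothesis holds,
    # and tends to be larger when it does not.
    ms_row = ss_row / (m - 1)
    ms_col = ss_col / (n - 1)
    ms_err = ss_err / ((m - 1) * (n - 1))

    F_row, F_col = ms_row / ms_err, ms_col / ms_err
    p_row = f.sf(F_row, m - 1, (m - 1) * (n - 1))
    p_col = f.sf(F_col, n - 1, (m - 1) * (n - 1))
    return (F_row, p_row), (F_col, p_col)


# Example usage with the simulated X from the earlier sketch:
# (F_row, p_row), (F_col, p_col) = two_factor_f_tests(X)
```

A large value of the row $F$ ratio (equivalently, a small p-value) argues against $H_0$: all $\alpha_i = 0$, because the numerator mean square tends to overestimate $\sigma^2$ when the row effects are not all zero, while the denominator estimates $\sigma^2$ regardless.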

To obtain our first estimator of $\sigma^2$, we start with the fact that

$$\sum_{i=1}^{m} \sum_{j=1}^{n} (X_{ij} - E[X_{ij}])^2/\sigma^2 = \sum_{i=1}^{m} \sum_{j=1}^{n} (X_{ij} - \mu - \alpha_i - \beta_j)^2/\sigma^2$$

is chi-square with $nm$ degrees of freedom. If in the above expression we now replace the unknown parameters $\mu, \alpha_1, \alpha_2, \ldots, \alpha_m, \beta_1, \beta_2, \ldots, \beta_n$ by their estimators $\hat{\mu}, \hat{\alpha}_1, \hat{\alpha}_2, \ldots, \hat{\alpha}_m, \hat{\beta}_1, \hat{\beta}_2, \ldots, \hat{\beta}_n$, then it turns out that the resulting expression will remain chi-square but will lose 1 degree of freedom for each parameter that is estimated. To determine how many parameters are to be estimated, we must be careful to remember that the constraints $\sum_{i=1}^{m} \alpha_i = 0$ and $\sum_{j=1}^{n} \beta_j = 0$ mean that only $m - 1$ of the $\alpha_i$ and $n - 1$ of the $\beta_j$ need to be estimated.
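
As a quick empirical check of the $nm$-degrees-of-freedom fact, the sketch below simulates the statistic with the true parameters plugged in (reusing the illustrative parameter values from the earlier snippet) and compares its sample mean and variance with those of a chi-square random variable with $nm$ degrees of freedom.

```python
import numpy as np
from scipy.stats import chi2

# Monte Carlo sanity check: with the TRUE parameters plugged in,
# sum_ij (X_ij - mu - alpha_i - beta_j)^2 / sigma^2 should be chi-square with n*m df.
# The parameter values are the same illustrative ones as in the earlier sketch.
rng = np.random.default_rng(1)
m, n = 4, 5
mu, sigma = 10.0, 2.0
alpha = np.array([1.5, -0.5, 0.0, -1.0])
beta = np.array([2.0, 1.0, 0.0, -1.0, -2.0])

reps = 20_000
stats = np.empty(reps)
for r in range(reps):
    X = mu + alpha[:, None] + beta[None, :] + rng.normal(0.0, sigma, size=(m, n))
    stats[r] = np.sum((X - mu - alpha[:, None] - beta[None, :]) ** 2) / sigma**2

# A chi-square(nm) variable has mean nm and variance 2nm (20 and 40 here).
print(stats.mean(), stats.var())
print(chi2(df=m * n).mean(), chi2(df=m * n).var())
```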
