458 Chapter 10: Analysis of Variance
10.5 Two-Factor Analysis of Variance: Testing Hypotheses
Consider the two-factor model in which one has data X_ij, i = 1, ..., m and j = 1, ..., n.
These data are assumed to be independent normal random variables with a common
variance σ^2 and with mean values satisfying

    E[X_ij] = μ + α_i + β_j

where

    ∑_{i=1}^m α_i = ∑_{j=1}^n β_j = 0

In this section, we will be concerned with testing the hypothesis

    H_0: all α_i = 0

against

    H_1: not all the α_i are equal to 0

This null hypothesis states that there is no row effect, in that the value of a datum is not
affected by its row factor level.
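To make the model concrete, here is a minimal sketch that simulates data satisfying it; the values of μ, σ, and the effect vectors are hypothetical, chosen only so that the two constraints hold:

```python
import numpy as np

rng = np.random.default_rng(1)

mu, sigma = 10.0, 2.0
alpha = np.array([1.5, -0.5, -1.0])       # row effects; hypothetical, sum to 0
beta = np.array([2.0, 0.0, -1.0, -1.0])   # column effects; hypothetical, sum to 0
m, n = len(alpha), len(beta)

# E[X_ij] = mu + alpha_i + beta_j; add independent N(0, sigma^2) noise
mean = mu + alpha[:, None] + beta[None, :]
x = mean + rng.normal(0.0, sigma, size=(m, n))
```

Each entry x[i, j] is then an independent normal variable with the stated mean structure and common variance σ^2.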
We will also be interested in testing the analogous hypothesis for columns, that is,

    H_0: all β_j are equal to 0

against

    H_1: not all β_j are equal to 0
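Anticipating the tests that this approach leads to, the following sketch computes the F statistics for the row and column hypotheses from the standard two-way sum-of-squares decomposition (the function name and sample data are ours, for illustration only):

```python
import numpy as np

def two_way_anova(x):
    """F statistics for row and column effects in the additive
    two-factor model E[X_ij] = mu + alpha_i + beta_j, one
    observation per cell."""
    x = np.asarray(x, dtype=float)
    m, n = x.shape
    grand = x.mean()
    row = x.mean(axis=1)                         # row averages X_i.
    col = x.mean(axis=0)                         # column averages X_.j
    ss_row = n * np.sum((row - grand) ** 2)      # m - 1 degrees of freedom
    ss_col = m * np.sum((col - grand) ** 2)      # n - 1 degrees of freedom
    resid = x - row[:, None] - col[None, :] + grand
    ss_err = np.sum(resid ** 2)                  # (m-1)(n-1) degrees of freedom
    ms_err = ss_err / ((m - 1) * (n - 1))
    f_row = (ss_row / (m - 1)) / ms_err          # large when some alpha_i != 0
    f_col = (ss_col / (n - 1)) / ms_err          # large when some beta_j != 0
    return f_row, f_col

f_row, f_col = two_way_anova([[1, 2, 3], [5, 6, 10]])
```

Large values of f_row (respectively f_col) are evidence against the null hypothesis of no row (respectively column) effect.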
To obtain tests for the above null hypotheses, we will apply the analysis of variance
approach, in which two different estimators are derived for the variance σ^2. The first will
always be a valid estimator, whereas the second will be a valid estimator only when the null
hypothesis is true. In addition, the second estimator will tend to overestimate σ^2 when
the null hypothesis is not true.
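This overestimation can be seen by simulation. The sketch below uses the between-rows mean square as the second estimator (the specific effect sizes are hypothetical): its average is near σ^2 when all α_i = 0, but well above σ^2 otherwise.

```python
import numpy as np

rng = np.random.default_rng(0)

def ms_row(x):
    """Between-rows mean square, n * sum_i (X_i. - X..)^2 / (m - 1):
    estimates sigma^2 without bias only when all alpha_i = 0."""
    m, n = x.shape
    row = x.mean(axis=1)
    return n * np.sum((row - x.mean()) ** 2) / (m - 1)

m, n, sigma = 3, 4, 1.0
reps = 10_000

# Null hypothesis true: no row effects
h0 = np.mean([ms_row(rng.normal(0.0, sigma, size=(m, n)))
              for _ in range(reps)])

# Null hypothesis false: row effects alpha = (-1, 0, 1), hypothetical values
alpha = np.array([-1.0, 0.0, 1.0])
h1 = np.mean([ms_row(alpha[:, None] + rng.normal(0.0, sigma, size=(m, n)))
              for _ in range(reps)])
```

Here h0 averages close to σ^2 = 1, while h1 averages close to σ^2 + n ∑ α_i^2 / (m − 1) = 5, illustrating the upward bias when the null hypothesis fails.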
To obtain our first estimator of σ^2, we start with the fact that

    ∑_{i=1}^m ∑_{j=1}^n (X_ij − E[X_ij])^2 / σ^2 = ∑_{i=1}^m ∑_{j=1}^n (X_ij − μ − α_i − β_j)^2 / σ^2

is chi-square with nm degrees of freedom. If in the above expression we now
replace the unknown parameters μ, α_1, α_2, ..., α_m, β_1, β_2, ..., β_n by their estimators
μ̂, α̂_1, α̂_2, ..., α̂_m, β̂_1, β̂_2, ..., β̂_n, then it turns out that the resulting expression will remain
chi-square but will lose 1 degree of freedom for each parameter that is estimated. To determine how many parameters are to be estimated, we must be careful to remember that