The regression analysis of this model would produce

    R²_t,c,ct = .8238
If there is no significant difference in within-treatment regressions (that is, if the regression lines are parallel, and thus the slopes of the regression lines that could be calculated for each group separately are homogeneous, a condition called homogeneity of regression), the deletion of the interaction term should produce only a trivial decrement in the percentage of accountable variation. When we delete the CT terms, we have

    R²_t,c = .8042
The F test on this decrement is the usual F test on the difference between two models:

    F(f − r, N − f − 1) = (N − f − 1)(R²_t,c,ct − R²_t,c) / [(f − r)(1 − R²_t,c,ct)]
                        = (47 − 9 − 1)(.8238 − .8042) / [(4)(.1762)]
                        = 1.03
Given an F of 1.03 on 4 and 37 degrees of freedom, we have no basis to reject the assumption of homogeneity of regression (common regression coefficients) within the five treatments. Thus, we can proceed with the analysis on the basis of the revised full model that does not include the covariate by treatment interaction:

    Y_ij = μ + τ_j + c + e_ij
This model will serve as the basis against which we compare reduced models.
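The model-comparison F test above can be sketched in a few lines of Python, plugging in the R² values quoted in the text. The values N = 47, f = 9 (covariate, four treatment dummies, and four interaction terms in the full model), and r = 5 (after the interaction terms are deleted) are taken from the example's degrees of freedom; the function name is ours.

```python
def model_comparison_f(r2_full, r2_reduced, n, f, r):
    """F test on the decrement in R^2 when predictors are deleted.

    f = number of predictors in the full model,
    r = number of predictors in the reduced model,
    n = total number of cases.
    """
    return ((n - f - 1) * (r2_full - r2_reduced)) / ((f - r) * (1 - r2_full))

# Homogeneity-of-regression test from the text:
f_stat = model_comparison_f(r2_full=0.8238, r2_reduced=0.8042, n=47, f=9, r=5)
print(round(f_stat, 2))  # F(4, 37) = 1.03
```

The same function applies to any nested pair of regression models, which is why the text calls it "the usual F test on the difference between two models."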
The three sets of results of the multiple-regression solutions using (1) the covariate and
dummy treatment variables, (2) just the treatment variables, and then (3) just the covariates
are presented in Table 16.8.
From Table 16.8 you can see that using both the covariate (Pre) and the group membership dummy variates (T_1 ... T_4), the sum of squares for regression (SS_regression) is equal to 82.6435, which is the portion of the total variation that can be accounted for by these two sets of predictors. You can also see that the residual sum of squares (SS_residual) is 20.1254, which is the variability that cannot be predicted. In our analysis of covariance summary table, this will become the sum of squares for error.
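As a quick check, these two sums of squares partition the total variation, so R² for the covariate-plus-treatment model can be recovered from them. A minimal sketch (the total of 102.7689 is implied by, not stated in, the text):

```python
# Sums of squares from Table 16.8 for the covariate + treatment model.
ss_regression = 82.6435
ss_residual = 20.1254

# SS_regression and SS_residual together account for all the variation.
ss_total = ss_regression + ss_residual
r_squared = ss_regression / ss_total
print(round(r_squared, 4))  # .8042, matching R²_t,c from the text
```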
When we remove the dummy group membership variates from the equation and use only the covariate (Pre) as a predictor, SS_regression drops from 82.6435 to 73.4196. The difference between SS_regression with and without the group membership predictors must be the amount of the sum of squares that can be attributed to treatment over and above the amount that can be explained by the covariate. For our data, this is

    SS_treat(adj) = SS_regression_t,c − SS_regression_c
                  = 82.6435 − 73.4196
                  = 9.2239

This last value is called the adjusted treatment sum of squares for the analysis of covariance, because it has been adjusted for any effects of the covariate. In this case, it has been adjusted for the fact that the five groups differed on the pretest measure.
We need one additional term to form our analysis of covariance summary table, and
that is the sum of squares to be attributed to the covariate. There are a number of different
ways to define this term, but the most common is to define it analogously to the way the
adjusted treatment effect was defined. We will attribute to the covariate that portion of the
Section 16.5 The One-Way Analysis of Covariance 603