Exhibit 16.2 (continued)

Parameter Estimates

Variable    DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept    1              9.34375           0.42581       21.94      <.0001
A1           1             –0.40625           0.42581       –0.95      0.3496
B1           1             –1.34375           0.73753       –1.82      0.0809
B2           1             –3.34375           0.73753       –4.53      0.0001
B3           1              1.65625           0.73753        2.25      0.0342
AB11         1             –0.34375           0.73753       –0.47      0.6454
AB12         1             –1.34375           0.73753       –1.82      0.0809
AB13         1              0.65625           0.73753        0.89      0.3824
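Because the predictors use effect (1, 0, –1) coding with equal cell sizes, these estimates have a direct interpretation: the intercept (9.34375) estimates the grand mean, and each coefficient estimates a deviation from it. A quick illustrative check in Python (a sketch added here, not part of the SAS output; it assumes the effect coding just described):

```python
# With effect coding and equal n, the intercept estimates the grand mean
# and each coefficient estimates a treatment effect (deviation from it).
intercept, a1 = 9.34375, -0.40625   # values from Exhibit 16.2

mean_A1 = intercept + a1  # estimated marginal mean at level A1: 8.9375
mean_A2 = intercept - a1  # level A2 carries the code -1, so:     9.7500
print(mean_A1, mean_A2)
```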

Reduced Models


At this point we know only the amount of variation that can be accounted for by all of the predictors simultaneously. What we wish to know is how this variation can be partitioned among A, B, and AB. This information can be readily obtained by computing several reduced regression equations.
Since in the subsequent course of the analysis we must compute several multiple-regression sums of squares relating to the different effects, we will change our notation and use the effect labels (a, b, and ab) as subscripts. For the multiple regression just computed, the model contained variables to account for a, b, and ab. Thus we will designate the sum of squares regression in that solution as SSregression(a,b,ab). If we dropped the last three predictors (AB11, AB12, and AB13) we would be deleting those predictors carrying information concerning the interaction but would retain those predictors concerned with a and b. Thus, we would use the designation SSregression(a,b). If we used only A1, AB11, AB12, and AB13 as predictors, the model would account for only a and ab, and the result would be denoted SSregression(a,ab).
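To see where the sums of squares in what follows come from, here is a minimal Python sketch of effect coding and the full and reduced regressions. The scores below are random placeholders, not the data of this example, and the helper names are invented for illustration; substitute the actual 2 × 4 cell scores to reproduce the values reported next.

```python
# A minimal sketch, assuming numpy. It builds effect-coded predictors for
# a 2 x 4 factorial (n = 4 per cell) and computes SSregression for the
# full and reduced models. The scores are random placeholders.
import numpy as np

rng = np.random.default_rng(0)

a_levels, b_levels, n = 2, 4, 4
A = np.repeat(np.arange(a_levels), b_levels * n)          # level of A per score
B = np.tile(np.repeat(np.arange(b_levels), n), a_levels)  # level of B per score
y = rng.normal(loc=9.34, scale=2.0, size=a_levels * b_levels * n)  # placeholder Y

def effect_codes(factor, k):
    """Effect (1/0/-1) coding: k - 1 columns, with the last level coded -1."""
    cols = []
    for j in range(k - 1):
        col = np.where(factor == j, 1.0, 0.0)
        col[factor == k - 1] = -1.0
        cols.append(col)
    return np.column_stack(cols)

A1 = effect_codes(A, a_levels)    # predictor A1
Bs = effect_codes(B, b_levels)    # predictors B1, B2, B3
ABs = np.column_stack([A1[:, 0] * Bs[:, j] for j in range(b_levels - 1)])  # AB11-AB13

def ss_regression(X, y):
    """SSregression = SStotal - SSresidual for the model with intercept and X."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    ss_residual = np.sum((y - X1 @ beta) ** 2)
    return np.sum((y - y.mean()) ** 2) - ss_residual

print("SSregression(a,b,ab) =", ss_regression(np.column_stack([A1, Bs, ABs]), y))
print("SSregression(a,b)    =", ss_regression(np.column_stack([A1, Bs]), y))
print("SSregression(a,ab)   =", ss_regression(np.column_stack([A1, ABs]), y))
print("SSregression(b,ab)   =", ss_regression(np.column_stack([Bs, ABs]), y))
```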
I have run the individual regression solutions for our example, and the results are

SSregression(a,b,ab) = 231.969
SSregression(a,b) = 204.625
SSregression(a,ab) = 32.635
SSregression(b,ab) = 226.687

Now this is the important part. If the interaction term accounts for any of the variation in Y, then removing the interaction predictors from the model should lead to a decrease in accountable variation. This decrease will be equal to the variation attributable to the interaction. By this and similar reasoning,

SSAB = SSregression(a,b,ab) − SSregression(a,b)
SSA = SSregression(a,b,ab) − SSregression(b,ab)
SSB = SSregression(a,b,ab) − SSregression(a,ab)

The relevant calculations are presented in Table 16.3. (I leave it to you to verify that these are the sums of squares for regression that result when we use the relevant predictors.)
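In code form, using the regression sums of squares reported above, the subtractions amount to nothing more than:

```python
# Regression sums of squares from the individual solutions above.
ss_a_b_ab = 231.969   # SSregression(a,b,ab): full model
ss_a_b    = 204.625   # SSregression(a,b):   interaction dropped
ss_a_ab   = 32.635    # SSregression(a,ab):  B dropped
ss_b_ab   = 226.687   # SSregression(b,ab):  A dropped

SS_AB = ss_a_b_ab - ss_a_b   # 27.344
SS_A  = ss_a_b_ab - ss_b_ab  # 5.282
SS_B  = ss_a_b_ab - ss_a_ab  # 199.334
print(SS_AB, SS_A, SS_B)
```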
