You should recall that in Chapter 9 we saw that

$$r^2 = \frac{SS_{total} - SS_{residual}}{SS_{total}} = \frac{\sum (Y_{ij} - \bar{Y})^2 - \sum (Y_{ij} - \hat{Y}_{ij})^2}{\sum (Y_{ij} - \bar{Y})^2}$$

(Don't be confused by the fact that we routinely represent the dependent variable in regression discussions as Y, and the dependent variable in analysis of variance discussions as X. It really makes no difference what we call them.) We can apply this formula to the case of multiple groups by realizing that for each group the predicted score for subjects in that group is the group mean. Thus we can replace $\hat{Y}_{ij}$ with $\bar{Y}_j$. Doing this allows us to rewrite the above equation as follows, substituting $\eta^2$ for $r^2$:

$$\eta^2 = \frac{SS_{total} - SS_{error}}{SS_{total}} = \frac{\sum (Y_{ij} - \bar{Y})^2 - \sum (Y_{ij} - \bar{Y}_j)^2}{\sum (Y_{ij} - \bar{Y})^2}$$

Note that I have relabeled $SS_{residual}$ as $SS_{error}$ in line with the terminology we use in talking about the analysis of variance, and substituted $\eta^2$ for $r^2$. Since $SS_{total} - SS_{error}$ is really $SS_{treatment}$, we can rewrite the last expression as

$$\eta^2 = \frac{SS_{treatment}}{SS_{total}}$$
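The key step above is that the group mean serves as each subject's predicted score. As a minimal numerical sketch of that step (using made-up scores, not the data of Table 11.2), the code below confirms that $\sum (Y_{ij} - \bar{Y})^2 - \sum (Y_{ij} - \bar{Y}_j)^2$ is exactly the between-groups sum of squares $\sum n_j(\bar{Y}_j - \bar{Y})^2$, that is, $SS_{treatment}$:

```python
# Check on made-up data: when each subject's predicted score is the
# group mean, SS_total - SS_error equals SS_treatment.

groups = {
    "g1": [3.0, 5.0, 4.0, 6.0],
    "g2": [7.0, 9.0, 8.0, 10.0],
    "g3": [2.0, 4.0, 3.0, 5.0],
}

all_scores = [y for ys in groups.values() for y in ys]
grand_mean = sum(all_scores) / len(all_scores)

# SS_total: squared deviations of every score from the grand mean
ss_total = sum((y - grand_mean) ** 2 for y in all_scores)

# SS_error: squared deviations of each score from its own group mean
ss_error = sum(
    (y - sum(ys) / len(ys)) ** 2 for ys in groups.values() for y in ys
)

# SS_treatment computed directly from the group means
ss_treatment = sum(
    len(ys) * (sum(ys) / len(ys) - grand_mean) ** 2 for ys in groups.values()
)

print(ss_total - ss_error, ss_treatment)  # 56.0 56.0
```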
We have now defined $\eta^2$ in terms of the sums of squares in the summary table of our analysis of variance.⁷ Applying $\eta^2$ to Eysenck's data in Table 11.2 we have

$$\eta^2 = \frac{SS_{treatment}}{SS_{total}} = \frac{351.52}{786.82} = .447$$
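In code the calculation is a one-liner; a sketch using the two sums of squares just quoted from the summary table:

```python
# Eta-squared from the summary-table values quoted in the text
ss_treatment = 351.52
ss_total = 786.82

eta_squared = ss_treatment / ss_total
print(round(eta_squared, 3))  # 0.447
```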
The equation for $\eta^2$ provides a simple way to estimate the maximum squared correlation between the independent variable and the dependent variable.⁸ Its derivation also
Figure 11.5 Scatter diagram of data in Table 11.2 (Recall, 0–25, plotted against Group, 1–5)
⁷ You will often see eta-squared given in computer printouts, such as SPSS General Linear Model, though it is usually labeled R².
⁸ Niko Tiliopoulus, of Queen Margaret University College, has pointed out that if you only have the F statistic and its degrees of freedom, you can calculate $\eta^2$ directly as $\eta^2 = \dfrac{1}{1 + \dfrac{df_{error}}{F \times df_{treatment}}}$.
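As a sketch of footnote 8's formula: the degrees of freedom below (4 and 45, i.e., five groups of ten subjects) are an assumption about the layout of Table 11.2, not values quoted in this section; F is reconstructed from the sums of squares given in the text, and the result matches the .447 obtained from $SS_{treatment}/SS_{total}$.

```python
# Eta-squared from F and its degrees of freedom (footnote 8's formula).
# The df values are an ASSUMED layout (5 groups, n = 10 each), not
# quoted in this section; F is rebuilt from the text's sums of squares.

ss_treatment = 351.52
ss_total = 786.82
df_treatment, df_error = 4, 45  # assumed

ss_error = ss_total - ss_treatment
f_stat = (ss_treatment / df_treatment) / (ss_error / df_error)

eta_squared = 1 / (1 + df_error / (f_stat * df_treatment))
print(round(eta_squared, 3))  # 0.447, matching SS_treatment / SS_total
```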