Statistical Methods for Psychology

of the difference. In previous chapters I have made a distinction between the d-family of
measures, which relate directly to differences among means, and the r-family of measures,
which are based on correlations between the independent and dependent variables. When
we are considering the omnibus F, which looks for any differences among the full set of
means, d-family measures may or may not be appropriate, although they do exist and we
will discuss them shortly. They will become very appropriate, however, when we discuss
individual comparisons in Chapter 12. The r-family of measures is often recommended for
the omnibus test of all means, and that is what I will focus on first. I must admit, however,
that I don’t find r-family measures particularly appealing because it is difficult to know
what is a large, or a small, value for that measure. In some situations explaining 5% of the
variation may be very important, while in others 5% might be trivial. Regardless of the type
of measure you choose to use, the most important issue is whether either measure
addresses the important questions in your study. As I will emphasize in the next chapter, out
of four or five means your fundamental interest may lie in comparing just two of them. If
so, a measure that is based on all of the means, while legitimate, may give the right answer
to the wrong question and waste statistical power.
The set of measures discussed here are often classed as “magnitude of effect” measures
and are related to r². They represent how much of the overall variability in the
dependent variable can be attributed to the treatment effect. At last count, there were at
least six measures of the magnitude of the experimental effect—all different and most
claiming to be less biased than some other measure. In this section we will focus on only
the two most common measures (η² and ω²), because they have the strongest claim to
our attention.
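
For reference, the usual one-way ANOVA definitions of these two measures for k treatments (standard results, stated here in conventional notation as a preview rather than quoted from this excerpt) are

η² = SS_treat / SS_total        ω² = [SS_treat − (k − 1) MS_error] / (SS_total + MS_error)

Both express treatment variation as a proportion of total variation; ω² incorporates a correction involving MS_error that makes it a less biased estimate of the population effect.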

Eta-Squared (η²)


Eta-squared is probably the oldest measure of the strength of an experimental effect.
Although it is certainly not the best, it has several points to recommend it. As you will see,
eta-squared (η²), sometimes called the correlation ratio, has a certain intuitive appeal.
Moreover, it forms a strong link between the traditional analysis of variance and multiple
regression, as we will see in Chapter 16.
In some textbooks, eta (η) is defined as the correlation coefficient associated with
curvilinear regression—that is, regression where the best-fitting line is not a straight line.
Suppose that I proposed to calculate the correlation between the recall scores and the
treatment levels (counting, rhyming, adjective, imagery, and intentional) for Eysenck’s data
from Table 11.2. The first criticism that would be raised is that the names counting, ... ,
intentional are merely labels for treatments and bear no relationship to anything. This
would be true even if we called them treatment 1, 2, ... , 5. True enough, but that will not
stop us. The next objection raised might be that the treatments are not ordered on any
particular underlying scale, and therefore we would not know in what order to place them if
we were to plot the data. Again, true enough, and again that will not stop us. The next
objection could be that the regression might not be linear. True again, but we can get around
this problem by calling the coefficient η instead of r. Having cavalierly brushed aside all
the objections, we set about plotting the data anyway, as shown in Figure 11.5. (The numerals
2, 3, and 4 in Figure 11.5 indicate the number of overlapping data points.) As you may
recall from high school (though you may not), a kth-order polynomial will exactly fit
k + 1 points, which means that if we did try to fit a fourth-order polynomial to the five
points represented by the treatment means, it would fit perfectly. (This is just an extension
of the phrase “two points determine a straight line.”) We do not particularly care what the
equation would look like, but we can represent the line (as in Figure 11.5) simply by
connecting the array means.
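
To make the argument concrete, here is a minimal Python sketch using hypothetical recall scores (illustrative values only, not the actual data from Table 11.2). It verifies that a fourth-order polynomial passes exactly through the five treatment means, and that the correlation between the scores and that curvilinear fit (this correlation is eta) squares to SS_treat / SS_total:

import numpy as np

# Hypothetical recall scores for the five conditions. These are
# illustrative values only, NOT the actual data from Table 11.2.
groups = {
    "counting":    [7, 8, 6, 8, 6],
    "rhyming":     [7, 9, 6, 6, 7],
    "adjective":   [11, 13, 8, 12, 11],
    "imagery":     [12, 11, 16, 14, 14],
    "intentional": [10, 19, 14, 12, 11],
}

# Arbitrarily code the treatments 1 through 5, as in the text.
x = np.concatenate([[i + 1] * len(v) for i, v in enumerate(groups.values())])
y = np.concatenate([np.asarray(v, dtype=float) for v in groups.values()])
means = np.array([np.mean(v) for v in groups.values()])

# A fourth-order polynomial has 5 coefficients, so it fits the 5
# treatment means exactly (the "kth-order fits k + 1 points" claim).
coefs = np.polyfit(np.arange(1, 6), means, deg=4)
assert np.allclose(np.polyval(coefs, np.arange(1, 6)), means)

# The curvilinear prediction for every score is therefore just its own
# group mean, and eta is the correlation between scores and predictions.
fitted = np.polyval(coefs, x)
eta = np.corrcoef(y, fitted)[0, 1]

# Check the proportion-of-variance interpretation: eta^2 = SS_treat / SS_total.
grand = y.mean()
ss_total = np.sum((y - grand) ** 2)
ss_treat = sum(len(v) * (np.mean(v) - grand) ** 2 for v in groups.values())
assert np.isclose(eta ** 2, ss_treat / ss_total)
print(f"eta = {eta:.3f}, eta-squared = {eta ** 2:.3f}")

Note that the polynomial itself is irrelevant here: because it reproduces the treatment means exactly, eta depends only on those means, which is why we can represent the fitted line simply by connecting them.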
