squared semipartial correlation, it is the additional amount that LogPctSAT explains
relative to the amount that Expend left to be explained. For example,

r^2_02 = (−.381)^2 = .145
1 − r^2_02 = .855
r^2_01.2 = r^2_0(1.2) / (1 − r^2_02) = .741/.855 = .866

and

r_01.2 = √.866 = .931

Schematically, squared multiple, partial, and semipartial correlations can be represented as

A = 1 − R^2_0.12 = the residual (unexplained) variation in Y (SAT)
B + C = r^2_01 = the squared correlation between Y (SAT) and X_1 (LogPctSAT)
C + D = r^2_02 = the squared correlation between Y (SAT) and X_2 (Expend)
B + C + D = R^2_0.12 = the squared multiple correlation
D = r^2_0(2.1) = the other squared semipartial correlation

In addition,

r^2_01.2 = r^2_0(1.2) / (1 − r^2_02) = B/(A + B) = the squared partial correlation
r^2_0(1.2) = B = the squared semipartial correlation
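As a quick arithmetic check, the partial correlation for the SAT example can be recomputed from the values printed in the text; this is a minimal sketch that simply takes the reported .145 and .741 as given.

```python
# Values reported in the text for the SAT example (taken as given).
r2_02 = 0.145      # squared zero-order correlation with the control variable
r2_semi = 0.741    # squared semipartial correlation, r^2_0(1.2)

# The squared partial rescales the semipartial by what the control
# variable left unexplained: r^2_01.2 = r^2_0(1.2) / (1 - r^2_02).
r2_partial = r2_semi / (1 - r2_02)   # .741/.855, matching the text's .866 up to rounding
r_partial = r2_partial ** 0.5        # about .931

print(r2_partial, r_partial)
```

Note that the semipartial divides the unique contribution by total variation (1), while the partial divides the same quantity by the smaller residual variation (.855), which is why the partial is always at least as large.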
Why Do We Care About Partial and Semipartial Correlations?
You might ask why we bother to worry about partial and semipartial correlations. What do
they add to what we already know? The answer is that they add a great deal. They allow us
to control for variables that we might perceive as “nuisance” variables, and in so doing
allow us to make statements of the form “The correlation between Y and A is .65, after we
control for the influence of B.” To take an example from a study that we will discuss later in
the chapter, Leerkes and Crockenberg (1999) were interested in the relationship between
the maternal care a woman received when she was a child and the level of self-confidence
or self-efficacy she feels toward her own mothering skills. Leerkes and Crockenberg asked
whether this relationship was influenced by the fact that those who received high quality
maternal care also showed high levels of self-esteem. Perhaps if we controlled for
differences in self-esteem, the relationship between maternal care and self-efficacy would disappear. This
is a case where they are partialling out the influence of self-esteem to look at the relation-
ship that remains. Partial and semipartial correlations are a tool to “get our hands around”
a number of confusing relationships.
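The partialling idea in the Leerkes and Crockenberg example can be made concrete using the standard residual interpretation: the partial correlation between two variables, controlling for a third, equals the ordinary correlation between the residuals left after regressing each of them on the third. The sketch below uses simulated, purely hypothetical data in which self-esteem drives both maternal care and self-efficacy, so the zero-order correlation is sizable but the partial correlation is near zero.

```python
import random

def corr(x, y):
    """Ordinary Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def residuals(y, x):
    """Residuals from the simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    a0 = my - b * mx
    return [c - (a0 + b * a) for a, c in zip(x, y)]

def partial_corr(y, a, b):
    """Correlation between y and a with b partialled out of both."""
    return corr(residuals(y, b), residuals(a, b))

# Hypothetical data: self-esteem influences both other variables.
random.seed(1)
self_esteem = [random.gauss(0, 1) for _ in range(500)]
care = [s + random.gauss(0, 1) for s in self_esteem]
efficacy = [s + random.gauss(0, 1) for s in self_esteem]

print(corr(care, efficacy))                        # sizable zero-order correlation
print(partial_corr(efficacy, care, self_esteem))   # near zero once self-esteem is removed
```

The variable names here are illustrative only; the actual Leerkes and Crockenberg data and results are discussed later in the chapter.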
15.9 Suppressor Variables
Suppose we have a multiple regression problem in which all variables are scored so as to cor-
relate positively with the criterion. Because the scoring of variables is often arbitrary anyway,
this presents no difficulty (if X is negatively related to Y, C − X will be positively related to Y,
where C is any constant). In such a situation, we would expect all the regression coefficients
538 Chapter 15 Multiple Regression