variances $\sigma_1^2$ and $\sigma_2^2$. We now draw pairs of samples of size $n_1$ from population $X_1$ and of size $n_2$ from population $X_2$, and record the means and the difference between the means for each pair of samples. Because we are sampling independently from each population, the sample means will be independent. (Means are paired only in the trivial and presumably irrelevant sense of being drawn at the same time.) The results of an infinite number of replications of this procedure are presented schematically in Figure 7.8. In the lower portion of this figure, the first two columns represent the sampling distributions of $\bar{X}_1$ and $\bar{X}_2$, and the third column represents the sampling distribution of mean differences ($\bar{X}_1 - \bar{X}_2$). We are most interested in the third column since we are concerned with testing differences between means. The mean of this distribution can be shown to equal $\mu_1 - \mu_2$. The variance of this distribution of differences is given by what is commonly called the variance sum law, a limited form of which states,

The variance of a sum or difference of two independent variables is equal to the sum of their variances.^9

We know from the central limit theorem that the variance of the distribution of $\bar{X}_1$ is $\sigma_1^2/n_1$ and the variance of the distribution of $\bar{X}_2$ is $\sigma_2^2/n_2$. Since the variables (sample means) are independent, the variance of the difference of these two variables is the sum of their variances. Thus

$$\sigma^2_{\bar{X}_1 - \bar{X}_2} = \sigma^2_{\bar{X}_1} + \sigma^2_{\bar{X}_2} = \frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}$$
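A brief simulation makes this result concrete. The sketch below (Python with NumPy; the population parameters, sample sizes, and number of replications are arbitrary choices for illustration, not values from the text) repeats the two-sample procedure many times, records the difference between the sample means on each replication, and compares the empirical mean and variance of those differences with $\mu_1 - \mu_2$ and $\sigma_1^2/n_1 + \sigma_2^2/n_2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (arbitrary) population parameters and sample sizes
mu1, sigma1, n1 = 100.0, 15.0, 25
mu2, sigma2, n2 = 95.0, 10.0, 36
reps = 100_000  # number of replications of the two-sample experiment

# Each row is one replication's sample; average across columns to get sample means
xbar1 = rng.normal(mu1, sigma1, size=(reps, n1)).mean(axis=1)
xbar2 = rng.normal(mu2, sigma2, size=(reps, n2)).mean(axis=1)
diffs = xbar1 - xbar2  # sampling distribution of mean differences

print("mean of differences:    ", diffs.mean(), " (theory:", mu1 - mu2, ")")
print("variance of differences:", diffs.var(),
      " (theory:", sigma1**2 / n1 + sigma2**2 / n2, ")")
```

With a large number of replications, the empirical variance of the differences should fall very close to $\sigma_1^2/n_1 + \sigma_2^2/n_2$, which is the point of the derivation above.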

Population $X_1$ (mean $\mu_1$, variance $\sigma_1^2$)    Population $X_2$ (mean $\mu_2$, variance $\sigma_2^2$)

              $\bar{X}_1$               $\bar{X}_2$               $\bar{X}_1 - \bar{X}_2$
Samples       $\bar{X}_{11}$            $\bar{X}_{21}$            $\bar{X}_{11} - \bar{X}_{21}$
              $\bar{X}_{12}$            $\bar{X}_{22}$            $\bar{X}_{12} - \bar{X}_{22}$
              $\bar{X}_{13}$            $\bar{X}_{23}$            $\bar{X}_{13} - \bar{X}_{23}$
              ...                       ...                       ...
Mean          $\mu_1$                   $\mu_2$                   $\mu_1 - \mu_2$
Variance      $\sigma_1^2/n_1$          $\sigma_2^2/n_2$          $\sigma_1^2/n_1 + \sigma_2^2/n_2$
S.D.          $\sigma_1/\sqrt{n_1}$     $\sigma_2/\sqrt{n_2}$     $\sqrt{\sigma_1^2/n_1 + \sigma_2^2/n_2}$

Figure 7.8 Schematic set of means and mean differences when sampling from two populations

^9 The complete form of the law omits the restriction that the variables must be independent and states that the variance of their sum or difference is $\sigma^2_{X_1 \pm X_2} = \sigma_1^2 + \sigma_2^2 \pm 2\rho\sigma_1\sigma_2$, where the notation $\pm$ is interpreted as plus when we are speaking of their sum and as minus when we are speaking of their difference. The term $\rho$ (rho) in this equation is the correlation between the two variables (to be discussed in Chapter 9) and is equal to zero when the variables are independent. (The fact that $\rho \neq 0$ when the variables are not independent was what forced us to treat the related sample case separately.)
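As an informal check on this complete form of the law, the following sketch (again Python with NumPy; $\sigma_1$, $\sigma_2$, and $\rho$ are arbitrary values chosen only for illustration) simulates two correlated variables and compares the observed variances of their sum and difference with $\sigma_1^2 + \sigma_2^2 \pm 2\rho\sigma_1\sigma_2$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustrative parameters for two correlated variables
sigma1, sigma2, rho = 4.0, 3.0, 0.6
cov = rho * sigma1 * sigma2
cov_matrix = [[sigma1**2, cov], [cov, sigma2**2]]

# Draw many correlated (X1, X2) pairs from a bivariate normal distribution
x = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov_matrix, size=200_000)
x1, x2 = x[:, 0], x[:, 1]

print("Var(X1 - X2) observed:", np.var(x1 - x2))
print("Var(X1 - X2) theory:  ", sigma1**2 + sigma2**2 - 2 * rho * sigma1 * sigma2)
print("Var(X1 + X2) observed:", np.var(x1 + x2))
print("Var(X1 + X2) theory:  ", sigma1**2 + sigma2**2 + 2 * rho * sigma1 * sigma2)
```

Setting rho to 0 reproduces the limited form used in the text, where the variance of the difference is simply the sum of the two variances.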
