Statistical Methods for Psychology

(Such knowledge is so rare that it is not even worth imagining cases in which we would
have it, although a few do exist.) We can circumvent this problem just as we did in the one-
sample case, by using the sample variances as estimates of the population variances. This,
for the same reasons discussed earlier for the one-sample t, means that the result will be
distributed as t rather than z.

Thus we can write

$$t = \frac{(\bar{X}_1 - \bar{X}_2) - (\mu_1 - \mu_2)}{s_{\bar{X}_1 - \bar{X}_2}} = \frac{(\bar{X}_1 - \bar{X}_2) - (\mu_1 - \mu_2)}{\sqrt{\dfrac{s_1^2}{n_1} + \dfrac{s_2^2}{n_2}}}$$

Since the null hypothesis is generally the hypothesis that $\mu_1 - \mu_2 = 0$, we will drop that
term from the equation and write

$$t = \frac{\bar{X}_1 - \bar{X}_2}{s_{\bar{X}_1 - \bar{X}_2}} = \frac{\bar{X}_1 - \bar{X}_2}{\sqrt{\dfrac{s_1^2}{n_1} + \dfrac{s_2^2}{n_2}}}$$
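The calculation above can be sketched in a few lines of Python using only the standard library; the data values are hypothetical, chosen just to illustrate the arithmetic.

```python
import statistics

# Hypothetical data: two independent samples
group1 = [5.1, 4.8, 6.2, 5.5, 5.9, 4.7]
group2 = [4.2, 3.9, 4.8, 4.4, 5.0, 4.1]

n1, n2 = len(group1), len(group2)
mean1, mean2 = statistics.mean(group1), statistics.mean(group2)

# Sample variances (n - 1 in the denominator) estimate the population variances
s2_1 = statistics.variance(group1)
s2_2 = statistics.variance(group2)

# Standard error of the difference between the two means
se_diff = (s2_1 / n1 + s2_2 / n2) ** 0.5

# Under H0, mu1 - mu2 = 0, so that term drops out of the numerator
t = (mean1 - mean2) / se_diff
print(round(t, 3))
```

Because the sample variances replace the unknown population variances, the result is evaluated against the t distribution rather than z, exactly as the text describes.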

Pooling Variances


Although the equation for t that we have just developed is appropriate when the sample
sizes are equal, it requires some modification when the sample sizes are unequal. This
modification is designed to improve the estimate of the population variance. One of the
assumptions required in the use of t for two independent samples is that $\sigma_1^2 = \sigma_2^2$ (i.e.,
the samples come from populations with equal variances, regardless of the truth or
falsity of $H_0$). The assumption is required regardless of whether $n_1$ and $n_2$ are equal. Such
an assumption is often reasonable. We frequently begin an experiment with two groups
of subjects who are equivalent and then do something to one (or both) group(s) that will
raise or lower the scores by an amount equal to the effect of the experimental treatment.
In such a case, it often makes sense to assume that the variances will remain unaffected.
(Recall that adding or subtracting a constant—here, the treatment effect—to or from a
set of scores has no effect on its variance.) Since the population variances are assumed to
be equal, this common variance can be represented by the symbol $\sigma^2$, without a
subscript.
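The parenthetical point above — that adding a constant to every score leaves the variance unchanged — is easy to verify directly; the scores and the treatment effect of 4 here are hypothetical.

```python
import statistics

scores = [10, 12, 9, 15, 11]       # hypothetical pre-treatment scores
treated = [x + 4 for x in scores]  # add a constant "treatment effect" of 4 to each score

# Every deviation from the mean is unchanged, so the variance is unchanged
print(statistics.variance(scores), statistics.variance(treated))  # both are 5.3
```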
In our data we have two estimates of $\sigma^2$, namely $s_1^2$ and $s_2^2$. It seems appropriate to obtain
some sort of an average of $s_1^2$ and $s_2^2$, on the grounds that this average should be a better
estimate of $\sigma^2$ than either of the two separate estimates. We do not want to take the simple
arithmetic mean, however, because doing so would give equal weight to the two estimates,
even if one were based on considerably more observations. What we want is a weighted
average, in which the sample variances are weighted by their degrees of freedom ($n_i - 1$).
If we call this new estimate $s_p^2$, then

$$s_p^2 = \frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}$$

The numerator represents the sum of the variances, each weighted by its degrees of freedom,
and the denominator represents the sum of the weights or, equivalently, the degrees
of freedom for $s_p^2$.
The weighted average of the two sample variances is usually referred to as a pooled
variance estimate. Having defined the pooled estimate ($s_p^2$), we can now write

$$t = \frac{\bar{X}_1 - \bar{X}_2}{s_{\bar{X}_1 - \bar{X}_2}} = \frac{\bar{X}_1 - \bar{X}_2}{\sqrt{s_p^2\left(\dfrac{1}{n_1} + \dfrac{1}{n_2}\right)}}$$

206 Chapter 7 Hypothesis Tests Applied to Means
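Putting the pieces of this section together, the pooled two-sample t statistic can be sketched as a single function; the sample data passed in below are hypothetical.

```python
import statistics

def pooled_t(x, y):
    """Two-sample t statistic using the pooled variance estimate.

    The result is referred to the t distribution on n1 + n2 - 2
    degrees of freedom.
    """
    n1, n2 = len(x), len(y)
    mean1, mean2 = statistics.mean(x), statistics.mean(y)
    # Pooled variance: df-weighted average of the two sample variances
    s2_p = ((n1 - 1) * statistics.variance(x)
            + (n2 - 1) * statistics.variance(y)) / (n1 + n2 - 2)
    se_diff = (s2_p * (1 / n1 + 1 / n2)) ** 0.5
    return (mean1 - mean2) / se_diff

t = pooled_t([5.1, 4.8, 6.2, 5.5, 5.9], [4.2, 3.9, 4.8, 4.4, 5.0, 4.1])
print(round(t, 3))
```

When $n_1 = n_2$, this formula gives exactly the same value of t as the unpooled version developed at the start of the section; pooling matters only when the sample sizes differ.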