The goal of the first assignment in my introductory psychology course is for students to understand the scientific method, both in theory and in practice. The module provides basic information about the scientific method, most notably the goals, steps, and data-collection methods. Students then work on an interactive activity requiring them to design (but not carry
out) a study using the five steps of the scientific method (formulate a hypothesis, design
the study, provide a plan for data collection, provide a plan for data analysis, and indicate
likely outlets for reporting the findings). Students type their responses for each of the five
steps in a Web-based form, which they then submit to me.
The goal of the second assignment is to understand critical thinking. I give students a
simple definition of critical thinking as well as the characteristics of critical thinkers
(Smith, 1995). Students then engage in an interactive activity in which they consider two
arguments. One argument provides scientific evidence to support the author’s assertion
(e.g., “Evidence shows that people in emergency situations are more likely to receive help
if fewer people are available”); the other relies on personal experience (e.g., “Personal expe-
rience suggests that the more people available to help in an emergency, the more likely one
is to get help”). Students indicate which argument they find more convincing and state the reason(s) for their choice. Students then submit their responses online.
Assessment
To assess the effectiveness of the critical thinking module in improving students’ confi-
dence in their use of the scientific method and critical thinking and in their knowledge of
both, students completed a brief assessment instrument that relied on the post-then
method (Howard, 1980; Howard & Dailey, 1979; Koele & Hoogstraten, 1998). Briefly,
in the post-then method (e.g., How much do you know now? and How much did you know then?), respondents provide posttest ratings along with retrospective pretest ratings. This method eliminates the
response-shift bias in pretest/posttest designs because retrospective evaluations are less
exaggerated than pretest evaluations. The response-shift bias refers to the tendency for
preratings to be elevated, leading to findings of negative, reduced, or nonsignificant treat-
ment effects (Howard, 1980). Using this methodology, students indicated how confident
they were that they could use the scientific method, how confident they were that they
could use critical thinking, how much they knew about the scientific method, and how much they knew about critical thinking, giving one rating for before completing the activity (retrospectively) and one for now, using 5-point scales (ranging from 1 = “not very confident/knowledgeable” to 5 = “very confident/knowledgeable”).
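To illustrate how post-then ratings of this kind can be analyzed, the following minimal sketch (in Python, using made-up ratings rather than the data reported below; the variable names and values are hypothetical) compares retrospective “then” ratings with “now” ratings using a paired-samples t test.

```python
# Minimal sketch of a post-then analysis; the ratings below are hypothetical,
# not the class data reported in this chapter.
import numpy as np
from scipy import stats

# Each student contributes a retrospective "then" rating and a "now" rating
# on the same 5-point scale.
then_ratings = np.array([3, 4, 2, 3, 5, 3, 4, 2, 3, 4])
now_ratings = np.array([4, 5, 4, 4, 5, 4, 5, 3, 4, 5])

# Paired-samples t test: are the "now" ratings reliably different from the
# retrospective "then" ratings given by the same students?
t_stat, p_value = stats.ttest_rel(then_ratings, now_ratings)

print(f"then: M = {then_ratings.mean():.2f}, SD = {then_ratings.std(ddof=1):.2f}")
print(f"now:  M = {now_ratings.mean():.2f}, SD = {now_ratings.std(ddof=1):.2f}")
print(f"t({len(then_ratings) - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```

Entering the “then” ratings first yields a negative t statistic when the “now” ratings are higher, which matches the sign of the t values reported below.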
Students in six sections of introductory psychology (n = 60) completed the questionnaire. Analysis of variance showed no significant differences across the six sections (average class size = 10 students) on any of the measures, so the samples were combined.
Paired-samples t tests showed significant increases in students’ confidence about using the
scientific method now (M = 4.54, SD = .57) compared with before completing the module
(M = 3.68, SD = 1.06), t(58) = −8.32, p < .001; confidence about using critical thinking