
a hypothesis in such a way that it can be refuted, or disproved by counterevidence. This principle, known as the principle of falsifiability, does not mean that the hypothesis will be disproved, only that it could be if contrary evidence were to be discovered.

Another way of saying this is that a scientist must risk disconfirmation by predicting not only what will happen, but also what will not happen if the hypothesis is correct. In the misery-loves-company study, the hypothesis would be supported if most anxious people sought each other out, but disconfirmed if most anxious people went off alone to sulk and worry, or if anxiety had no effect on their behavior (see Figure 1.1 on the next page). A willingness to risk disconfirmation forces scientists to take negative evidence seriously and to abandon mistaken hypotheses.
The principle of falsifiability is often violated in everyday life because all of us are vulnerable to the confirmation bias: the tendency to look for and accept evidence that supports our pet theories and assumptions and to ignore or reject evidence that contradicts our beliefs. Thus, if a police interrogator is convinced of a suspect's guilt, he or she may interpret anything the suspect says, even the person's protestations of innocence, as confirming evidence that the suspect is guilty ("Of course he says he's innocent; he's a liar") (Leo, 2008). But what if the suspect is innocent? The principle of falsifiability compels scientists, and the rest of us, to resist the confirmation bias and to consider counterevidence.

5. Avoid Emotional Reasoning. Emotion has a place in critical thinking and in science. Passionate commitment to a view motivates people to think boldly, to defend unpopular ideas, and to seek evidence for creative new theories. But emotional conviction alone cannot settle arguments, and in fact it usually makes them worse. The fact that you really, really feel strongly that something is true—or want it to be—doesn't make it so.
All of us are apt to feel threatened and get defensive whenever our most cherished beliefs, or commitment to a course of action, are challenged by empirical evidence (Tavris & Aronson, 2007). At such times, it is especially important to separate the data from emotional reasoning. In our opening news story about the ruling that vaccines do not cause autism, one of the judges expressed sympathy for the parents, but added, "I must decide this case not on sentiment, but by analyzing the evidence."
You probably hold strong beliefs about drug use, the causes of crime, the origins of intelligence, gender differences, obesity, and many other issues.

principle of falsifiability The principle that a scientific theory must make predictions that are specific enough to expose the theory to the possibility of disconfirmation; that is, the theory must predict not only what will happen but also what will not happen.

confirmation bias The tendency to look for or pay attention only to information that confirms one's own belief, and ignore, trivialize, or forget information that disconfirms that belief.

In scientific research, an idea may initially generate excitement because it is plausible, imaginative, or appealing, but eventually it must be backed by empirical evidence if it is to be taken seriously. A collection of anecdotes or an appeal to authority will not do. Sometimes, of course, checking the reliability of the evidence directly is not practical. In those cases, critical thinkers consider whether it came from a reliable source. Sources who are reliable exercise critical thinking themselves. They usually have education or experience in the field in which they claim expertise. They do not pressure people to agree with them. They share their evidence openly. They draw on research that has been reviewed by other experts on the subject, rather than merely announced to the public in a press release or blog.


4. Analyze Assumptions and Biases. Assumptions are beliefs that are taken for granted, and biases are assumptions that keep us from considering the evidence fairly or that cause us to ignore the evidence entirely. Critical thinkers try to identify and evaluate the unspoken assumptions on which claims and arguments may rest—in the books they read, the political speeches they hear, and the advertisements that bombard them every day. Some of the greatest scientific advances have been made by those who dared to doubt widespread assumptions: that the sun revolves around the earth, that illness can be cured by applying leeches to the skin, that madness is a sign of demonic possession.

Critical thinkers are willing to analyze and test not only other people's assumptions, but also their own, which is much harder. Researchers put their own assumptions to the test by stating


When demonstrating "levitation" and other supposedly magical phenomena, illusionists exploit people's tendency to trust the evidence of their own eyes even when such evidence is misleading. Critical thinkers ask about the nature and reliability of the evidence for a phenomenon.
