Rotman Management — Spring 2017

carried out an experiment in which participants were asked to write four truths and one lie about themselves. Participants were then randomly assigned to one of three conditions that varied the amount of information they believed they were seeing about another study participant: no information; one truth about this other person; or four truths. They were then asked how likely it was that this other person could detect the participant's own lie.
Participants who were given information about the other person reported a higher probability that the other person would detect their lie (approximately 40 per cent, versus 27 per cent in the 'no information' condition). That is, they experienced a greater illusion of transparency. In some sense, this assumption is adaptive in most of the situations we face: it is usually true that people whom we know well also know us well.


But, in our experiment, this could not possibly be true. Instead,
people over-generalized this belief.
This insight also has implications for law enforcement, which often focuses on solving and deterring crime by extracting information from the public. The illusion of transparency suggests that we may be able to deter crime more effectively by providing information to the public. That might take the form of local beat officers simply sharing a few benign details about their lives at a community meeting or out on patrol.
More broadly, we suspect that increasing the illusion of
transparency will be most effective at changing behaviour
when specific individuals are responsible for enforcement,
but are not well known. For example, students who skip school
may not know the attendance clerks and truancy officers
tracking absences; people who are filing their taxes may not

Labeling a person’s behaviour as ‘already commendable’ can
be more effective than asking them to change their behaviour.

The Path to Better Decisions
Stanford professor and best-selling author Chip Heath tells Carla O'Dell why so many decisions are flawed

Carla O'Dell: Why don't we make better decisions?

Chip Heath: Psychologists have spent years studying the biases we have, and they've found a number of basic ways we think about the world that lead us to the wrong conclusions.
For example, we frame decisions narrowly. The typical
person only thinks of one alternative when making a decision.

How can we think more broadly and give ourselves multiple
options in the decision-making process?

The trick is to force yourself to come up with a second alternative,
which is generally not hard, once you discipline yourself. One of the
best tricks we’ve learned is what we call ‘the vanishing options test’.
Think to yourself, “What if the option I’m thinking about right now
suddenly disappeared? What else could I do?”

When you don’t force yourself to do this, your mental spotlight keeps
focusing on one option and whether or not you should do that one thing.
When we do the vanishing options test with people, we find that about 80 per cent of the time, they come up with something much better than their initial idea within three minutes, even if they had been agonizing over the decision for weeks before they did the test.

Some of the hardest decisions we make are those that involve
loss and letting go. For example, discontinuing a product line,
or getting a divorce. What can we do to make these kinds of
decisions easier?

There is a very basic principle called 'loss aversion', which means that people would rather avoid a loss than acquire an equivalent gain. Daniel Kahneman and Amos Tversky found that losses are two to four times more painful to our brains than equivalent gains are pleasurable. Very often, when we're going into a new situation at work, we have to give up something in order to get something that's (hopefully) better. But empirically, the thing that's coming has to be two to four times as good to be in the ballpark of not regretting the loss. Loss aversion is a recipe for many problems in our personal and professional lives.
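A minimal sketch of the arithmetic behind that 'two to four times' claim, assuming the standard prospect-theory value function and Tversky and Kahneman's 1992 median estimate of the loss-aversion coefficient (lambda of roughly 2.25); the specific numbers below are illustrative assumptions, not figures from the interview:

\[
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \quad \text{(gains)} \\
-\lambda\,(-x)^{\beta}, & x < 0 \quad \text{(losses)}
\end{cases}
\qquad \lambda \approx 2.25
\]

% Taking alpha = beta = 1 for simplicity: giving up something valued
% at 10 registers psychologically as about -lambda * 10 = -22.5, so
% the replacement must be worth roughly 22.5 just to break even.

With lambda anywhere in the 2 to 4 range that Heath cites, the break-even gain scales accordingly, which is why the new thing has to be several times as good before the trade stops feeling like a loss.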

Apart from loss aversion, what other cognitive bias undermines
our decision making the most?

The biggest bias I see, and one of the harder ones to change, is confirmation bias. When we go into a situation, we're collecting data about options: things we might do or consider. Confirmation bias is our tendency to collect data in a way that is biased by the hypotheses we walk in with. Say you love Thai food, and a new Thai restaurant opens in town. You may browse reviews of the new restaurant and feel that you're
