would not change who we are fundamentally—it could only map onto existing human characteristics.

Online misinformation has been around since the mid-1990s. But in 2016 several events made it broadly clear that darker forces had emerged: automation, microtargeting and coordination were fueling information campaigns designed to manipulate public opinion at scale. Journalists in the Philippines started raising flags as Rodrigo Duterte rose to power, buoyed by intensive Facebook activity. This was followed by unexpected results in the Brexit referendum in June and then the U.S. presidential election in November—all of which spurred researchers to systematically investigate the ways in which information was being used as a weapon.
During the past three years the discussion around the causes of our polluted information ecosystem has focused almost entirely on actions taken (or not taken) by the technology companies. But this fixation is too simplistic. A complex web of societal shifts is making people more susceptible to misinformation and conspiracy. Trust in institutions is falling because of political and economic upheaval, most notably through ever-widening income inequality. The effects of climate change are becoming more pronounced. Global migration trends spark concern that communities will change irrevocably. The rise of automation makes people fear for their jobs and their privacy.
Bad actors who want to deepen existing tensions understand these societal trends, designing content that they hope will so anger or excite targeted users that the audience will become the messenger. The goal is that users will spend their own social capital to reinforce and lend credibility to that original message.
Most of this content is designed not to persuade people in any particular direction but to cause confusion, to overwhelm and to undermine trust in democratic institutions, from the electoral system to journalism. And although much is being made of preparing the U.S. electorate for the 2020 election, misleading and conspiratorial content did not begin with the 2016 presidential race, and it will not end after this one. As tools designed to manipulate and amplify content become cheaper and more accessible, it will be even easier to weaponize users as unwitting agents of disinformation.
WEAPONIZING CONTEXT
GENERALLY, THE LANGUAGE used to discuss the misinformation problem is too simplistic. Effective research and interventions require clear definitions, yet many people use the problematic phrase "fake news." Used by politicians around the world to attack a free press, the term is dangerous. Recent research shows that audiences increasingly connect it with the mainstream media. It is often used as a catchall to describe things that are not the same, including lies, rumors, hoaxes, misinformation, conspiracies and propaganda, and it papers over nuance and complexity. Much of this content does not even masquerade as news—it appears as memes, videos and social posts on Facebook and Instagram.
In February 2017 I created seven types of "information disorder" in an attempt to emphasize the spectrum of content being used to pollute the information ecosystem. They included, among others, satire, which is not intended to cause harm but still has the potential to fool; fabricated content, which is 100 percent false and designed to deceive and do harm; and false context, which is when genuine content is shared with false contextual information. Later that year technology journalist Hossein Derakhshan and I published a report that mapped out the distinctions among disinformation, misinformation and malinformation. Purveyors of disinformation—content that is intentionally false and designed to cause harm—are motivated by three distinct goals: to make money; to have political influence, either foreign or domestic; and to cause trouble for the sake of it.
Those who spread misinformation—false content shared by a person who does not realize it is false or misleading—are driven by sociopsychological factors. People are performing their identities on social platforms to feel connected to others, whether the "others" are a political party, parents who do not vaccinate their children, activists who are concerned about climate change, or those who belong to a certain religion, race or ethnic group. Crucially, disinformation can turn into misinformation when people share it without realizing it is false.
We added the term "malinformation" to describe genuine information that is shared with an intent to cause harm. An example of this is when Russian agents hacked into e-mails from the Democratic National Committee and the Hillary Clinton campaign and leaked certain details to the public to damage reputations.
THREE CATEGORIES OF INFORMATION DISORDER
To understand and study the complexity of the information ecosystem, we need a common language. The current reliance on simplistic terms such as "fake news" hides important distinctions and denigrates journalism. It also focuses too much on "true" versus "fake," whereas information disorder comes in many shades of "misleading." The accompanying graphic maps content along two axes, falseness and intent to harm:
Misinformation: Unintentional mistakes such as inaccurate captions, dates, statistics or translations, or when satire is taken seriously.
Malinformation: Deliberate publication of private information for personal or corporate rather than public interest, such as revenge porn. Deliberate change of context, date or time of genuine content.
Disinformation: Fabricated or deliberately manipulated content. Intentionally created conspiracy theories or rumors.
Graphic by Jen Christiansen; source: Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking, by Claire Wardle and Hossein Derakhshan. Council of Europe, October 2017.
IN BRIEF
Many types of information disorder exist online, from fabricated videos to impersonated accounts to memes designed to manipulate genuine content.
Automation and microtargeting tactics have made it easier for agents of disinformation to weaponize regular users of the social web to spread harmful messages.
Much research is needed to understand disinformation and build safeguards against it.