Scientific American, September 2019

SCIENCE AGENDA
OPINION AND ANALYSIS FROM
SCIENTIFIC AMERICAN’S BOARD OF EDITORS

Illustration by Aad Goudappel

When “Like” Is a Weapon

Everyone is an agent in the new information warfare

By the Editors

No one thinks, I am the kind of person who is susceptible to
misinformation. It is those others (stupid anti-vaxxers! arrogant
liberal elites!) who are swayed by propaganda masquerading as
news and bot armies pushing partisan agendas on Twitter.
But recent disinformation campaigns—especially ones that
originate with coordinated agencies in Russia or China—have
been far more sweeping and insidious. Using memes, manipulated videos and impersonations to spark outrage and confusion, they reach beyond any single election or community. Indeed,
these efforts aim to engineer volatility to undermine democracy
itself. If we’re all mentally exhausted and we disagree about what
is true, then authoritarian networks can more effectively push
their version of reality. Playing into the “us versus them” dynamic makes everyone more vulnerable to false belief.
Instead of surrendering to the idea of a post-truth world, we
must recognize this so-called information disorder as an urgent
societal crisis and bring rigorous, interdisciplinary scientific
research to combat the problem. We need to understand the transmission of knowledge online; the origins, motivations and tactics
of disinformation networks, both foreign and domestic; and
exactly how even the most educated evidence seekers can unwittingly become part of an influence operation. Little is known, for
instance, about the effects of long-term exposure to disinformation or how it affects our brains or voting behavior. To examine
these connections, technology behemoths such as Facebook,
Twitter and Google must make more of their data available to
independent researchers (while protecting user privacy).
The pace of research must try to catch up with the rapidly growing sophistication of disinformation strategies. One positive step
will be the launch this winter of The Misinformation Review, a
multimedia-format journal from Harvard University’s John F. Kennedy School of Government that will fast-track its peer-review process and prioritize articles about real-world implications of misinformation in areas such as the media, public health and elections.
Journalists must be trained in how to cover deception so that
they don’t inadvertently entrench it, and governments should
strengthen their information agencies to fight back. Western
nations can look to the Baltic states to learn some of the innovative ways their citizens have dealt with disinformation over the
past decade: for example, volunteer armies of civilian “elves”
expose the methods of Kremlin “trolls.” Minority and historically oppressed communities are also familiar with ways to push
back on authorities’ attempts to overwrite truth. Critically, technologists should collaborate with social scientists to propose interventions, and they would be wise to imagine how attackers might cripple these tools or turn them to their own ends.
Ultimately, though, for most disinformation operations to succeed, it is regular users of the social Web who must share the videos, use the hashtags and add to the inflammatory comment
threads. That means each one of us is a node on the battlefield for
reality. Each of us needs to be more aware of how our emotions and biases can be exploited with precision, and to consider what forces might be provoking us to amplify divisive messages.
So every time you want to “like” or share a piece of content,
imagine a tiny “pause” button hovering over the thumbs-up icon
on Facebook or the retweet symbol on Twitter. Hit it and ask
yourself, Am I responding to a meme meant to brand me as a partisan on a given issue? Have I actually read the article, or am I
simply reacting to an amusing or enraging headline? Am I sharing this piece of information only to display my identity for my
audience of friends and peers, to get validation through likes? If
so, what groups might be microtargeting me through my consumer data, political preferences and past behavior to manipulate me with content that resonates strongly?
Even if—especially if—you’re passionately aligned with or disgusted by the premise of a meme, ask yourself if sharing it is
worth the risk of becoming a messenger for disinformation meant
to divide people who might otherwise have much in common.
It is easy to assume that memes are innocuous entertainment, not powerful narrative weapons in a battle between democracy and authoritarianism. But these are among the tools of the
new global information wars, and they will only evolve as machine learning advances. If researchers can figure out what
would get people to take a reflective pause, it may be one of the
most effective ways to safeguard public discourse and reclaim
freedom of thought.

JOIN THE CONVERSATION ONLINE
Visit Scientific American on Facebook and Twitter
or send a letter to the editor: [email protected]