
PARTICIPATING IN THE SOLUTION
IN A HEALTHY INFORMATION commons, people would still be free to express what they want—but information that is designed to mislead, incite hatred, reinforce tribalism or cause physical harm would not be amplified by algorithms. That means it would not be allowed to trend on Twitter or in the YouTube content recommender. Nor would it be chosen to appear in Facebook feeds, Reddit searches or top Google results.
Until this amplification problem is resolved, it is precisely our willingness to share without thinking that agents of disinformation will use as a weapon. Hence, a disordered information environment requires that every person recognize how he or she, too, can become a vector in the information wars and develop a set of skills to navigate communication online as well as offline.
Current conversations about public awareness often focus on media literacy, frequently with a paternalistic framing that the public simply needs to be taught how to be smarter consumers of information. Instead online users would be better served by developing cognitive “muscles” in emotional skepticism and by training themselves to withstand the onslaught of content designed to trigger base fears and prejudices.
Anyone who uses Web sites that facilitate social interaction would do well to learn how they work—and especially how algorithms determine what users see by “prioritiz[ing] posts that spark conversations and meaningful interactions between people,” as a January 2018 Facebook update about its rankings put it. I would also recommend that everyone try to buy an advertisement on Facebook at least once. The process of setting up a campaign helps to drive understanding of the granularity of information available. You can choose to target a subcategory of people as specific as women, aged between 32 and 42, who live in the Raleigh-Durham area of North Carolina, have preschoolers, have a graduate degree, are Jewish and like Kamala Harris. The company even permits you to test these ads in environments that allow you to fail privately. These “dark ads” let organizations target posts at certain people, but the posts do not sit on that organization’s main page. This makes it difficult for researchers or journalists to track what posts are being targeted at different groups of people, which is particularly concerning during elections.
Facebook events are another conduit for manipulation. One of the most alarming examples of foreign interference in a U.S. election was a protest that took place in Houston, Tex., yet was entirely orchestrated by trolls based in Russia. They had set up two Facebook pages that looked authentically American. One, named “Heart of Texas,” supported secession; it created an “event” for May 21, 2016, labeled “Stop Islamification of Texas.” The other page, “United Muslims of America,” advertised its own protest, entitled “Save Islamic Knowledge,” for the exact same time and location. The result was that two groups of people came out to protest each other, while the protests’ real creators celebrated their success at amplifying existing tensions in Houston.

Another popular tactic of disinformation agents is dubbed “astroturfing.” The term initially referred to people who wrote fake reviews for products online or tried to make a fan community appear larger than it really was. Now automated campaigns use bots, the sophisticated coordination of passionate supporters and paid trolls, or a combination of both to make it appear that a person or policy has considerable grassroots support. By making certain hashtags trend on Twitter, the campaigns hope that particular messaging will get picked up by the professional media and will direct amplification to bully specific people or organizations into silence.
Understanding how each one of us is subject to such campaigns—and might unwittingly participate in them—is a crucial first step to fighting back against those who seek to upend a sense of shared reality. Perhaps most important, though, we need to accept how vulnerable our society is to manufactured amplification, and to do so sensibly and calmly. Fearmongering will only fuel more conspiracy theorizing and continue to drive down trust in quality information sources and institutions of democracy. There are no permanent solutions to weaponized narratives. Instead we need to adapt to this new normal. Building resiliency against a disordered information environment should be thought of in the same vein as putting on sunscreen: a habit that society developed over time and then adjusted as additional scientific research became available.

[Graphic: the life cycle of information disorder, in four phases. Credit: Jen Christiansen; source: “Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking,” by Claire Wardle and Hossein Derakhshan. Council of Europe, October 2017]

Creation: when the message is designed.
Production: when the message is turned into a media product.
Distribution: when the product is pushed out or made public.
Reproduction: when recipients share the message onward, and the cycle continues.

HOW DISINFORMATION BECOMES MISINFORMATION
The spread of false or misleading information is often dynamic. It starts when a disinformation agent engineers a message to cause maximum harm—for example, designing real-life protests that put opposing groups in public conflict. In the next phase, the agent creates “Event” pages on Facebook. The links are pushed out to communities that might be intrigued. People who see the event are unaware it is a false premise and share it with their communities, using their own framing. This reproduction continues.

MORE TO EXPLORE
Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking. Claire Wardle and Hossein Derakhshan. Council of Europe, October 2017.
Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Yochai Benkler, Robert Faris and Hal Roberts. Oxford University Press, 2018.
Memes to Movements: How the World’s Most Viral Media Is Changing Social Protest and Power. An Xiao Mina. Beacon Press, 2019.
Priming and Fake News: The Effects of Elite Discourse on Evaluations of News Media. Emily Van Duyn and Jessica Collier in Mass Communication and Society, Vol. 22, No. 1, pages 29–48; 2019.
FROM OUR ARCHIVES
Clicks, Lies and Videotape. Brooke Borel; October 2018.
scientificamerican.com/magazine/sa