Scientific American - USA (2020-12)

This example illustrates a minefield of cognitive biases. We prefer information from people we trust, our in-group. We pay attention to and are more likely to share information about risks—for Andy, the risk of losing his job. We search for and remember things that fit well with what we already know and understand. These biases are products of our evolutionary past, and for tens of thousands of years, they served us well. People who behaved in accordance with them—for example, by staying away from the overgrown pond bank where someone said there was a viper—were more likely to survive than those who did not.
Modern technologies are amplifying these biases in harmful ways, however. Search engines direct Andy to sites that inflame his suspicions, and social media connects him with like-minded people, feeding his fears. Making matters worse, bots—automated social media accounts that impersonate humans—enable misguided or malevolent actors to take advantage of his vulnerabilities.
Compounding the problem is the proliferation of online information. Viewing and producing blogs, videos, tweets and other units of information called memes has become so cheap and easy that the information marketplace is inundated. Unable to process all this material, we let our cognitive biases decide what we should pay attention to. These mental shortcuts influence which information we search for, comprehend, remember and repeat to a harmful extent.
The need to understand these cognitive vulnerabilities and how algorithms use or manipulate them has become urgent. At the University of Warwick in England and at Indiana University Bloomington's Observatory on Social Media (OSoMe, pronounced "awesome"), our teams are using cognitive experiments, simulations, data mining and artificial intelligence to
Consider Andy, who is worried about contracting COVID-19. Unable to read all the articles he sees on it, he relies on trusted friends for tips. When one opines on Facebook that pandemic fears are overblown, Andy dismisses the idea at first. But then the hotel where he works closes its doors, and with his job at risk, Andy starts wondering how serious the threat from the new virus really is. No one he knows has died, after all. A colleague posts an article about the COVID "scare" having been created by Big Pharma in collusion with corrupt politicians, which jibes with Andy's distrust of government. His Web search quickly takes him to articles claiming that COVID-19 is no worse than the flu. Andy joins an online group of people who have been or fear being laid off and soon finds himself asking, like many of them, "What pandemic?" When he learns that several of his new friends are planning to attend a rally demanding an end to lockdowns, he decides to join them. Almost no one at the massive protest, including him, wears a mask. When his sister asks about the rally, Andy shares the conviction that has now become part of his identity: COVID is a hoax.

Filippo Menczer is Distinguished Professor of Informatics and Computer Science and director of the Observatory on Social Media at Indiana University Bloomington. He studies the spread of disinformation and develops tools for countering social media manipulation.

Thomas Hills is a professor of psychology and director of the Behavioral and Data Science master's program at the University of Warwick in England. His research addresses the evolution of mind and information.