
were isolated into “social” groups, in which they could see the preferences of others in their circle but had no information about outsiders, the choices of individual groups rapidly diverged. But the preferences of “nonsocial” groups, where no one knew about others’ choices, stayed relatively stable. In other words, social groups create a pressure toward conformity so powerful that it can overcome individual preferences, and by amplifying random early differences, it can cause segregated groups to diverge to extremes.
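That feedback loop is simple enough to reproduce in a toy simulation. The sketch below illustrates the mechanism under simplifying assumptions of our own (two options, majority copying); it is not the researchers' actual experimental design:

```python
import random

def run_group(social, n_agents=50, n_rounds=2000, conformity=0.9):
    """One isolated group repeatedly choosing between options A (0) and B (1)."""
    choices = [random.randint(0, 1) for _ in range(n_agents)]
    for _ in range(n_rounds):
        agent = random.randrange(n_agents)
        if social and random.random() < conformity:
            # "Social" group: the agent sees the others' choices and copies the majority.
            choices[agent] = int(sum(choices) * 2 > n_agents)
        else:
            # "Nonsocial" group: no information about others, so the agent
            # simply re-expresses a private preference.
            choices[agent] = random.randint(0, 1)
    return sum(choices) / n_agents  # fraction of the group choosing B

random.seed(1)
print("social groups:   ", [round(run_group(True), 2) for _ in range(8)])
print("nonsocial groups:", [round(run_group(False), 2) for _ in range(8)])
# Social groups typically lock in near 0.0 or 1.0, diverging to opposite
# extremes depending on random early majorities; nonsocial groups stay near 0.5.
```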
Social media follows a similar dynamic. We confuse popularity with quality and end up copying the behavior we observe. Experiments on Twitter by Bjarke Mønsted and his colleagues at the Technical University of Denmark and the University of Southern California indicate that information is transmitted via “complex contagion”: when we are repeatedly exposed to an idea, typically from many sources, we are more likely to adopt and reshare it. This social bias is further amplified by what psychologists call the “mere exposure” effect: when people are repeatedly exposed to the same stimuli, such as certain faces, they grow to like those stimuli more than those they have encountered less often.
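A toy threshold model conveys what makes complex contagion different from a simple one. The network and parameters below are illustrative assumptions of ours, not the setup of the Twitter experiments:

```python
import random

def random_network(n=300, out_links=5):
    """Undirected network in which each node links to a few random others."""
    nbrs = {i: set() for i in range(n)}
    for i in range(n):
        for j in random.sample([k for k in range(n) if k != i], out_links):
            nbrs[i].add(j)
            nbrs[j].add(i)
    return nbrs

def reach(nbrs, threshold, n_seeds=5):
    """Fraction of the network that adopts an idea when a node adopts
    only after `threshold` distinct neighbors have adopted it."""
    adopted = set(random.sample(range(len(nbrs)), n_seeds))
    changed = True
    while changed:
        changed = False
        for node, links in nbrs.items():
            if node not in adopted and len(links & adopted) >= threshold:
                adopted.add(node)
                changed = True
    return len(adopted) / len(nbrs)

random.seed(2)
net = random_network()
print("simple contagion (1 exposure needed): ", reach(net, threshold=1))
print("complex contagion (2 exposures needed):", reach(net, threshold=2))
# A single exposure lets a handful of seeds saturate the network; requiring
# confirmation from multiple sources keeps the idea from spreading unless
# neighbors reinforce one another.
```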
Such biases translate into an irresistible urge to pay attention to information that is going viral: if everybody else is talking about it, it must be important. In addition to showing us items that conform with our views, social media platforms such as Facebook, Twitter, YouTube and Instagram place popular content at the top of our screens and show us how many people have liked and shared something. Few of us realize that these cues do not provide independent assessments of quality.
In fact, programmers who design the algorithms for ranking memes on social media assume that the “wisdom of crowds” will quickly identify high-quality items; they use popularity as a proxy for quality. Our analysis of vast amounts of anonymous data about clicks shows that all platforms (social media, search engines and news sites) preferentially serve up information from a narrow subset of popular sources.

To understand why, we modeled how they combine signals for quality and popularity in their rankings. In this model, agents with limited attention, those who see only a given number of items at the top of their news feeds, are also more likely to click on memes ranked higher by the platform. Each item has intrinsic quality, as well as a level of popularity determined by how many times it has been clicked on. Another variable tracks the extent to which the ranking relies on popularity rather than quality. Simulations of this model reveal that such algorithmic bias typically suppresses the quality of memes even in the absence of human bias. Even when we want to share the best information, the algorithms end up misleading us.
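A stripped-down version of such a model fits in a few lines. The sketch below is a rough paraphrase under assumptions of ours (uniform random quality, a dash of random early engagement), not the published model itself:

```python
import random

def average_clicked_quality(pop_weight, n_items=100, n_agents=2000, attention=10):
    """Items have intrinsic quality in [0, 1] and a popularity (click count).
    The platform ranks by a blend of the two; each agent sees only the
    top `attention` items and clicks one of them at random."""
    quality = [random.random() for _ in range(n_items)]
    # Assumption: a little random early engagement before ranking kicks in.
    clicks = [random.randint(0, 5) for _ in range(n_items)]
    consumed = []
    for _ in range(n_agents):
        top_pop = max(clicks) or 1
        score = [pop_weight * clicks[i] / top_pop + (1 - pop_weight) * quality[i]
                 for i in range(n_items)]
        feed = sorted(range(n_items), key=lambda i: score[i], reverse=True)[:attention]
        choice = random.choice(feed)  # limited attention: click within the feed
        clicks[choice] += 1
        consumed.append(quality[choice])
    return sum(consumed) / len(consumed)

random.seed(3)
for w in (0.0, 0.5, 1.0):
    print(f"popularity weight {w:.1f}: average quality consumed = "
          f"{average_clicked_quality(w):.2f}")
# The more the ranking leans on popularity, the lower the quality of what
# agents end up reading: early random clicks snowball into lock-in, even
# though no agent here prefers low-quality items.
```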


ECHO CHAMBERS
Most of us do not believe we follow the herd. But our confirmation bias leads us to follow others who are like us, a dynamic that is sometimes referred to as homophily, a tendency for like-minded people to connect with one another. Social media amplifies homophily by allowing users to alter their social network structures through following, unfriending, and so on. The result is that people become segregated into large, dense and increasingly misinformed communities commonly described as echo chambers.
At OSoMe, we explored the emergence of online echo chambers through another simulation, EchoDemo. In this model, each agent has a political opinion represented by a number ranging from −1 (say, liberal) to +1 (conservative). These inclinations are reflected in agents’ posts. Agents are also influenced by the opinions they see in their news feeds, and they can unfollow users with dissimilar opinions. Starting with random initial networks and opinions, we found that the combination of social influence and unfollowing greatly accelerates the formation of polarized and segregated communities.
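In the spirit of EchoDemo, here is a minimal opinion dynamics sketch; the update rule and the parameter values are illustrative assumptions rather than the model's published details:

```python
import random

N, STEPS = 100, 20000
INFLUENCE = 0.02   # how far an agent moves toward an opinion it reads
TOLERANCE = 0.5    # opinion gap beyond which the agent unfollows instead

random.seed(4)
opinion = [random.uniform(-1, 1) for _ in range(N)]  # -1 = liberal, +1 = conservative
follows = {i: set(random.sample([j for j in range(N) if j != i], 10))
           for i in range(N)}

for _ in range(STEPS):
    i = random.randrange(N)
    j = random.choice(list(follows[i]))  # i reads a post from someone it follows
    if abs(opinion[i] - opinion[j]) < TOLERANCE:
        # Social influence: the reader's opinion shifts toward the post.
        opinion[i] += INFLUENCE * (opinion[j] - opinion[i])
    else:
        # Unfollow the dissimilar user and follow a random new one.
        follows[i].discard(j)
        candidates = [k for k in range(N) if k != i and k not in follows[i]]
        follows[i].add(random.choice(candidates))

for i in random.sample(range(N), 3):
    neighborhood = sorted(round(opinion[j], 2) for j in follows[i])
    print(f"agent {i}: opinion {opinion[i]:+.2f}, follows opinions {neighborhood}")
# Each agent ends up following almost exclusively users whose opinions match
# its own: polarized, segregated communities emerge from nothing more than
# influence plus selective unfollowing.
```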

Indeed, the political echo chambers on Twitter are so extreme that individual users’ political leanings can be predicted with high accuracy: you have the same opinions as the majority of your connections. This chambered structure efficiently spreads information within a community while insulating that community from other groups. In 2014 our research group was targeted by a disinformation campaign claiming that we were part of a politically motivated effort to suppress free speech. This false charge spread virally mostly in the conservative echo chamber, whereas debunking articles by fact-checkers were found mainly in the liberal community. Sadly, such segregation of fake news items from their fact-check reports is the norm.
Social media can also increase our negativity. In a recent laboratory study, Robert Jagiello, also at Warwick, found that socially shared information not only bolsters our biases but also becomes more resilient to correction. He investigated how information is passed from person to person in a so-called social diffusion chain. In the experiment, the first person in the chain read a set of articles about either nuclear power or food additives. The articles were designed to be balanced, containing as much positive information (for example, about less carbon pollution or lon-

Information that passes from person to person along a chain becomes more negative and more resistant to correction.
