The Washington Post - USA (2021-10-27)



Facebook under fire


BY WILL OREMUS, CHRIS ALCANTARA, JEREMY B. MERRILL AND ARTUR GALOCHA

Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. It’s in the spotlight thanks to waves of revelations from the Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems.
But how exactly does it work, and what makes it so influential? While the phrase “the algorithm” has taken on sinister, even mythical overtones, it is, at its most basic level, a system that decides a post’s position on the news feed based on predictions about each user’s preferences and tendencies. The details of its design determine what sorts of content thrive on the world’s largest social network, and what types languish — which in turn shapes the posts we all create, and the ways we interact on its platform.
Facebook doesn’t release comprehensive data on the actual proportions of posts in any given user’s feed, or on Facebook as a whole. And each user’s feed is highly personalized to their behaviors. But a combination of internal Facebook documents, publicly available information and conversations with Facebook insiders offers a glimpse into how different approaches to the algorithm can dramatically alter the categories of content that tend to flourish.
The top post on a Facebook user’s news feed, shown as the biggest box, is a prized position based on thousands of data points related to the user and the post itself, such as the poster, reactions and comments.
As users scroll farther down the feed (the smaller boxes here), the algorithm dictates each post’s position. The algorithm is precisely tailored to each user but also reflects Facebook’s strategy to favor certain content or behavior, illustrated in the following feeds.
Since 2018, the algorithm has elevated posts that encourage interaction, such as ones popular with friends. This broadly prioritizes posts by friends and family and viral memes, but also divisive content.
This was a departure from Facebook’s previous strategy in the mid-2010s, which optimized for time spent on the site, and notably gave greater prominence to clickbait articles and professionally produced videos.
Each user’s feed reflects their expressed interests. For a subset of extremely partisan users, today’s algorithm can turn their feeds into echo chambers of divisive content and news, of varying reputability, that support their outlook.
Some critics argue a news feed that orders posts from newest to oldest is better for society. This wouldn’t prioritize divisive content, but could give greater space to more frequent low-engagement posters, such as that one distant friend with a new baby.
When Facebook launched the News Feed, in 2006, it was pretty simple. It showed a personalized list of activity updates from friends, like “Athalie updated her profile picture” and “James joined the San Francisco, CA network.” Most were automatically generated; there was no such thing as a “post,” just third-person status updates, like “Ezra is feeling fine.”
Starting in 2009, a relatively straightforward ranking algorithm determined the order of stories for each user, making sure that the juicy stuff — like the news that a friend was “no longer in a relationship” — appeared near the top.
Over the past 12 years, almost everything about the news feed algorithm has changed. But the principle of putting the juicy stuff at the top — or at least the stuff most likely to interest a given user — has remained. The algorithm has simply grown ever more sophisticated, to the point that today it can take in more than 10,000 different signals to make its predictions about a user’s likelihood of engaging with a single post, according to Jason Hirsch, the company’s head of integrity policy.
Yet the news feed ranking system is not a total mystery. Two crucial elements are entirely within the control of Facebook’s human employees, and depend on their ingenuity, their intuition and ultimately their value judgments. Facebook employees decide what data sources the software can draw on in making its predictions. And they decide what its goals should be — that is, what measurable outcomes to maximize for, and the relative importance of each.
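Those two human-controlled levers, which predicted outcomes count and how much each one counts, can be illustrated with a minimal sketch. The signal names, weights and prediction values below are hypothetical, invented for illustration; they are not Facebook’s actual ones.

```python
# Hypothetical sketch of a two-lever ranking system: engineers choose
# which predicted outcomes matter (the goals) and their relative
# importance (the weights), then posts are sorted by the total score.

# Lever 1: predicted outcomes per post. Values here are made up; in
# practice each would come from a model fed by thousands of signals
# about the user and the post.
predictions = {
    "post_a": {"like": 0.30, "comment": 0.05, "share": 0.02},
    "post_b": {"like": 0.10, "comment": 0.20, "share": 0.01},
}

# Lever 2: the relative importance of each outcome, set by humans.
goal_weights = {"like": 1.0, "comment": 4.0, "share": 6.0}

def score(post_id):
    """Combine a post's predicted outcomes into one ranking score."""
    return sum(goal_weights[g] * p for g, p in predictions[post_id].items())

# Feed order: highest combined score first.
ranked = sorted(predictions, key=score, reverse=True)
```

Changing the weights alone, without touching any prediction, reorders the feed, which is why these value judgments matter so much.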
Troves of internal documents have offered new insight into how Facebook makes those critical decisions, and how it thinks about and studies the trade-offs involved. The documents — disclosures made to the U.S. Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel — were obtained and reviewed by a consortium of news organizations, including The Washington Post. They have focused lawmakers’ attention on Facebook’s algorithm and whether it, and similar recommendation algorithms on other platforms, should be regulated.
Defending Facebook’s algorithm, the company’s global affairs chief, Nick Clegg, told ABC’s “This Week” earlier this month that it’s largely a force for good, and that removing algorithmic rankings would result in “more, not less” hate speech and misinformation in people’s feeds.
In its early years, Facebook’s algorithm prioritized signals such as likes, clicks and comments to decide which posts to amplify. Publishers, brands and individual users soon learned how to craft posts and headlines designed to induce likes and clicks, giving rise to what came to be known as “clickbait.” By 2013, upstart publishers such as Upworthy and ViralNova were amassing tens of millions of readers with articles designed specifically to game Facebook’s news feed algorithm.
Facebook realized that users were growing wary of misleading teaser headlines, and the company recalibrated its algorithm in 2014 and 2015 to downgrade clickbait and focus on new metrics, such as the amount of time a user spent reading a story or watching a video, and to incorporate surveys on what content users found most valuable. Around the same time, its executives identified video as a business priority, and used the algorithm to boost “native” videos shared directly to Facebook. By the mid-2010s, the news feed had tilted toward slick, professionally produced content, especially videos that would hold people’s attention.
In 2016, however, Facebook executives grew worried about a decline in “original sharing.” Users were spending so much time passively watching and reading that they weren’t interacting with each other as much. Young people in particular shifted their personal conversations to rivals such as Snapchat that offered more intimacy.
Once again, Facebook found its answer in the algorithm: It developed a new set of goal metrics that it called “meaningful social interactions,” designed to show users more posts from friends and family, and fewer from big publishers and brands. In particular, the algorithm began to give outsize weight to posts that sparked lots of comments and replies.
The downside of this approach was that the posts that sparked the most comments tended to be the ones that made people angry or offended them, the documents show. Facebook became an angrier, more polarizing place. It didn’t help that, starting in 2017, the algorithm had assigned reaction emoji — including the angry emoji — five times the weight of a simple “like,” according to company documents.
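That documented five-to-one weighting means a single angry reaction moved a post as much as five likes. A toy calculation (the posts and reaction counts are invented for illustration) shows how the weighting tilts ranking toward posts that provoke:

```python
# Per the company documents, reaction emoji (including "angry")
# counted five times as much as a plain like from 2017 on.
LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # applied to reaction emoji, angry included

def engagement_points(likes, reactions):
    """Total weighted engagement for one post."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# Hypothetical posts: one widely liked, one widely reacted to.
calm_post = engagement_points(likes=100, reactions=2)         # 110
provocative_post = engagement_points(likes=20, reactions=30)  # 170

# The provocative post outranks the calm one despite far fewer likes.
```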
“The goal of the Meaningful Social Interactions ranking change is in the name: improve people’s experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends,” Facebook spokesman Adam Isserlis said. “We’re continuing to make changes consistent with this goal, like new tests to reduce political content on Facebook based on research and feedback.”
While the choices behind Facebook’s news feed algorithm can broadly elevate certain types of content, the same algorithm will produce different results for every user, because it is built to learn from their individual behaviors. If you rarely click on videos in your feed, you’ll be far less likely to see a viral video than your friend who loves videos. If you spend most of your time interacting with Facebook Groups, posts from those groups will figure especially prominently in your feed.
Internal documents show Facebook researchers found that, for the most politically oriented 1 million American users, nearly 90 percent of the content that Facebook shows them is about politics and social issues. Those groups also received the most misinformation, especially a set of users associated with mostly right-leaning content, who were shown one misinformation post out of every 40, according to a document from June 2020.
One takeaway is that Facebook’s algorithm isn’t a runaway train. The company may not directly control what any given user posts, but by choosing which types of posts will be seen, it sculpts the information landscape according to its business priorities. Some within the company would like to see Facebook use the algorithm to explicitly promote certain values, such as democracy and civil discourse. Others have suggested that it develop and prioritize new metrics that align with users’ values, as with a 2020 experiment in which the algorithm was trained to predict what posts they would find “good for the world” and “bad for the world,” and optimize for the former.
Still others, including Haugen, would like to see Facebook’s power over the algorithm taken away altogether: They argue we’d all be better off with social media feeds that simply showed us all of our friends’ posts in reverse-chronological order. But even that would come with trade-offs: The users and institutions that post most frequently, with the largest existing audiences, would dominate our feeds, while worthy ideas and clever videos from those with smaller followings would have less of a chance of reaching people who might be interested.
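The contrast between the two approaches can be sketched in a few lines: reverse-chronological ordering ignores predicted interest entirely and sorts only by recency. The posts, timestamps and interest scores below are hypothetical.

```python
# Contrast engagement ranking with the reverse-chronological feed
# that Haugen and others favor. All values are invented examples.
posts = [
    {"id": "viral_meme", "posted_at": 1000, "predicted_interest": 0.9},
    {"id": "distant_friend_baby", "posted_at": 3000, "predicted_interest": 0.1},
    {"id": "news_article", "posted_at": 2000, "predicted_interest": 0.6},
]

# Engagement-ranked feed: highest predicted interest first.
by_engagement = sorted(posts, key=lambda p: p["predicted_interest"], reverse=True)

# Reverse-chronological feed: newest first, interest ignored.
by_recency = sorted(posts, key=lambda p: p["posted_at"], reverse=True)

# The low-engagement baby photo tops the chronological feed but
# sits last under engagement ranking, the trade-off described above.
```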
[email protected]
[email protected]
[email protected]
[email protected]

Kate Rabinowitz contributed to this
report.

How social media giant’s algorithm shapes our feeds

[Graphic: A glimpse into what posts get top billing and what gets obscured. Four illustrated feeds (Chronological, Meaningful social interactions, Time spent and Extreme partisanship) show how each ranking approach reorders the same posts, with each post’s size based on its rank in the News Feed.]
