The Washington Post (USA), May 15, 2022



“We the users deserve transparency,” says Haugen. “We deserve to have the same level of nutritional labeling for our informational products as we have for our nutritional products. We deserve to see what goes into the algorithms. We deserve to see what the consequences of those things are. And right now, we’re forced to just trust Facebook.”

It’s more than an academic issue. “Platforms get to experiment on their users all the time without letting them know experiments are going on,” says Laura Edelson, a researcher at New York University whose Facebook account was cut off by the company for studying political advertisements and misinformation. “Consumers deserve notice.”

In the U.S., at least five bills have been introduced in Congress that focus on responsibility for algorithms. One of them is the bipartisan Platform Accountability and Transparency Act (PATA), which would force companies to open up their algorithms by turning over information about how they work — and their consequences — to researchers and the public.

“We agree people should have control over what they see on our apps and we’ll continue working on new ways to make them more transparent, while also supporting regulation that sets clear standards for our industry in this area,” said Otway, the spokeswoman for Instagram.

Now it’s time to hold them to it.

Instead, they should focus on the signals where we explicitly say we’re interested, such as pressing like or following an account.
Apps also need to provide us better ways to give negative feedback on their algorithmic choices. Right now it’s too hard to tell Instagram or Facebook you don’t want something. It could move a “no thank you” button out from behind the menu screen to right next to the Like button.
Instagram is starting down this path. It tells me it is at the early stages of exploring a control that would allow people to select keywords to filter from their recommendations. To use their example, if you asked that the word “bread” be removed from your recommendations, Instagram wouldn’t show posts containing the word “bread.”
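
To make the mechanics concrete, here is a minimal sketch of how such a keyword mute might work. The Post structure and the filter_recommendations function are hypothetical stand-ins, not Instagram's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    caption: str

def filter_recommendations(posts, muted_keywords):
    """Drop recommended posts whose captions contain a muted keyword.

    Hypothetical sketch: a real control would presumably also match
    hashtags, image labels and other post metadata, not just captions.
    """
    muted = {keyword.lower() for keyword in muted_keywords}
    return [
        post for post in posts
        if not any(word in post.caption.lower() for word in muted)
    ]

# Muting "bread" removes the second post from the recommendations.
feed = [Post("a_friend", "Beach day!"),
        Post("a_baker", "Fresh bread, straight from the oven")]
print(filter_recommendations(feed, {"bread"}))
# -> [Post(author='a_friend', caption='Beach day!')]
```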
I’m also intrigued by a bolder idea: Let us choose between competing algorithms to order the information on our feeds. Algorithms can be programmed to show or bury content. Some people might want to see Donald Trump, while others might want feeds that are completely politics-free. It could work kind of like the app store on your phone. Different algorithm developers could compete to organize your Instagram, Facebook or Twitter feed, and you settle on the one you like the best. Or maybe you switch from time to time, depending on your mood.
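
Expressed in code, this is a pluggable ranking layer. In the sketch below (with invented newest_first and most_liked rankers, not any platform's real API), the feed is a neutral list of posts, and the user's chosen algorithm is just a sorting function applied at display time:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Post:
    author: str
    timestamp: float  # seconds since epoch
    like_count: int

# An "algorithm" here is just a function that orders a list of posts.
RankingFn = Callable[[List[Post]], List[Post]]

def newest_first(posts: List[Post]) -> List[Post]:
    """Reverse chronological order, like Instagram's revived main feed."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def most_liked(posts: List[Post]) -> List[Post]:
    """A simple engagement-based ordering."""
    return sorted(posts, key=lambda p: p.like_count, reverse=True)

# The app-store-style registry a user could pick from.
ALGORITHMS: Dict[str, RankingFn] = {
    "newest_first": newest_first,
    "most_liked": most_liked,
}

def build_feed(posts: List[Post], chosen_algorithm: str) -> List[Post]:
    """Render the same posts through whichever ranker the user selected."""
    return ALGORITHMS[chosen_algorithm](posts)
```

A design like this would let competing developers differ only in how they sort, so a third-party ranker would never need write access to the posts themselves.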
These are all product fixes, but bigger solutions have to address another problem: We actually know very little about how these algorithms work. Right now, researchers and governments — not to mention we the users — can’t see inside their black box for ourselves.

information.

On my son’s account, I witnessed another unintended consequence: what Haugen calls “engagement hackers.” They’re a kind of spammer who has learned how to hijack Instagram’s logic, which encourages them to post shocking images to elicit reactions from viewers and thus build their credibility with the algorithm.

Several of the accounts behind the images Instagram recommended to my son’s account appear not to be parents of the children featured in the images. One image I’ve seen repeatedly, of a baby with what appear to be severe lip blisters, was shared by an account called kids_past (with 117,000 followers) and another called cutes.babiesz (with 32,000 followers). The captions on the photos don’t make sense with the image, and don’t appear to be related to the other children featured on the account. Both also suggest in their biographies that they’re available for paid promotions. Neither account replied to messages asking where it had gotten the blister image.

Instagram doesn’t completely throw caution to the wind. It has community standards for content, including guidelines on what kinds of themes can be included in posts that its algorithms recommend. It says content that’s either “clickbait” or “engagement bait” is not allowed. In April the company announced a new effort to down-rank content that is not “original.”

Haugen says Facebook doesn’t have leadership that can ask hard questions about its impact — and accept hard answers.


FOWLER FROM G3


An even better idea: Give us an algorithmic reset button. I understand many people really enjoy social media recommendations, especially on TikTok. So give us the power to clear what the algorithm thinks about us without deleting the whole account and losing our friends, just like you can clear your history and cookies in a Web browser.
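
In data terms, such a reset is cheap, provided the learned signals are stored apart from the social graph. Here is a minimal sketch, assuming a hypothetical UserProfile that keeps the two separate:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    username: str
    follows: set = field(default_factory=set)               # kept on reset
    inferred_interests: dict = field(default_factory=dict)  # learned by the ranker
    watch_history: list = field(default_factory=list)       # dwell-time signals

    def reset_algorithm(self) -> None:
        """Forget everything the recommender inferred about this user,
        without touching who they follow (the browser-history analogy)."""
        self.inferred_interests.clear()
        self.watch_history.clear()

# The reset wipes the inferred signals but leaves the friend list intact.
me = UserProfile("example_user", follows={"a_friend"},
                 inferred_interests={"bread": 0.92})
me.reset_algorithm()
assert me.follows == {"a_friend"} and not me.inferred_interests
```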
To give users more control, apps could also stop using unconscious actions — like dwell time while you’re doomscrolling — as signals to feed recommendations.

Instagram declined to let me speak with Mosseri for this column. Let’s hope he’s open to feedback.

Here’s a start: Let us just turn off algorithms. In March, Instagram announced it would bring back a version of its main feed that sorts posts in reverse chronological order. That’s good. But to completely shut off Instagram’s recommended posts from accounts you don’t follow — and make at least your main feed a friends-only experience — you have to select the Favorites-only view, and put all your friends in that category.

“When you acknowledge power, you also then acknowledge responsibility. And Facebook doesn’t want to allocate any more time to features that don’t cause it to grow.”

How to make algorithms accountable

So how can we the users take back power over algorithms? From researchers and lawmakers alike, there’s a growing collection of good ideas.

JABIN BOTSFORD/THE WASHINGTON POST
Facebook whistleblower Frances Haugen, seen during President Biden’s State of the Union address this year, says that people deserve transparency about the algorithms in our social media platforms.

Days later, the company acted again, covering up a Trump tweet about protests over the death of George Floyd that warned “when the looting starts, the shooting starts.” More such actions followed.

Later that year, Gadde was involved in a decision that drew widespread criticism. In October 2020, the New York Post published an exclusive story based on material found on a laptop allegedly belonging to Biden’s son Hunter. Gadde and other trust and safety executives suspected the story was based on material obtained through hacking and therefore violated the company’s rules against publishing such material.

Anxious to avoid a repeat of Russia leaking hacked material during the 2016 election, Twitter executives took the unusual step of temporarily locking the newspaper’s Twitter account and blocking Twitter users from sharing a link to the story.

Even within liberal Twitter, the decision was controversial, two of the people said. It was not entirely clear the materials had been hacked, nor that the New York Post had participated in any hacking. A Post investigation later confirmed that thousands of emails taken from the laptop were authentic.

Amid mounting outrage among conservatives, Gadde conferred with Dorsey and announced an 11th-hour change to the hacked-materials policy: The company would remove only content posted by the hackers themselves or others acting in concert with them. It also would label more questionable tweets.

Dorsey later tweeted that the decision to block mention of the New York Post story was a mistake. Recently, Musk tweeted that “suspending the Twitter account of a major news organization for publishing a truthful story was obviously incredibly inappropriate.”

Now employees are worried Musk will undo much of the trust and safety team’s work. Many people silenced by policies adopted under Gadde are clamoring for Musk to avenge them. Johnson, for example, said he has appealed via text to Jared Birchall, head of Musk’s family office, asking when his account might be restored.

Birchall did not immediately respond to a request for comment.

Though Johnson does not plan to tweet, he said, he wants his account back on principle. According to text messages first reported by the Wall Street Journal and subsequently viewed by The Post, Birchall replied: “Hopefully soon.”

Birchall also shed light on one of the biggest questions looming over the Musk takeover: Will Musk undo Gadde’s decision to ban Trump? At a recent TED conference, Musk said he supports temporary bans over permanent ones.

Musk “vehemently disagrees with censoring,” Birchall texted to Johnson. “Especially for a sitting president. Insane.”

A year later, nearly every other major platform banned Jones. Twitter initially declined to do so, saying Jones hadn’t broken any of its rules. Within a month, however, Gadde reversed course, banishing Jones for “abusive behavior.” In a 2019 appearance on the “Joe Rogan Experience” podcast, Gadde explained that Jones had earned “three strikes” by posting videos that did violate Twitter’s rules, including one she deemed an incitement to violence against the news media.

Jones did not respond to a request for comment. At the time, he called Infowars “a rallying cry for free speech in America,” adding that he was “very honored to be under attack.”

Gadde and her team later escalated the company’s efforts to fight disinformation — along with spam and fake accounts — after news broke that Twitter, Facebook and other platforms had been exploited by Russian operatives during the 2016 campaign. The company began removing a million accounts a day in a broad effort to crack down on abuse.

In a move described as signature Gadde, Twitter also launched an initiative called “Healthy Conversations” that sought feedback from hundreds of experts about how to foster more civil dialogue. That effort led to updated hate speech policies that banned “dehumanizing speech” — such as racial slurs and negative stereotypes based on religion, caste or sexual orientation — because it could have the effect of “normalizing serious violence,” according to a company blog post.

In subsequent years, Dorsey became increasingly absent and would effectively outsource a growing number of decisions to Gadde, including those around content moderation, three of the people said.

Gadde also was key to a 2019 decision to ban political advertising on the platform, according to four people familiar with the decision, arguing that politicians should reach broad audiences on the merits of their statements rather than by paying for them. Other companies copied the move, enacting temporary pauses during the 2020 election.

Throughout Trump’s presidency, at the company’s monthly town halls, Twitter employees regularly called on Gadde to ban Trump, accusing him of bullying and promoting misinformation. Gadde argued that the public had a right to hear what public figures such as Trump have to say — especially when they say horrible things, the people said.

Meanwhile, Gadde and her team were working with engineers to develop a warning label to cover up tweets — even from world leaders such as Trump — if they broke the company’s rules. Users would see the tweet only if they chose to click on it. They saw it as a middle ground between banning accounts or removing content, and leaving it up.

In May 2020, as Trump’s reelection campaign got underway, Twitter decided to slap a fact-checking label on a Trump tweet that falsely claimed that mail-in ballots are fraudulent — the first action by a technology company to punish Trump for spreading misinformation.

With nearly 90 million followers at his peak, Trump routinely lobbed tweets at political opponents, journalists and even private citizens, triggering waves of online harassment.

After Trump’s election, Gadde and Dorsey convened a “free speech roundtable” at the company’s San Francisco headquarters, where top Twitter executives heard from Citron, former New York Times editor Bill Keller and Tom Goldstein, former dean of the graduate journalism school at University of California at Berkeley. During the meeting, which has not been previously reported, Citron expressed concerns about online harassment, especially directed at journalists.

Gadde “understood how speech could silence speech,” Citron recalled, “and could be incredibly damaging to people’s lives.”

Goldstein declined to comment on the meeting. Keller said the group discussed how new standards could bring order to the “wild west” of social media.

Internally, some employees faulted Gadde for ineffectiveness, as rules were unevenly applied across the massive platform. Three former workers said her trust and safety unit did not coordinate well with other teams that also policed the site.

Even as the company took action to limit hate speech and harassment, Gadde resisted calls to police mere misinformation and falsehoods — including by the new president.

“As much as we and many of the individuals might have deeply held beliefs about what is true and what is factual and what’s appropriate, we felt that we should not as a company be in the position of verifying truth,” Gadde said on a 2018 Slate podcast, responding to a question about right-wing media host Alex Jones, who had promoted the falsehood on his show, Infowars, that the Sandy Hook school shooting was staged.

In an op-ed published in The Post, Gadde wrote that she was “seriously troubled by the plight of some of our users who are completely overwhelmed by those who are trying to silence healthy discourse in the name of free expression.”

By then, Gadde had been promoted to general counsel, overseeing all legal and trust and safety matters facing the company.

In response to GamerGate, Twitter streamlined the company’s complicated nine-step process for reporting abuse and tripled the number of people on its trust and safety team, as well as other teams that protect users, according to the op-ed and other reports at the time.

But the moves to clamp down on harassment soon stirred fresh controversy. Internal emails obtained by BuzzFeed in 2017 showed Gadde and other executives engaged in messy, seemingly ad hoc deliberations over whether to shut down the accounts of alt-right provocateur Milo Yiannopoulos and right-wing flamethrower Chuck C. Johnson, who had tweeted that he was raising money in the hopes of “taking out” a leader of the Black Lives Matter movement.

Johnson, who says his comment was part of a “journalistic project,” has complained that Twitter never offered a clear reason for the ban. He sued the company over it and lost. He has since abandoned his alliance with Trump and declared his support for President Biden, he said, leading to attacks online. Because his Twitter account is still suspended, Johnson argues he is unable to defend himself.

About the same time, Twitter was confronted with another conundrum: the candidacy of Trump, who made Twitter central to his 2016 presidential campaign.
Gadde is a previous donor to Kamala D. Harris and other Democrats, and in 2017 she helped lead Twitter’s $1.59 million donation to the ACLU to fight Trump’s executive order banning immigration from majority Muslim countries.

Among employees, Gadde is known for taking a legalistic yet pragmatic approach to content moderation. As with Trump after the Jan. 6 insurrection, she often has argued against limiting speech and has rejected colleagues who wanted to take a stronger approach to removing content, moving to do so only after careful consideration.

For years, she has been the animating force pushing Twitter to champion free expression abroad. In India and Turkey, for example, her team has resisted demands to remove content critical of repressive governments. In 2014, Gadde made Twitter the only Silicon Valley company to sue the U.S. government over gag orders on what tech companies could say publicly about federal requests for user data related to national security. (Five other companies settled.)

“She wasn’t a censorship warrior or a free expression warrior,” said a former colleague familiar with Gadde’s approach. “She is pragmatic but not doctrinaire.”

A dedication to free speech has been part of Twitter’s DNA since its founding in San Francisco 16 years ago. Early executives were such believers that they famously referred to Twitter as “the free speech wing of the free speech party.” That approach made Twitter ripe for abuse in its early days, and the platform developed a reputation as unsafe — particularly for high-profile women, who endured threats of rape and other sexist attacks.

Back then, Twitter’s attitude was “we don’t touch speech,” said University of Virginia law professor Danielle Citron, an expert on online harassment. In 2009, Citron prepared a three-page, single-spaced memo for the Twitter C-suite, explaining the legal definition of criminal harassment, true threats and stalking.

Gadde joined Twitter’s legal team two years later, leaving her post at the Silicon Valley firm Wilson Sonsini Goodrich & Rosati. People who worked with her said her move was inspired by the Arab Spring uprising, when pro-democracy activists used Twitter and other social platforms to organize protests across the Middle East. The Arab Spring solidified the belief among Twitter’s leaders that their job was to protect speech, not police it.

Twitter was soon engulfed in scandal, however. In 2014, online trolls launched a brutal campaign against women in the video game industry. The attacks — which came to be known as “GamerGate” — were carried out on multiple tech platforms. But they were most visible on Twitter, where women received highly graphic threats of violence, some including the woman’s address or an exact time of attack.

The incident was a wake-up call for the company, said software engineer Brianna Wu, one of the women targeted in GamerGate, who worked with Twitter to improve the site.

“Twitter’s left wing bias.” Musk’s legions of followers have tweeted calls for her firing, some of them racist. (Gadde, 47, is Indian American.)

Musk on Tuesday signaled he would undo the permanent ban on Trump if he completes the acquisition of Twitter, potentially undoing years of work from Gadde and her team. “I think that was a mistake,” he said at an event hosted by the Financial Times.

Twitter colleagues describe Gadde’s work as difficult but necessary and unmotivated by political ideology. Defenders say her team, known as the trust and safety organization, has worked painstakingly to rein in coronavirus misinformation, bullying and other harmful speech on the site, moves that necessarily limit some forms of expression. They have also disproportionately affected right-leaning accounts.

But Gadde also has tried to balance the desire to protect users with the values of a company built on the principle of radical free speech, they say. She pioneered strategies for flagging harmful content without removing it, adopting warning labels and “interstitials,” which cover up tweets that break Twitter’s rules and give people control over what content they see — strategies copied by Twitter’s much larger rival, Facebook.

Many researchers and experts in online harassment say Gadde’s policies have made Twitter safer for its roughly 229 million daily users and say they fear Musk will dismantle them if the sale goes through.

“If Musk takes things in the direction he has been signaling — which is a rather simplistic view that more or less anything goes in the name of free speech — we will certainly see the platform go back to square one,” said Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University.

Whatever happens to her policies, Gadde signaled at a staff meeting late last month that her days at Twitter may be numbered, telling employees that she would work to protect their jobs as long as she is around, according to a person who attended the meeting.

She did not respond to requests for comment. Twitter declined to comment. Musk did not respond to a request for comment.

On Monday, Musk tweeted: “Twitter obv has a strong left wing bias.”

This story is based on interviews with 10 current and former Twitter employees, as well as others familiar with decisions made by Gadde and her team, who spoke on the condition of anonymity to describe private company discussions.

“I do believe very strongly — and our rules are based on this framework — that free expression is a fundamental right, that everyone has a voice and they should be able to use it,” said Gadde in a 2019 interview with The Washington Post. “There is a line between doing that and committing what we call abuse or harassment, and crossing over into a place where you’re preventing someone else from using their voice.”

TWITTER FROM G1


Twitter’s top lawyer has long balanced free speech, safety


MARTINA ALBERTAZZI/BLOOMBERG NEWS
Vijaya Gadde, chief legal officer of Twitter, speaks during the 2019 Wall Street Journal Tech Live conference in Laguna Beach, Calif.

“She wasn’t a censorship warrior or a free expression warrior. She is pragmatic but not doctrinaire.”
Vijaya Gadde’s former colleague, describing the lawyer’s approach