Rolling Stone (USA), February 2020


Photograph by Michele Hartshorn

ON THE EVE of the 2018 midterm elections, a strange website appeared out of nowhere with an ominous message. Claiming to be the American division of the Internet Research Agency, the Kremlin-backed disinformation factory, the site said the IRA had thousands of Facebook, Twitter, and Reddit accounts pushing propaganda, as well as “allies and spoilers” embedded in various campaigns. The website then posted a list of all Senate races and said the outcome was already decided well before all votes had been cast. It also published a spreadsheet listing dozens of social media accounts that were supposedly part of the IRA’s campaign to disrupt the midterms. There were obvious errors in the list of “rigged” elections — Sen. Jeff Flake (R-Ariz.), who had announced his retirement, was listed as winning — but the social media accounts looked real, from Jordan Peterson fan pages to Instagram accounts named “Redneck Army” and “Proud to Be Black.” The point of all this, it seemed, was to cast fresh doubt on the midterms. “We control the voting and counting systems,” the supposed IRA statement read. “We are choosing for you.”

Only it didn’t work. By the time the list of social media accounts was released, most of the accounts were inactive. Acting on a tip from the FBI a few days before the election, Facebook had investigated and removed the suspicious accounts.

Heading into the 2016 election, the major tech companies either pretended the disinformation problem didn’t exist or insisted there was nothing they could do about it. Foreign influence operations ran wild. An infamous example was the IRA-run Twitter account @TEN_GOP, which was apparently registered to a Russian cellphone number. “It was the Wild West,” says Ben Nimmo, director of investigations at Graphika, a social media analysis firm. “Across the major platforms, they had very broad latitude to get away with stuff.” Mark Zuckerberg denied Facebook had a Russian interference problem even after the election — until his company did its due diligence and Zuckerberg was hauled before Congress and apologized.

“In 2016, we missed the threat and weren’t ready,” Nathaniel Gleicher, Facebook’s director of cybersecurity policy, tells me during an hourlong interview at the company’s sprawling D.C. office. Gleicher joined Facebook in early 2018 and built a team to combat disinformation that now numbers several dozen people worldwide. Facebook’s new rules have led to a dramatic increase in the number of takedowns — from one in 2017 to more than 50 in 2019. By the midterms, the three biggest social media platforms — Facebook, Twitter, and Google, which owns YouTube — had created internal teams devoted to rooting out disinformation and influence operations. Gleicher mentions the “IRA in the USA” takedown as an example of the level of cooperation between government and tech companies that didn’t exist four years ago. “That is a really good sign of how things have changed,” he says.

Yet in the ways they’ve chosen to police their platforms, the tech companies have left plenty of openings for disinformation to spread. In Facebook’s case, Gleicher says the company has chosen to root out bad actors — whether they’re Russian trolls, e-criminals from Iran, or clickbait profiteers here in the U.S. — not by the content they post but by the behavior of the people running the accounts. The company looks for what it calls “inauthentic behavior,” which includes creating fake accounts, masking the true identity of the person or group operating a Facebook page, and using a network of pages in close coordination to game Facebook’s algorithm and reach a larger audience. “We have articulated a set of behaviors that are deceptive, that mislead users, and that violate our policies,” Gleicher says. “At its core, it doesn’t matter who’s doing it. It doesn’t matter what content they’re sharing. It doesn’t matter what they believe or don’t believe.”

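Facebook has not published the detectors behind this behavior-based approach, but a minimal sketch in Python can illustrate one signal the passage describes: separate pages sharing the same links in near-lockstep. Everything below (the account names, the links, and the 0.7 cutoff) is invented for illustration; it is not Facebook’s actual pipeline.

    # Toy coordination check: flag pairs of accounts whose shared-link sets
    # overlap far more than organic accounts' typically would.
    from itertools import combinations

    # Hypothetical data: account -> set of URLs it posted in one time window.
    shared_links = {
        "page_a": {"ex.com/1", "ex.com/2", "ex.com/3", "ex.com/4"},
        "page_b": {"ex.com/1", "ex.com/2", "ex.com/3", "ex.com/4"},
        "page_c": {"ex.com/1", "ex.com/2", "ex.com/3", "ex.com/9"},
        "organic_user": {"ex.com/2", "other.org/news"},
    }

    def jaccard(a: set, b: set) -> float:
        """Overlap of two link sets: size of intersection over size of union."""
        return len(a & b) / len(a | b)

    THRESHOLD = 0.7  # arbitrary cutoff for this sketch
    for (name1, links1), (name2, links2) in combinations(shared_links.items(), 2):
        score = jaccard(links1, links2)
        if score >= THRESHOLD:
            print(f"possible coordination: {name1} <-> {name2} ({score:.2f})")

In practice a signal like this would be only one feature among many; posting times, account creation dates, and shared infrastructure matter as much as content overlap.
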
But that position relieves Facebook of any obligation to police much of the content that appears on its platform. (The company has strict policies for hate speech, terrorist propaganda, and other dangerous material.) And Congress has yet to act, despite lawmakers offering a slew of proposals to regulate social media companies and rein in disinformation, foreign and domestic. One proposed solution is treating social media companies like TV stations, with the same rigorous transparency rules. Right now, Facebook is under no obligation to remove a misleading ad or meme if it doesn’t violate the company’s behavior-centric guidelines.

Facebook’s critics call this a cop-out, and the company’s announcement last year that it would not fact-check ads by elected officials and political candidates, even if they contained blatant lies, only fueled that criticism. These critics say the company is afraid of angering conservatives, who are quick to cry censorship despite no evidence to back the claim, while putting the billions to be made in political advertising over protecting truth in civil discourse. “It’s very commercially convenient to take the position that Mark Zuckerberg took,” says Dipayan Ghosh, a former privacy and public-policy adviser at Facebook who assisted the Obama White House with tech policy. “It allows him to keep conservatives on his side, and he gets a lot of money from digital advertising. They’re going to do everything they can to claim that.”

ON DECEMBER 19TH, @TrumpWarRoom, a rapid-response account run by President Trump’s re-election campaign, sent an explosive tweet. According to @TrumpWarRoom, James Clyburn of South Carolina, the third-ranking Democrat in the House, had called for hanging the president on live TV. The pro-

borders, clear enemies, or rules of engagement, a daily struggle to protect the integrity of elections and to reassure Americans their democracy is safe. Foote’s relationship with the feds has dramatically improved since the FBI first called in late 2016. Inyo County is now part of an information-sharing network backed by DHS that pushes out technical alerts. Foote says she and her team receive intel about new threats almost every day. Twenty-four hours after a U.S. drone strike killed Iran’s top military general in early January, DHS briefed election officials on potential retaliatory cyberattacks by Iran. Another DHS tool, called an Albert sensor, alerts her IT team to malware attacks. Foote knows whom to call at the FBI or DHS if an attack happens.

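The article doesn’t detail how an Albert sensor works internally; broadly, sensors of this kind watch network traffic and match it against indicators supplied by threat-intelligence feeds. The Python sketch below shows only that matching idea, with made-up indicators and traffic records (the IP sits in a documentation range, and the domains are hypothetical).

    # Minimal indicator-matching sketch, in the spirit of a network sensor.
    KNOWN_BAD = {
        "203.0.113.66",         # stand-in IP for a command-and-control server
        "malware-drop.example", # hypothetical domain from a threat-intel feed
    }

    traffic_log = [
        {"src": "10.1.2.3", "dst": "198.51.100.7", "host": "countyclerk.example"},
        {"src": "10.1.2.9", "dst": "203.0.113.66", "host": "malware-drop.example"},
    ]

    def scan(records):
        """Yield an alert for any flow touching a known-bad IP or domain."""
        for r in records:
            if r["dst"] in KNOWN_BAD or r["host"] in KNOWN_BAD:
                yield f"ALERT: {r['src']} contacted {r['host']} ({r['dst']})"

    for alert in scan(traffic_log):
        print(alert)  # a real sensor would page the county's IT team
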
Foote says she’s never felt more prepared to protect her small slice of the vote next November. But she also recognizes there’s only so much she can do. The number of personnel in her office hasn’t grown since the mid-Nineties, and the local board of supervisors recently denied her request for an additional election staffer dedicated to cybersecurity. She worries about ransomware, a form of attack in which hackers infiltrate a network and lock users out of their computers or phones until they pay to regain access. A December 2019 confidential alert by the FBI said reports of an especially vicious type of ransomware attacking municipalities, called Ryuk, had recently spiked by 400 percent. Foote says she’s troubled about what would happen if a ransomware attack struck during an election and interfered with her ability to do her job.

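Defenders often catch ransomware by its tempo: encryption rewrites many files in a short burst. The sketch below illustrates that common rate-based heuristic; nothing in the article says Inyo County runs anything like it, and the threshold is invented.

    # Rate-based heuristic: alarm when file modifications spike abnormally.
    import time
    from collections import deque

    MODS_PER_MINUTE_LIMIT = 200  # arbitrary cutoff for this sketch
    recent_mods = deque()        # timestamps of observed file changes

    def record_modification(timestamp: float) -> bool:
        """Track one file-change event; True means the rate looks suspicious."""
        recent_mods.append(timestamp)
        while recent_mods and recent_mods[0] < timestamp - 60:
            recent_mods.popleft()  # keep only the last 60 seconds
        return len(recent_mods) > MODS_PER_MINUTE_LIMIT

    # Simulated burst: 300 changes within a few seconds trips the alarm.
    now = time.time()
    for i in range(300):
        if record_modification(now + i * 0.01):
            print("ALERT: abnormal file-change rate; possible ransomware")
            break
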
“I’m the person who’s supposed to be defending against these nation-state actors,” she says. “It’s not that we’re not up to the task. But there are certain things we are unable to defend against. When someone has unlimited resources, they have unlimited power to try to find vulnerabilities in the system.”


ON THE FRONT LINES: Since 2016, Inyo County Clerk Kammi Foote has been on guard. “It’s not that we’re not up to the task. But there are certain things we are unable to defend against. When someone has unlimited resources, they have unlimited power to try to find vulnerabilities in the system.”