run by Jennifer O’Malley Dillon, an
alumna of the Obama and Clinton cam-
paigns. Before the operation could get
off the ground, Dillon left to lead Beto
O’Rourke’s ill-fated Presidential bid.
Since then, the Democratic Data Ex-
change, which is now run by a Demo-
cratic operative named Lindsey Schuh
Cortes, has gone unmentioned in the
press. An official familiar with the ex-
change told me, “We are staying small
and quiet for now, by design. We’re not
playing in the primaries, but the goal is
to be up and running in time for the
general. We hope that all the Democrats
who are no longer in the race will hand
over their data at that time, but partici-
pation will be voluntary.” If Bloomberg,
a multibillionaire, loses the nomination
to Bernie Sanders, who intends to sharply
raise taxes on billionaires, it’s possible
that Bloomberg would transfer the data
his campaign acquired to the Demo-
cratic Data Exchange, in the common
interest of defeating Donald Trump. It’s
also possible that he would refuse.
At Facebook.com/Business, there are
dozens of “success stories”: case
studies showing how Facebook ads helped
a menstrual-underwear company by
“broadening brand awareness,” or how
an artisanal-jewelry company sold brace-
lets on Instagram. The case studies use
internal Facebook data to demonstrate
an ad campaign’s success through quan-
titative metrics. One details how the 2014
reëlection campaign of Rick Scott, the
Republican governor of Florida, used
Spanish-language ads on Facebook to
target Latino soccer fans (“Buena suerte,
Team USA!”). Andrew Abdel-Malik,
the R.N.C.’s state director of digital strat-
egy, is quoted in the case study: “Face-
book Ads provided us with unique tar-
geting capabilities . . . to reach different
sub-groups of Hispanic voters in ways
that were simply not feasible on TV and
radio.” Scott was reëlected by a single
percentage point. Four years later, he was
elected to the U.S. Senate in a race so
close that it triggered a recount.
The case studies can be filtered by in-
dustry, using an alphabetical drop-down
menu. In 2018, the journalist Sam Bid-
dle, writing for the Intercept, noticed
that the Rick Scott case study had been
buried, and that the “Government and
Politics” category had been quietly re-
moved from the menu. (“Gaming” is now
followed by “Health and Pharmaceuti-
cals.”) This did not mean that Facebook
had stopped selling ads to political cam-
paigns, just that the downside of draw-
ing attention to the fact had started to
outweigh the upside. (Facebook has since
launched a site devoted to government
and politics, with no success stories.) A
former Facebook employee told me that,
after the 2016 election, there was some
internal chatter about drafting a case
study that would demonstrate, in great
detail, how Facebook had been a deci-
sive factor in Trump’s victory. “It would
have been one of the most extensive and
convincing ones on the whole site,” the
person told me. “The evidence was over-
whelming. But, given the mood at the
time, there was no way they were going
to put that out there.”
In May, 2018, hoping to address con-
cerns about dark posts and other con-
troversial practices, Facebook built the
Ad Library, which started to archive all
political and issue-oriented ads that ran
on the platform from that point on. “In
meetings, if you bring up problems like
misinformation, you’ll hear, ‘Well, we
have the Ad Library now,’ ” the con-
cerned Facebook employee told me.
“The argument is, ‘If we put all the in-
formation out there, then people will
find it and become better informed’—
even though it’s clear that that’s not ac-
tually happening.”
In March, 2018, three M.I.T. com-
puter scientists published a paper in Sci-
ence comparing the dissemination of false
rumors on Twitter to the dissemination
of actual news articles. They found that
the fake stories spread faster, in part be-
cause they were more likely to provoke
an immediate emotional response in users.
The same phenomenon appears to hold
true for other social-media platforms
and to apply to misinformation as well
as fearmongering, rage bait, and racist
propaganda, all of which go viral more
readily than calm, patient deliberation.
“Stuff that has a more alarmist and hy-
perbolic tone, or that makes people afraid
or upset, is just going to travel better,”
the Facebook employee told me. “That
fits with human nature, and it’s how the
platform is designed.” Without funda-
mentally altering Facebook’s News Feed
algorithm, or the company’s underlying
business model, this is unlikely to change.
This past fall, the Trump campaign
ran a Facebook ad premised on the in-
cendiary but false notion that the villain
of the Ukraine corruption scandal was
not Trump but Joe Biden. (Parscale re-
peated such claims several times on Twit-
ter, adding, “The swamp! They’re play-
ing us and the media is their lap dog!”)
The ad, predictably, went viral. Biden’s
campaign wrote a letter to Facebook,
asking the company to take it down.
Facebook’s head of global-elections pol-
icy, a former Rudolph Giuliani campaign
official named Katie Harbath, explained
that the ad would stay up because the