In 2018, data consultant and Cambridge Analytica
employee Christopher Wylie blew the whistle on the
company. This set off a chain of events that would land
Facebook in the hot seat and Mark Zuckerberg in front
of the Senate Commerce and Judiciary Committees.
Giving this the best possible spin, it’s a newer, better
version of what President Obama’s campaign did:
leveraging clever social-media techniques and new
technology to build a smoother, more effective,
occasionally underhanded but not outright illegal or
immoral political-advertising industry, which everyone
would be using soon.
A darker interpretation: It’s “weaponized data,” as the
whistleblowers have called it: psyops that use
information-warfare techniques borrowed from
institutions like the Department of Defense to leverage
our information against us, corrupting our democratic
process to the point that we can’t even tell if we’re
voting for (or against) something because we believe it
or because a data-fueled AI knew just what
psychological lever to push. Even applied to
advertisements, this is scary. Did I buy a particular
product because its manufacturer knew just how and
when to make me want it? Which decisions that we
make are our own?
“You might say ‘Well, what happened before the last
election—that was pretty darn malicious,’” says Vasant
Dhar, a professor of data science at the NYU Stern
School of Business. “Some people might say, ‘I don’t
know—that wasn’t that malicious, there’s nothing
wrong with using social media for influence, and
besides, there’s no smoking gun, there’s no proof that
it actually did anything.’ And that’s a reasonable
position too.”