April 2019, ScientificAmerican.com

THE INTERSECTION
WHERE SCIENCE AND SOCIETY MEET

Illustration by Thomas Pitilli

YouTube Has a Video for That

But the site’s recommendation algorithms have a dark side
By Zeynep Tufekci

It was 3 a.m., and the smoke alarm wouldn’t stop beeping. There was no fire, so I didn’t need to panic. I just had to figure out a way to quiet the darn thing and tamp down my ire. I had taken out the battery and pushed and twisted all the buttons to no avail. Luckily for me, the possible solutions were all laid out in the YouTube tutorial I found. The video helpfully walked me through my options, demonstrating each step. And the fact that it had hundreds of thousands of views reassured me that this might work.
YouTube has become the place to learn how to do anything, from assembling an Ikea cabinet to making a Bluetooth connection with your earbuds. It is a font of tutorials, some very good, some meandering, some made by individuals who have become professionals at it and rake in serious sums through advertising. But many are uploaded by people who have solved something that frustrated them and want to share the answer with the world.
The native language of the digital world is probably video, not text, a trend missed by the literate classes that dominated the public dialogue in the predigital era. I’ve noticed that many young people start their Web searches on YouTube. Besides, Google, which owns YouTube, highlights videos in its search results.
“How do I” assemble that table, improve my stroke, decide if I’m a feminist, choose vaccinations, highlight my cheeks, tie my shoelaces, research whether climate change is real...? Someone on YouTube has an answer. But the site has also been targeted by extremists, conspiracy theorists and reactionaries who understand its role as a gateway to information, especially for younger generations.
And therein lies the dark side: YouTube makes money by keeping users on the site and showing them targeted ads. To keep them watching, it uses a recommendation system powered by top-of-the-line artificial intelligence (it’s Google, after all). Indeed, after Google Brain, the company’s AI division, took over YouTube’s recommendations in 2015, there were laudatory articles on how it had significantly increased “engagement”: Silicon Valley–speak for enticing you to stay on the site longer.
These “recommended” videos play one after the other. Maybe you finished a tutorial on how to sharpen knives, but the next one may well be about why feminists are ruining manhood, how vaccinations are poisonous or why climate change is a hoax, or a nifty explainer “proving” the Titanic never hit an iceberg.
YouTube’s algorithms will push whatever they deem engaging, and it appears they have figured out that wild claims, as well as hate speech and outrage peddling, can be particularly so.
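
To see that structural incentive in miniature, consider a toy sketch, written here in Python with entirely invented names and numbers, of what ranking by engagement means. It illustrates the incentive, not YouTube’s actual system, which is vastly more complex.

    # A toy engagement-greedy recommender. All names and data are
    # hypothetical. Note the objective: nothing here asks whether
    # a video is true, hateful or harmful.

    def predicted_watch_seconds(user, video):
        # Stand-in for a learned model estimating how long this
        # user would keep watching this video.
        return user["affinity"].get(video["topic"], 0.0) * video["runtime"]

    def recommend(user, candidates, k=3):
        # Rank purely by predicted engagement; take the top k.
        return sorted(candidates,
                      key=lambda v: predicted_watch_seconds(user, v),
                      reverse=True)[:k]

    user = {"affinity": {"diy": 0.4, "outrage": 0.9}}
    candidates = [
        {"title": "How to sharpen knives", "topic": "diy", "runtime": 300},
        {"title": "The hoax they won't tell you about", "topic": "outrage", "runtime": 900},
    ]
    print([v["title"] for v in recommend(user, candidates)])
    # The outrage video wins: 0.9 * 900 beats 0.4 * 300.

If predicted watch time is all the ranker optimizes, a conspiracy video that holds attention outranks a useful tutorial every time.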
Receiving recommendations for noxious material has become such a common experience that there has been some loud pushback. Google did ban a few of the indefensibly offensive high-profile “creators” (though not before helping them expose their views to millions of people), and recently the company announced an initiative to reduce recommending “borderline content and content that could misinform users in harmful ways.” According to Google, this content might be things like “a phony miracle cure for a serious illness” or claims that “the earth is flat.” The change, the company says, will affect fewer than 1 percent of all videos.
While it’s good to see some response from Google, the problem is deep and structural. The business model incentivizes whatever gets watched most. YouTube’s reach is vast. Google’s cheap and nifty Chromebooks make up more than half the computers in the K–12 market in the U.S., and they usually come preloaded with YouTube. Many parents and educators probably don’t realize how much their children and students use it.
We can’t scream at kids to get off our lawn or ignore the fact that children use YouTube for a reason: there’s stuff there they want to watch, just as I really needed to figure out how to unplug that beeping catastrophe at 3 a.m. We need to adjust to this reality with regulation, self-regulation and education. People can’t see how recommendations work, or how they’re designed to keep eyes hooked to the screen. We could ask that Chromebooks in schools come with no YouTube, or at least no recommendations.
This is just the tip of the iceberg of the dangerous nexus of profit, global scale and AI. It’s a new era, with challenges as real as that iceberg the Titanic did hit, no matter what the video claims.


Zeynep Tufekci is an associate professor at the University of North Carolina School of Information and Library Science and a regular contributor to the New York Times. Her book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, was published by Yale University Press in 2017.