NATION
THE VIEW ESSAY
TIME, April 11/April 18, 2022

How Big Tech weaponizes shame
BY CATHY O'NEIL
ILLUSTRATIONS BY GIULIA NERI FOR TIME

SHAME IS A VISCERAL, INSTINCTUAL RESPONSE. We react violently to shaming by others, either by feeling shame or by feeling outraged at the attempt. This human hard wiring, which historically salvaged our reputations and preserved our lives, is being hijacked and perverted by the big tech companies for profit. In the process, we are needlessly pitted against one another. It doesn't have to be like this. What I've learned, in part from very personal experience, is that shame comes in a number of forms, and the better we understand it, the better we can fight back.

Whereas shame is primarily a useful social mechanism that coerces its target into conforming with a shared norm, the kind of shaming that often goes viral on social media is a punching-down type of shame where the target cannot choose to conform even if they tried. That obese woman who fell over in her scooter at Walmart? Viral. That overdose victim? Shamed.

Shame's secondary goal is arguably more effective on social media, namely to broadcast the norm for everyone to see. When we see yet another phone video of an outrageous public "Karen" situation, it can conceivably be seen as a learning situation for everyone else.

But what exactly are we learning? The ensuing viral shame is swift and overly simplistic, often leaving little context or right to due process. When we do hear further from the target, the shame tends to have backfired, leaving the alleged Karen defiant, finding community with equally defiant others. Finally, the underlying societal problem exposed by a Karen episode is left unaddressed: that white women hold outsize power over others, especially Black men, because of a historical bias in policing.

As poorly as shame plays out, it is exactly how the big tech companies have designed it. I should know: I used to work as a data scientist in the world of online ads. I would decide who deserved an opportunity and who did not, based on who had spent money in the past and who hadn't.

Most online algorithms quantify and profile you, putting a number on how much you're worth, whether it's to sell you a luxury item or to prey upon you if they deem you vulnerable to gambling, predatory loans, or cryptocurrencies. In turn, the advertisers who find you figure out your weaknesses and deftly exploit them. When I realized I was helping build a terrible system, I got out.

FOR SOCIAL MEDIA, the data scientists are interested in only one thing: sustained attention. That's why online we are made to feel so very comfortable, surrounded by like-minded friends, perhaps thousands of them. It's big enough to feel like we're "in society," but of course it's actually quite small, a minute corner of the world. The ways we disagree with others outside our group are filtered straight to us, via algorithms, and the ways we agree with one another are likewise filtered away from us, making them essentially invisible.

That automated boosting of shame-based outrage triggers us, and we get habituated to performing acts of virtue signaling. We jump on the shame train to get our tiny dopamine boosts for being outraged and for our righteousness. That we get accolades from our inner circle only serves to convince us once again that we're in the right and that people outside our circles are living in sick cults. This turns what should be a socially cohesive act into a mere performance, as we get stuck for hours on the platforms, tearing each other down for the sake of increasing the profits of Big Tech.

What's particularly tragic about all of this is that the shame doesn't work at all; it is inherently misdirected. For shame to work, in the sense of persuading someone to behave, we first need to share norms and even a sense of trust, and second, the target of the shame needs to have the expectation that their better behavior will be noticed. Those preconditions are rarely met online.

We have had differences of opinion for a long time; that's nothing new. By pitting us against one another in these endless shame spirals, Big Tech has successfully prevented us from building solidarity. The first step is for us to critically observe their manipulations and call them what they are: shame machines.

O'Neil is the author of The Shame Machine: Who Profits in the New Age of Humiliation