Time USA (2022-02-28)





[Photo caption: Sama content moderators started working at this office in Nairobi in 2019]

IN A DRAB OFFICE building near a slum on the outskirts of Nairobi, nearly 200 young men and women from countries across Africa sit glued to computer monitors, where they must watch videos of murders, rapes, suicides, and child sexual abuse. These young Africans work for Sama, which calls itself an "ethical AI" outsourcing company and is headquartered in California.
Sama says its mission is to provide people in places like Nairobi with "dignified digital work." Its executives can often be heard saying that the best way to help poor countries is to "give work, not aid." Sama claims to have helped lift more than 50,000 people in the developing world out of poverty.
This benevolent public image has won Sama data-labeling contracts with some of the largest companies in the world, including Google, Microsoft, and Walmart. What the company doesn't make public on its website is its relationship with its client Facebook.
In Nairobi, Sama employees, who speak at least 11 African languages among them, toil day and night as Facebook content moderators: the emergency first responders of social media. They perform the brutal task of viewing and removing illegal or banned content from Facebook before it is seen by the average user.
Since 2019, this Nairobi office block has been the epicenter of Facebook's content-moderation operation for the whole of sub-Saharan Africa. Its remit includes Ethiopia, where Facebook is trying to prevent content on its platform from contributing to incitement to violence in an escalating civil war.
Despite their importance to Facebook, the workers in this Nairobi office are among the lowest-paid workers for the platform anywhere in the world, with some of them taking home as little as $1.50 per hour, a TIME investigation found. The testimonies of Sama employees reveal a workplace culture characterized by mental trauma, intimidation, and alleged suppression of the right to unionize. The revelations raise serious questions about whether Facebook—which periodically sends its own employees to Nairobi to monitor Sama's operations—is exploiting the very people upon whom it is depending to ensure its platform is safe in Ethiopia and across the continent. And just as Facebook needs them most, content moderators at Sama are leaving the company in droves because of poor pay and working conditions, with six Ethiopian employees resigning in a single week in January.
This story is based on interviews with more than a dozen current and former Sama employees and hundreds of pages of documents including company emails, paychecks, and contracts. Most sources spoke on condition of anonymity for fear of legal consequences if they disclosed the nature of their work or Facebook's involvement. The Signals Network, a whistle-blower protection NGO, provided psychological and legal support for some sources quoted in this story.
"The work that we do is a kind of mental torture," one employee, who currently works as a Facebook content moderator for Sama, tells TIME. "Whatever I am living on is hand-to-mouth. I can't save a cent. Sometimes I feel I want to resign. But then I ask myself: What will my baby eat?"
TIME is aware of at least two Sama content moderators who chose to resign after being diagnosed with mental illnesses including post-traumatic stress disorder (PTSD), anxiety, and depression. Many others described continuing with work despite trauma because they had no other options. While Sama employs wellness counselors to provide workers with on-site care in Nairobi, most of the content moderators TIME spoke to said they generally distrust the counselors. One former counselor says