The Washington Post - USA (2021-10-25)



Facebook under fire


How the firm left its largest global market vulnerable to hate speech



A member of the All India Students Federation teaches protesting farmers about social media in Ghaziabad, India, earlier this year. (Pradeep Gaur/SOPA Images/LightRocket/Getty Images)

used for fear of retribution, recalled huddling in his room one evening as groups of men marched outside chanting death to Kashmiris. His phone buzzed with news of students from Kashmir being beaten in the streets — along with more violent Facebook messages.

“Hate spreads like wildfire on Facebook,” Junaid said. “None of the hate speech accounts were blocked.”
For all of Facebook’s troubles in North America, its problems with hate speech and disinformation are dramatically worse in the developing world. Internal company documents made public Saturday reveal that Facebook has meticulously studied its approach abroad — and was well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes.
This story is based on those documents, known as the Facebook Papers, which were disclosed to the Securities and Exchange Commission by whistleblower Frances Haugen and composed of research, slide decks and posts on the company message board — some previously reported by the Wall Street Journal. It is also based on documents independently reviewed by The Post, as well as more than a dozen interviews with former Facebook employees and industry experts with knowledge of the company’s practices abroad.
The SEC disclosures, provided to Congress in redacted form by Haugen’s legal counsel and reviewed by a consortium of news organizations including The Post, suggest that as Facebook pushed into the developing world, it didn’t invest in comparable protections.
According to one 2020 summary, although the United States comprises less than 10 percent of Facebook’s daily users, the company’s budget to fight misinformation was heavily weighted toward America, where 84 percent of its “global remit/language coverage” was allocated. Just 16 percent was earmarked for the “Rest of World,” a cross-continent grouping that included India, France and Italy.
Facebook spokesperson Dani Lever said the company had made “progress” and had “dedicated teams working to stop abuse on our platform in countries where there is heightened risk of conflict and violence. We also have global teams with native speakers reviewing content in over 70 languages along with experts in humanitarian and human rights issues.”
Many of these additions had come in the past two years. “We’ve hired more people with language, country and topic expertise. We’ve also increased the number of team members with work experience in Myanmar and Ethiopia to include former humanitarian aid workers, crisis responders, and policy specialists,” Lever said.
Meanwhile, in India, Lever said, the “hypothetical test account inspired deeper, more rigorous analysis of our recommendation systems.”
Globally, there are more than 90 languages that have over 10 million speakers. In India alone, the government recognizes 122 languages, according to its 2001 census.
In India, where the Hindu-nationalist Bharatiya Janata Party — part of the coalition behind Prime Minister Narendra Modi’s political rise — deploys inflammatory rhetoric against the country’s Muslim minority, misinformation and hate speech can translate into real-life violence, making the stakes of these limited safety protocols particularly high. Researchers have documented the BJP using social media, including Facebook and WhatsApp, to run complex propaganda campaigns that scholars say play to existing social tensions with Muslims.
Members of the Next Billion Network, a collective of civil society actors working on technology-related harms in the global south, warned Facebook officials in the United States in meetings held between 2018 and 2019 that unchecked hate speech on the platform could trigger large-scale communal violence in India, according to three people with knowledge of the matter, who spoke on the condition of anonymity to describe sensitive matters.
Despite Facebook’s assurances that it would increase moderation efforts, when riots broke out in Delhi last year, calls to violence against Muslims remained on the site even after being flagged, according to the group. Gruesome images, falsely claiming to depict violence perpetrated by Muslims during the riots, were found by The Post. Facebook labeled them with a fact check, but they remained on the site as of Saturday.

More than 50 people were killed in the turmoil, the majority of them Muslims.
“They were told, told, told, and they didn’t do one damn thing about it,” said a member of the group who attended the meetings. “The anger [from the global south] is so visceral on how disposable they view our lives.”

Facebook said it removed content that praised, supported or represented violence during the riots in Delhi.
India is the world’s largest democracy and a growing economic powerhouse, making it more of a priority for Facebook than many other countries in the global south. Low-cost smartphones and cheap data plans have led to a telecom revolution, with millions of Indian users coming online for the first time every year. Facebook has made great efforts to capture these customers, and its signature app has 410 million users in the country, according to the Indian government — more than the entire population of the United States.
The company activated large teams to monitor the platform during major elections, dispatched representatives to engage with activists and civil society groups, and conducted research surveying Indian people, finding many were concerned about the quantity of misinformation on the platform, according to several documents.
But despite the extra attention, the Facebook that Indians interact with is missing many of the key guardrails the company deployed in the United States and other mostly English-speaking countries for years. One document stated that Facebook had not developed algorithms that could detect hate speech in Hindi and Bengali, despite their being the fourth- and seventh-most spoken languages in the world, respectively. Other documents showed how political actors spammed the social network with accounts spreading anti-Muslim messages across people’s news feeds in violation of Facebook’s rules. The company said it introduced hate-speech classifiers in Hindi in 2018 and Bengali in 2020; systems for detecting violence and incitement in Hindi and Bengali were added in 2021.
Pratik Sinha, co-founder of Alt News, a fact-checking site in India that routinely debunks viral fake and inflammatory posts, said that while misinformation and hate speech proliferate across social networks, Facebook sometimes doesn’t take down bad actors.
“Their investment in a country’s democracy is conditional,” Sinha said. “It is beneficial to care about it in the U.S. Banning Trump works for them there. They can’t even ban a small-time guy in India.”

‘At-risk countries’
Facebook’s mission statement is to “bring the world closer together,” and for years, voracious expansion into markets beyond the United States has fueled its growth and profits.
Social networks that let citizens connect and organize became a route around governments that had controlled and censored centralized systems like TV and radio. Facebook was celebrated for its role in helping activists organize protests against authoritarian governments in the Middle East during the Arab Spring.
For millions of people in Asia, Africa and South America, Facebook became the primary way they experience the Internet. Facebook partnered with local telecom operators in countries such as Myanmar, Ghana and Mexico to give free access to its app, along with a bundle of other basic services like job listings and weather reports. The program, called “Free Basics,” helped millions get online for the first time, cementing Facebook’s role as a communication platform around the world and locking millions of users into a version of the Internet controlled by a single company. (While India was one of the first countries to get Free Basics in 2015, backlash from activists who argued that the program unfairly benefited Facebook led to its shutdown.)
In late 2019, the Next Billion Network ran a multicountry study of Facebook’s moderation, separate from the whistleblower’s documents, and alerted the company that large volumes of legitimate complaints, including reports of death threats, were being dismissed because of technical issues in countries throughout the global south, including Pakistan, Myanmar and India, according to a copy of the report reviewed by The Post.
It found that cumbersome reporting flows and a lack of translations were discouraging users from reporting bad content, the only way content is moderated in many of the countries that lack more automated systems. Facebook’s community standards, the set of rules that users must abide by, were not translated into Urdu, the national language of Pakistan. Instead, the company flipped the English version so it read from right to left, mirroring the way Urdu is read.
In June 2020, a Facebook employee posted an audit of the company’s attempts to make its platform safer for users in “at-risk countries,” a designation given to nations Facebook marks as especially vulnerable to misinformation and hate speech. The audit showed Facebook had massive gaps in coverage. In countries including Myanmar, Pakistan and Ethiopia, Facebook didn’t have algorithms that could parse the local language and identify posts about covid-19. In India and Indonesia, it couldn’t identify links to misinformation, the audit showed.
In Ethiopia, the audit came a month after the government postponed federal elections, a major step in the buildup to a civil war that broke out months later. Beyond being unable to detect misinformation there, the audit found, Facebook also didn’t have algorithms to flag hate speech in the country’s two biggest local languages.
After negative coverage, Facebook has made dramatic investments. For example, after a searing United Nations report connected Facebook to an alleged genocide against the Rohingya Muslim minority in Myanmar, the region became a priority for the company, which began flooding it with resources in 2018, according to interviews with two former Facebook employees with knowledge of the matter, who, like others, spoke on the condition of anonymity to describe sensitive matters.
Facebook took several steps to tighten security and remove viral hate speech and misinformation in the region, according to multiple documents. One note, from 2019, showed that Facebook expanded its list of derogatory terms in the local language and was able to catch and demote thousands of slurs. Ahead of Myanmar’s 2020 elections, Facebook launched an intervention that promoted posts from users’ friends and family and reduced viral misinformation, employees found.
A former employee said that it was easy to work on the company’s programs in Myanmar, but there was less incentive to work on problematic issues in lower-profile countries, meaning many of the interventions deployed in Myanmar were not used in other places.

“Why just Myanmar? That was the real tragedy,” the former employee said.

‘Pigs’ and fearmongering
In India, internal documents suggest Facebook was aware of the number of political messages on its platforms. One internal post from March shows a Facebook employee believed a BJP worker was breaking the site’s rules to post inflammatory content and spam political posts. The researcher detailed how the worker used multiple accounts to post thousands of “politically-sensitive” messages on Facebook and WhatsApp during the run-up to the elections in the state of West Bengal. The efforts broke Facebook’s rules against “coordinated inauthentic behavior,” the employee wrote. Facebook denied that the operation constituted coordinated activity but said it took action.
A case study about harmful networks in India shows that pages and groups of the Rashtriya Swayamsevak Sangh, an influential Hindu-nationalist group associated with the BJP, promoted fearmongering anti-Muslim narratives with violent intent. A number of posts compared Muslims to “pigs” and cited misinformation claiming the Koran calls for men to rape female family members.
The group had not been flagged, according to the document, given what employees called “political sensitivities.” In a slide deck in the same document, Facebook employees said the posts also hadn’t been found because the company didn’t have algorithms that could detect hate speech in Hindi and Bengali.
Facebook in India has been repeatedly criticized for the lack of a firewall between politicians and the company. One deck on political influence on content policy from December 2020 acknowledged the company “routinely makes exceptions for powerful actors when enforcing content policy,” citing India as an example.
“The problem which arises is that the incentives are aligned to a certain degree,” said Apar Gupta, executive director of the Internet Freedom Foundation, a digital advocacy group in India. “The government wants to maintain a level of political control over online discourse, and social media platforms want to profit from a very large, sizable and growing market in India.”
Facebook says that its global policy teams operate independently and that no single team’s opinion carries more weight than another’s.
Earlier this year, India enacted strict new rules for social media firms, increasing government powers by requiring them to remove any content deemed unlawful within 36 hours of being notified. The rules have sparked fresh concerns about government censorship of U.S.-based social media networks. They require companies to have an Indian resident on staff to coordinate with local law enforcement agencies. The companies are also required to have a process through which people can directly share complaints with the social media networks.
But Junaid, the Kashmiri college student, said Facebook had done little to remove the hate-speech posts against Kashmiris. He went home to his family after his school asked Kashmiri students to leave for their own safety. When he returned to campus 45 days after the 2019 bombing, the Facebook post from a fellow student calling for Kashmiris to be shot was still on that student’s account.

Regine Cabato in Manila contributed to this report.
