Apple Magazine - USA (2019-09-20)

It’s also expanding its definition of terrorism to include not just acts of violence intended to achieve a political or ideological aim, but also attempts at violence, especially when aimed at civilians with the intent to coerce and intimidate.


Facebook has been working to limit the spread of extremist material on its service, so far with mixed success. In March, it expanded its definition of prohibited content to include U.S. white nationalist and white separatist material as well as that from international terrorist groups. It says it has banned 200 white supremacist organizations and removed 26 million pieces of content related to global terrorist groups like ISIS and al Qaeda.


Extremist videos are just one item in a long list of troubles Facebook faces. It was fined $5 billion by U.S. regulators over its privacy practices. A group of state attorneys general has launched its own antitrust investigation into Facebook. And it is also part of broader investigations into “big tech” by Congress and the U.S. Justice Department.


More regulation might be needed to deal with the problem of extremist material, said Dipayan Ghosh, a former Facebook employee and White House tech policy adviser who is currently a Harvard fellow.


“Content takedowns will always be highly contentious because of the platforms’ core business model to maximize engagement,” he said. “And if the companies become too aggressive in their takedowns, then the other side — including propagators of hate speech — will cry out.”
