2019-08-31 New Scientist International Edition


Internet | Data privacy


Chris Stokel-Walker


SOCIAL media platforms struggling to tackle the tide of misinformation and unsuitable content could cut that flow by a fifth simply by better explaining their rules. That is the finding of a large-scale study of 32 million posts on popular discussion site Reddit.
On Reddit, volunteer moderators clear forums, called subreddits, of unsuitable or off-topic material. Moderators take different approaches, however. Some explain why they have removed the content, but about 99 per cent simply take it down without explanation. Shagun Jhaver at the Georgia Institute of Technology and his colleagues have found that those who have had their posts removed – with or without explanation – are less likely to continue posting. But those who aren’t provided with a reason why their content was taken down have a higher likelihood of further posts being removed than those who are given an explanation.
Another study by the team found that 37 per cent of Reddit users surveyed didn’t understand why their post was removed, and 29 per cent felt frustrated that it had been. The group calculated that if all post removals were accompanied by an explanation, the odds of future removals would drop by 20.8 per cent. The team will present the work at the Conference on Computer-Supported Cooperative Work and Social Computing in Texas in November.
While this approach works in small communities, it may be hard to scale across a larger site such as YouTube, which has been criticised for its opaque rules on what is acceptable content.
“Moderators have to be careful about how they articulate their policies,” says Kat Lo at the University of California, Irvine. “It has to be able to move between many different types of context.” ❚


Revealing why posts are moderated helps us comply with rules


MILLIONS of gay people living in countries where homosexuality is outlawed could be put at risk by Facebook’s advertising practices. This is because the firm allows advertisers to target people on the basis of their interests, including sexual ones.
Ángel Cuevas Rumín at Charles III University of Madrid, Spain, and his colleagues analysed the list of options available for targeting adverts on Facebook. They found that about 2000 of the options would be classed as “sensitive” information under Europe’s recent GDPR data protection law. These include a person’s politics, race or sexuality. Some two-thirds of Facebook users in the 197 countries and states the team looked at were tagged with at least one such preference, accounting for a fifth of the overall population.

In Saudi Arabia, where homosexuality can be punished with death, the team found in February that 540,000 people were labelled as having an interest in homosexuality. When the team revisited that figure in August, it had risen to 940,000 people. Overall, Cuevas’s team found that more than 4.2 million people tagged as interested in homosexuality live in countries where homosexuality is illegal. These people could be targeted using Facebook’s ad tools (arxiv.org/abs/1907.10672).
While there is no suggestion that anyone has been identified or killed as a result of this practice, such information could be used to identify people and collect information on them. For example, an advert directed at a particular group could offer a prize to people if they enter their personal details.
Facebook says that just because someone shows an interest in something doesn’t mean they have that attribute. You could like a page about gay men, for example, without being a gay man yourself. However, there is likely to be overlap between the two groups.
“The interest targeting options we allow in ads reflect people’s interest in topics, not personal attributes,” Facebook told New Scientist. “People can’t discriminate by excluding interests such as homosexuality when they build an ad.” The firm says it recently removed more than 5000 targeting options.
Collecting such data is a legal grey area. In Europe, there are stronger legal protections for sensitive data than there are for other types of personal data. However, data protection experts are torn over whether Facebook is breaking any laws.
“Facebook is in the wrong for sure, as far as EU data protection law is concerned,” says Ed Boal at Stephenson Law in Bristol, UK. Sandra Wachter at the Oxford Internet Institute, UK, isn’t so sure. “If the argument being made is nobody is inferring sexual orientation but assuming an interest in sexual orientation, that brings us to an unclear legal perspective,” she says. “We need to broaden data protection in a more sensible and holistic way.” ❚

Facebook’s data collection may put gay people at risk


If you “Like” a Facebook page, the data is used to record your interests (Image: Hocus-Focus/Getty)

940,000 people in Saudi Arabia are labelled as interested in homosexuality