The Observer - 04.08.2019

Artificial intelligence

been doing nothing especially
objectionable.) More seriously, the
city authority in San Francisco has
banned the use of facial-recognition
technologies by the police and
other government agencies; and
the House of Commons science and
technology committee has called
for British police to stop using it
as well, until regulation is in place,
though the then home secretary
(now chancellor) Sajid Javid said he
was in favour of trials continuing.
There is a growing demand for the
technology in shops, with dozens
of companies selling retail facial-
recognition software – perhaps
because it has become pointless
to report shoplifting to the police.
Budgets for policing in England
have been cut in real terms by about
20% since 2010, and a change in the
law in 2014, whereby shoplifting of
goods below a value of £200 was
made a summary offence (ie less
serious, not to be tried by a jury),
meant police directed time and
resources away from shoplifting.
The number of people being arrested
and charged has fallen dramatically,
with less than 10% of shoplifting
now reported. The British Retail
Consortium trade group estimates
that £700m is lost annually to theft.
Retailers are looking for other
methods. The rapid improvement
in AI technologies, and the dramatic
fall in cost, mean that it is now viable
as one of those other methods.

“The systems are
getting better
year on year,”
says Josh Davis,
a psychologist
at the University
of Greenwich who works on facial
recognition in humans and AIs. The
US National Institute of Standards
and Technology assesses the state
of facial recognition every year,
he says, and the ability of the best
algorithms to match a new image
to a face in a database improved
20-fold between 2014 and 2018.
And, analogously with Moore’s law
– the observation that computer
processing power doubles roughly
every two years – the cost falls
annually as well.
In ideal environments such
as airport check-ins, where the
face is straight on and well lit
and the camera is high-quality,
AI face recognition is now better
than human, and has been since
at least 2014. In the wild – with
the camera looking down, often
poorly lit and lower-definition – it’s
far less effective, says Prof Maja
Pantic, an AI researcher at Imperial
College London. “It’s far from the
99.9% you get with mugshots,” she
says. “But it is good, and moving
relatively fast forward.”
Each algorithm is different, but
fundamentally, they work the same
way. They are given large numbers of
images of people and are told which
ones are the same people; they then
analyse those images to pick out the
features that identify them. Those
features are not things like “size of
ear” or “length of nose”, says Pantic,

but something like textures: the
algorithm assesses faces by gradients
of light and dark, which allow it to
detect points on the face and build
a 3D image. “If you grow a beard or
gain a lot of weight,” she says, “very
often a passport control machine
cannot recognise you, because a
large part of the texture is different.”
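The gradient-of-light-and-dark idea Pantic describes is, in spirit, what classic texture descriptors such as histograms of oriented gradients (HOG) compute. The sketch below is an illustrative simplification in Python using only NumPy, not the proprietary algorithms discussed in this article, and the "image" is a synthetic square rather than a real face:

```python
import numpy as np

def gradient_orientation_histogram(image, bins=8):
    """Summarise an image by the distribution of its light/dark gradients.

    This captures the core idea behind HOG-style descriptors: rather
    than measuring "nose length", describe texture by how brightness
    changes across the image.
    """
    # Brightness change along each axis (finite differences).
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)          # edge strength at each pixel
    orientation = np.arctan2(gy, gx)      # edge direction at each pixel
    # Weight each pixel's orientation by how strong its edge is.
    hist, _ = np.histogram(orientation, bins=bins,
                           range=(-np.pi, np.pi), weights=magnitude)
    # Normalise so the signature is comparable across lighting levels.
    total = hist.sum()
    return hist / total if total > 0 else hist

# Synthetic "image": a bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
descriptor = gradient_orientation_histogram(img)
```

Two faces (or the same face photographed twice) can then be compared by the distance between their descriptors, which is why a beard or significant weight gain — a wholesale change in texture — can defeat a passport gate.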
But while the algorithms are
understood at this quite high level,
the specific things that they use to
identify people are not and cannot
be known in detail. It’s a black box:
the training data goes into the
algorithm, sloshes around a bit, and
produces very effective systems, but
the exact way it works is not clear
to the developer. “We don’t have
theoretical proofs of anything,” says
Pantic. The problem is that there
is so much data: you could go into
the system and disentangle what it
was doing if it had looked at a few
tens of photos, perhaps, or a few
hundred, but when it has looked
at millions, each containing large
amounts of data itself, it becomes
impossible. “The transparency is
not there,” she says.
Still, neither she nor Davis is
unduly worried about the rise of
facial recognition. “I don’t really see
what the big issue is,” Pantic says.
Police prosecutions at the moment
often rely on eyewitnesses, “who
say ‘sure, that’s him, that’s her’, but
it’s not”: at least facial recognition,
she says, can be more accurate.

She is concerned about other
invasions of privacy, of intrusions
by the government into our phones,
but, she says, facial recognition
represents a “fairly limited cost
of privacy” given the gains it can
provide, and given how much privacy
we’ve already given up by having
our phones on us all the time. “The
GPS knows exactly where you are,
what you’re eating, when you go to
the office, whether you stayed out,”
she says. “The faces are the cherry
on top of the pie, and we talk about
the cherry and forget about the pie.”
As with all algorithmic
assessment, there is reasonable
concern about bias. No algorithm is
better than its dataset, and – simply
put – there are more pictures of
white people on the internet than
there are of black people. “We have
less data on dark-skinned people,”
says Pantic. “Large databases of
Caucasian people, not so large on
Chinese and Indian, desperately bad
on people of African descent.” Davis
says there is an additional problem,
that darker skin reflects less light,
providing less information for the
algorithms to work with. For these
two reasons algorithms are more
likely to correctly identify white
people than black people. “That’s
problematic for stop and search,”
says Davis. Silkie Carlo, the director
of the not-for-profit civil liberties
organisation Big Brother Watch,
describes one situation where

an 18-year-old black man was
“swooped by four officers, put up
against a wall, fingerprinted, phone
taken, before police realised the face
recognition had got the wrong guy”.
That said, the Facewatch facial-
recognition system is, at least
on white men under the highly
controlled conditions of their office,
unnervingly good. Nick Fisher,
Facewatch’s CEO, showed me a
demo version; he walked through a
door and a wall-mounted camera in
front of him took a photo of his face;
immediately, an alert came up on his
phone (he’s in the system as an SOI,
or “subject of interest”, so he can
demonstrate it). I did the
same thing, and it recognised me as
a face, but no alert was sent and, he
said, the face data was immediately
deleted, because I was not an SOI.
Facewatch are keen to say that
they’re not a technology company
themselves – they’re a data
management company. They provide
management of the watch lists in
what they say is compliance with the
European General Data Protection
Regulation (GDPR). If someone is
seen shoplifting on camera or by a
staff member, their image can be
stored as an SOI; if they are then
seen in that shop again, the shop
manager will get an alert. GDPR
allows these watch lists to be shared
in a “proportionate” way; so if you’re
caught on camera like this once,
it can be shared with other local
Facewatch users. In London, says

‘How do you know if you’re on the watch list? You’re not guilty of anything, in the legal sense’


Clockwise from top: software from the
Chinese firm Megvii is demonstrated
in Beijing; testing of facial-recognition
technology in Essex earlier this year;
a protest in Seattle against Amazon’s
Rekognition software last October.
Alamy; New York Times/eyevine; AP

Hot spots: facial-recognition technology around the world


United Kingdom
In the UK, police are conducting
trials of the technology in
public areas in south Wales,
Leicestershire and London.
However, there are currently no
laws or government policies in
place to regulate its use. Police use
of facial recognition is currently
being challenged in the courts.

China
China has embraced facial
recognition, using it to implement
a national surveillance system
and bolstering its authoritarian
regime. The technology is already
pervasive in Chinese society, with
facial recognition used for airport
check-ins, cash withdrawals and
to monitor the attention of school
students. In the Xinjiang region,
facial recognition is increasingly
used to aid the oppression of
the Uighur Muslims, with the
state collecting their biometric
data, including face scans.

United States
One in two American citizens
is on a law-enforcement facial-
recognition database. Concerns
over lack of regulation and privacy
led to the city of San Francisco
banning the use of facial recognition
by the police in May this year,
with Somerville, Massachusetts,
following its lead. Dani Ellenby
