The Economist - USA (2019-08-17)

Powered by advances in artificial intelligence (AI), face-recognition systems are spreading like knotweed. Facebook, a social network, uses the technology to label people in uploaded photographs. Modern smartphones can be unlocked with it. Some banks employ it to verify transactions. Supermarkets watch for under-age drinkers. Advertising billboards assess consumers' reactions to their contents. America's Department of Homeland Security reckons face recognition will scrutinise 97% of outbound airline passengers by 2023. Networks of face-recognition cameras are part of the police state China has built in Xinjiang, in the country's far west. And a number of British police forces have tested the technology as a tool of mass surveillance in trials designed to spot criminals on the street.
A backlash, though, is brewing. The authorities in several American cities, including San Francisco and Oakland, have forbidden agencies such as the police from using the technology. In Britain, members of parliament have called, so far without success, for a ban on police tests. Refuseniks can also take matters into their own hands by trying to hide their faces from the cameras or, as has happened recently during protests in Hong Kong, by pointing hand-held lasers at CCTV cameras to dazzle them (see picture). Meanwhile, a small but growing group of privacy campaigners and academics is looking at ways to subvert the underlying technology directly.

Put your best face forward
Face recognition relies on machine learning, a subfield of AI in which computers teach themselves to do tasks that their programmers are unable to explain to them explicitly. First, a system is trained on thousands of examples of human faces. By rewarding it when it correctly identifies a face, and penalising it when it does not, it can be taught to distinguish images that contain faces from those that do not. Once it has an idea what a face looks like, the system can then begin to distinguish one face from another. The specifics vary, depending on the algorithm, but usually involve a mathematical representation of a number of crucial anatomical points, such as the location of the nose relative to other facial features, or the distance between the eyes.
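The matching step described above can be sketched in a few lines of Python. The idea is simply that each face is reduced to a vector of measurements, and two faces "match" when their vectors are close enough. The landmark values and threshold below are invented purely for illustration; real systems use far larger learned representations.

```python
import math

# Each face is reduced to a vector of measurements, e.g. normalised
# distances between anatomical landmarks (eyes, nose, mouth).
# These numbers are made up for the sake of the example.
known_face   = [0.42, 0.31, 0.58, 0.27]
probe_face_a = [0.41, 0.33, 0.57, 0.28]   # similar measurements
probe_face_b = [0.60, 0.12, 0.44, 0.39]   # quite different ones

def distance(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# A probe matches the known face if its measurements are close enough.
THRESHOLD = 0.1  # arbitrary cut-off for this toy example
print(distance(known_face, probe_face_a) < THRESHOLD)  # True: likely match
print(distance(known_face, probe_face_b) < THRESHOLD)  # False: different face
```

Anti-surveillance tricks such as unusual make-up work by shifting exactly these measurements, so that a probe vector lands far from where the system expects a face, or a known face, to be.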
In laboratory tests, such systems can be extremely accurate. One survey by NIST, an American standards-setting body, found that, between 2014 and 2018, the ability of face-recognition software to match an image of a known person with the image of that person held in a database improved from 96% to 99.8%. But because the machines have taught themselves, the visual systems they have come up with are bespoke. Computer vision, in other words, is nothing like the human sort. And that can provide plenty of chinks in an algorithm's armour.
In 2010, for instance, as part of a thesis for a master's degree at New York University, an American researcher and artist named Adam Harvey created "CV [computer vision] Dazzle", a style of make-up designed to fool face recognisers. It uses bright colours, high contrast, graded shading and asymmetric stylings to confound an algorithm's assumptions about what a face looks like. To a human being, the result

Science & technology

Fooling Big Brother

Face off

As face-recognition technology spreads, so do ideas for subverting it