or natural disasters. Technologists at the University of California, Berkeley, have already released an app called MyShake, which turns mobile phones into portable seismology tools that have proved capable of detecting even mild earthquakes (more than 200 at the time of writing). And, inevitably, they're now working on a prediction function.
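How can a phone tell an earthquake from a pocket jostle? Roughly, by watching for a sudden jump in shaking energy relative to the recent background. The sketch below is an illustrative Python version of the classic STA/LTA (short-term average over long-term average) trigger from seismology, with made-up numbers; it is not MyShake's actual code, which relies on a trained classifier.

```python
# Illustrative only: flag shaking when short-term accelerometer energy
# jumps well above the long-term background level.

def sta_lta_trigger(samples, short=25, long=500, threshold=4.0):
    """Return indices where the short-term average of |acceleration|
    exceeds `threshold` times the long-term average."""
    triggers = []
    for i in range(long, len(samples)):
        sta = sum(abs(s) for s in samples[i - short:i]) / short
        lta = sum(abs(s) for s in samples[i - long:i]) / long
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Hypothetical accelerometer trace: quiet background, then a burst of shaking.
trace = [0.01] * 1000 + [0.4] * 100
print(sta_lta_trigger(trace)[:1])  # [1003] -- a few samples into the burst
```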
Meanwhile, the concept of predictive policing actually predates the social media era by more than a decade. In the early '90s, systems researcher Andreas Olligschlaeger analysed two years' worth of 911 call data and produced predictions about which parts of a small area of Pittsburgh would see rises in crime. His mathematical model outperformed the standard estimates, which were produced by simple averaging.
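To see why a model can beat simple averaging, consider a grid cell where calls are trending upward: a flat average lags the trend, while anything that weights recent months more heavily catches it sooner. The toy Python below makes the contrast concrete; the data, weights and method are illustrative, not Olligschlaeger's actual model.

```python
# Illustrative only: a toy contrast between the "simple averaging" baseline
# and a recency-weighted forecast of per-cell 911 call counts.

monthly_calls = [14, 11, 19, 23, 28, 31]  # made-up call counts for one grid cell

def baseline_forecast(history):
    """The standard estimate: forecast next month as the plain average."""
    return sum(history) / len(history)

def weighted_forecast(history, decay=0.7):
    """A toy alternative: weight recent months more heavily, so an
    emerging hotspot pulls the forecast up faster than a flat average."""
    weights = [decay ** (len(history) - 1 - i) for i in range(len(history))]
    return sum(w * x for w, x in zip(weights, history)) / sum(weights)

print(baseline_forecast(monthly_calls))  # 21.0  -- lags behind the upward trend
print(weighted_forecast(monthly_calls))  # ~24.9 -- tracks the recent rise
```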
Today, over 90 locations worldwide,
including a number of US cities and areas of
Puerto Rico and South Africa, use a network
of embedded sensors called ShotSpotter to
detect, locate and log gunshots. Meanwhile,
in the desperate ganglands of Mexico,
human rights researchers are using predictive
technology to locate the hidden graves of some
of the estimated 30,000 victims of drug cartels.
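ShotSpotter's own algorithms are proprietary, but the principle behind locating a gunshot is acoustic multilateration: time-synchronised sensors hear the same bang at slightly different moments, and those arrival-time differences pin down the source. Here's an illustrative sketch under simplified assumptions: two dimensions, a constant speed of sound, a hypothetical sensor layout and a brute-force search.

```python
# Illustrative only: locate a sound source from differences in arrival time
# at several time-synchronised sensors (2D, constant speed of sound).
import math

SPEED_OF_SOUND = 343.0  # metres per second

sensors = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]  # toy layout
true_source = (130.0, 270.0)

# Simulated arrival times (seconds) at each sensor for the test shot.
arrivals = [math.dist(true_source, s) / SPEED_OF_SOUND for s in sensors]

def mismatch(x, y):
    """How badly a candidate point (x, y) explains the observed
    arrival-time differences, relative to the first sensor."""
    predicted = [math.dist((x, y), s) / SPEED_OF_SOUND for s in sensors]
    return sum(((predicted[i] - predicted[0]) - (arrivals[i] - arrivals[0])) ** 2
               for i in range(1, len(sensors)))

# Brute-force grid search: fine for a demo; real systems solve this analytically.
best = min(((x, y) for x in range(0, 401, 5) for y in range(0, 401, 5)),
           key=lambda p: mismatch(*p))
print(best)  # (130, 270) -- recovers the source to grid resolution
```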
But it need hardly be pointed out that
the construction and use of this new digital
brain by political organisations is not a
straightforwardly good thing. To get
a whiff of why, we need only glance eastwards
towards China, where these advanced
technologies are being used by the highly
savvy communist government. Every citizen
of China has a personal file of a depth and complexity that would shame even
Google. As reporters from The Wall Street
Journal have revealed, jaywalkers have their
faces scanned and, by the time they’ve
reached the other side of the road, their
image has appeared on a screen with the
words, ‘JAYWALKERS WILL BE
CAPTURED USING FACIAL-
RECOGNITION TECHNOLOGY’.
Similar cameras have been erected at Chinese
subway stations, airports, busy streets and,
perhaps most troublingly, churches – the
atheistic communists are no fans of the
religious. In 2015, agencies announced their desire for an “omnipresent, completely connected, always on and fully controllable” network of them. (On the upside,
customers at KFC in Beijing have food
recommendations made on the basis of
facial scans which look at metrics such
as age and gender.)
In the west, too, there are companies working on facial recognition technology that they claim can predict a huge amount about future behaviour on the basis of appearance alone. One of them is Israeli start-up Faception, which says that in trials its system successfully picked out nine of the 11 perpetrators of the 2015 Paris attacks.
Faception cofounder David Gavriel
explains that they started off working for
stores, tailoring digital ads for particular
kinds of customer. “You can tell who is
the quick buyer and who is hesitating, who
is a leader or influencer,” he says. “We’re
looking at facial structure and we do it
from a single image. One camera in a mall
can tell in less than a second what kind
of buyer it is.”
Today, the company claims to be able to pick out everyone from bingo players to white-collar criminals to paedophiles from the crowd.
Gavriel declines to be drawn on exactly how
they’re doing this, citing the need to keep
proprietary technology secret. But, he says,
“There are clusters of people. You’re born
with your tendencies. You are born with
your character. It’s in your DNA.”
Others say that such claims need to be treated with caution.
“I’m dubious,” Professor David Perrett, a facial recognition expert at Scotland’s University of St Andrews, tells GQ. While it’s true that certain personality traits are sometimes visible in the face, “the effect size is tiny,” says Perrett. “Accuracy is very low.”
Facial recognition in general, he adds, “is
difficult for all sorts of reasons – lighting
conditions, someone can wear a hat or
glasses or grow a beard. If people don’t want
to display who they are in a very obvious
way, it’s very difficult to make any accurate
judgement.” The same holds true of
behaviour. “It may be possible to spot
anxiety, for example, but that’s a long way
from being able to spot a terrorist.”
It may be true that making deep predictions about individual behaviour is a lot more complicated than analysing a single picture of a face, but there seems little doubt that our future will be clairvoyant. Some of it will be creepy and dangerous, some of it will be life-saving and incredible. You don’t need a crystal ball, though, to know that it’s coming.