New Scientist - USA (2019-10-12)

After all, a metal man and a
minibar are not so dissimilar.
At other times – crossing
from “coffee” to “poultry”, for
example – the division between
categories is sharp, leaving me
unsure how we moved from one
to another, and whose decision it
was. Was some algorithm making
an obscure connection between
hens and beans?
Well, no: the categories were
chosen and arranged by Paglen.
Only the choice of images within
each category was made by a
trained neural network.
This set me wondering
whether the ImageNet data set
wasn’t simply being used as a
foil for Paglen’s sense of mischief.
Why else would a cheerleader
dominate the “saboteur”
category? And do all “divorce
lawyers” really wear red ties?
This is a problem for art built
around artificial intelligence:
it can be hard to tell where the
algorithm ends and the artist
begins. Mind you, you could say
the same about the entire AI field.

“A lot of the ideology around AI,
and what people imagine it can do,
has to do with that simple word
‘intelligence’,” says Paglen, a US
artist now based in Berlin, whose
interest in computer vision
and surveillance culture sprung
from his academic career as a
geographer. “Intelligence is the
wrong metaphor for what we’ve
built, but it’s one we’ve inherited
from the 1960s.”
Paglen fears the way the word intelligence implies some kind of superhuman agency and infallibility to what are in essence giant statistical engines. "This is terribly dangerous," he says, "and also very convenient for people trying to raise money to build all sorts of shoddy, ill-advised applications with it."

"The group labelled 'bottom feeder' consists entirely of headshots, there isn't one aquatic creature in evidence"

Don't miss

Visit
Nam June Paik predicted the internet in baffling, alluring art. More than 200 of his works, from robot sculptures to giant installations, are on show at this eponymous exhibition at London's Tate Modern from 17 October.

Watch
Living With Yourself, a new sci-fi comedy on Netflix from 18 October, finds Paul Rudd (Marvel's Ant-Man) struggling to wrest his life and family away from a better version of himself, after a spa treatment over-delivers on its promise of a "new you".

Read
The Consequential Frontier: Challenging the privatization of space by Peter Ward (Melville House) casts a critical eye over the commercial works of today's space tycoons and argues for public ownership of space.

Find out the role of people in an age of AI from Joanna Bryson on 13 October at New Scientist Live: newscientistlive.com

[Image credit: Planet Photos/Whitney Museum of American Art, New York. Purchased with funds from Dieter Rosenkranz]
Asked what concerns him
more, intelligent machines or
the people who use them, Paglen
answers: “I worry about the
people who make money from
them. Artificial intelligence is not
about making computers smart.
It’s about extracting value from
data, from images, from patterns
of life. The point is not seeing.
The point is to make money
or to amplify power.”
It is a point by no means lost
on a creator of ImageNet itself,
Fei-Fei Li at Stanford University
in California, who, when I spoke to
Paglen, was in London to celebrate
ImageNet’s 10th birthday at
the Photographers’ Gallery. Far
from being the face of predatory
surveillance capitalism, Li leads
efforts to correct the malevolent
biases lurking in her creation.
Wong, incidentally, won’t get that

racist slur again, following
ImageNet’s announcement
that it was removing more than
half of the 1.2 million pictures of
people in its collection.
Paglen is sympathetic to the
challenge Li faces. “We’re not
normally aware of the very narrow
parameters that are built into
computer vision and artificial
intelligence systems,” he says.
His job as artist-cum-investigative
reporter is, he says, to help reveal
the failures and biases and forms
of politics built into such systems.
Some might feel that such work
feeds an easy and unexamined
public paranoia. Peter Skomoroch,
former principal data scientist
at LinkedIn, thinks so. He
calls ImageNet Roulette junk
science, and wrote on Twitter:
“Intentionally building a broken
demo that gives bad results for
shock value reminds me of
Edison’s war of the currents.”
Paglen believes, on the contrary,
that we have a long way to go
before we are paranoid enough
about the world we are creating.
Fifty years ago it was very
difficult for marketing companies
to get information about what
kind of television shows you
watched, what kinds of drinking
habits you might have or how
you drove your car. Now giant
companies are trying to extract
value from that information.
“I think,” says Paglen, “that we’re
going through something akin
to England and Wales’s Inclosure
Acts, when what had been de facto
public spaces were fenced off by
the state and by capital.”
The happy bit about this story is
how, time and again, the scandals
thrown up by “AI” turn out to have
a simple human origin. Boredom,
carelessness, malignity: we know
what to do about this. And even
as I was writing this, ImageNet
Roulette was taken down. ❚

Trevor Paglen chose photo
categories, but an AI chose
the pictures to fill them

[Image credit: Tim P. Whitby/Getty Images for Barbican Centre]