New York Magazine - USA (2019-11-11)


the 12 that make up the Knickerbocker Village complex on the Lower East Side. Some virtual-doorman systems include it too.

At least eight public-school districts in the U.S. have installed facial-recognition systems to detect suspended students, sex offenders, and anyone else who is banned from school grounds.
Retailers rely on facial recognition to prevent theft, and some software even comes bundled with databases of known shoplifters, but not many stores will admit to implementing it. Last year, the ACLU asked 20 top retail chains, including Best Buy, Costco, Target, and Walmart, whether they use facial recognition in their stores and received only two answers—from the hardware chain Lowe’s, which acknowledged testing the technology, and the grocery conglomerate Ahold Delhaize, whose brands include Food Lion and Stop & Shop, which said it hadn’t.
Retailers don’t necessarily need to run facial-recognition software on their premises to benefit from it, though. Apple denies using the technology at its brick-and-mortar locations, but in one confusing incident last year, a New York teenager was arrested and charged with stealing from multiple Apple Stores when police said he was identified by facial recognition. Apple apparently employs an outside firm called Security Industry Specialists in some locations; SIS may have run facial-recognition software after the fact on surveillance footage captured inside the Apple Stores the teen was alleged to have stolen from. (Charges were dropped against him when an NYPD detective realized he looked nothing like the suspect in the footage of the robberies.)
Facial recognition may soon be more valuable to retailers in other ways. One likely possibility is that some will eliminate checkout lines by having customers pay with their faces. Another is using facial recognition to target the people most likely to buy things by tracking their in-person shopping habits the same way cookies track our online ones—or maybe, eventually, using what they know about our virtual selves to direct us toward products in the real world. If you’ve ever been creeped out by an uncannily well-targeted ad served to you on Facebook or Instagram, imagine being helped by a retail employee who knows what’s in your web history.

WHAT IF YOU’RE NOT the person facial recognition says you are? Last year, the ACLU used Amazon’s facial-recognition software to search a mug-shot database against photos of members of Congress and found that it misidentified 28 lawmakers as criminal suspects, including six members of the Congressional Black Caucus, and a similar test this summer misidentified 26 California state legislators, many of them people of color. Amazon says the ACLU misrepresented its software and recommends police act only on matches in which its system expresses at least 99 percent confidence—but there’s nothing to prevent police departments from doing the same thing. “It’s toothless,” says Jacob Snow, the technology-and-civil-liberties attorney at the ACLU who ran the tests. “Amazon could say to law enforcement, ‘We’re going to set the confidence threshold at 99 percent, and you can’t change it.’ But they’re not doing that.”
Race isn’t facial recognition’s only blind spot. In its tests, the National Institute of Standards and Technology (NIST) found that even top algorithms had trouble identifying photos of the same person at different ages and were often unable to tell the difference between twins—not just identical twins but fraternal ones of the same sex, too. And performance depends on the clarity of the photos being used. NIST was primarily comparing high-quality mug shots with high-quality mug shots, but under real-world conditions, with blurry surveillance photos taken at bad angles by cameras that may have been set up incorrectly, results may vary.
During six recent tests of the London police’s facial-recognition system, which scanned the faces of people on public streets in search of wanted suspects, 42 matches were made but only eight were verified to be correct. (Thirty matches were eventually confirmed to be misidentifications, and four of the 42 people disappeared into crowds before officers were able to make contact.) Because they scanned thousands of faces in total, the London police said their error rate was 0.1 percent, but most headlines begged to differ: LONDON POLICE FACIAL RECOGNITION FAILS 80% OF THE TIME, said one.
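The dueling statistics come from different denominators, which a few lines of arithmetic make plain. The article gives only the match counts; the total number of faces scanned is described just as “thousands,” so the figure below is an assumption chosen for illustration.

```python
# Reported figures: 42 matches, 8 verified correct, 30 confirmed false,
# 4 unresolved. The total number of faces scanned is NOT given in the
# article; 30,000 is an assumed round number for illustration only.
matches = 42
verified_correct = 8
confirmed_false = 30
faces_scanned = 30_000  # assumption, not from the article

failed = matches - verified_correct  # 34, counting the 4 unresolved as failures

# Headline framing: failures as a share of the alerts the system raised.
print(f"Failed matches / matches made: {failed / matches:.0%}")  # 81%

# Police framing: confirmed false matches as a share of everyone scanned.
print(f"False matches / faces scanned: {confirmed_false / faces_scanned:.2%}")  # 0.10%
```

Both framings are arithmetically true of the same deployment; which one you quote determines whether the system sounds nearly perfect or nearly useless.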
Police have also been caught taking creative license with the technology. A report published in May by Georgetown University’s Center on Privacy and Technology found that six police departments in the U.S. allow officers to run composite sketches of suspects through facial-recognition software. That same report tells the story of a suspect who had been caught by surveillance cameras allegedly stealing beer from a CVS in Gramercy Park in 2017, but the image quality was so poor that no useful matches were returned. A detective noticed, though, that the man bore a minor resemblance to Woody Harrelson, so he ran a search using an image of the actor—which eventually led to an arrest. (NYPD spokeswoman Devora Kaye noted that the arrest was “just one of more than 5,300 requests to the Facial Identification Section that year.” “A facial recognition match is a lead,” Kaye added. “No one has ever been arrested solely on the basis of a computer match, no matter how compelling.”)
If Larry Griffin II’s story represents a best-case use, Kaitlin Jackson, a public-defense attorney in the Bronx, tells me another. Jackson represented a man who’d been charged with stealing socks from a T.J. Maxx, supposedly after brandishing a box cutter. “My client was picked up months after the robbery, and the only way I even found out facial recognition was used was that I just started calling the prosecutor and saying, ‘How in the world did you decide months after that it was my client? There are no forensics,’ ” she says. “It turned out the police went to T.J. Maxx security and said, ‘We want to pull the surveillance; we’re going to run it through facial recognition.’ And then they texted the guard a single photo that he knows has been run through facial recognition, and they said, ‘Is this the person?’ That’s the most suggestive procedure you could possibly imagine.” (The NYPD said the defendant had committed an earlier theft at the same store and that the security guard knew him “from prior interactions.” The detective on the case showed him an image hoping it would “put a name to a face,” the department said.)
Jackson’s client had at least two lines of defense: He has a twin brother, who could have triggered the facial-recognition match (although Jackson doesn’t think the twin stole any socks either), but more important, his partner was in labor at the time of the sock theft and he was in the delivery room. “We had pictures of them at the hospital, and his name was on the birth certificate,” says Jackson. “But because of the prosecution and police department’s undying faith that the software doesn’t get it wrong, they stuck with this insane position: ‘Maybe he left a few minutes before his baby was delivered and ran out to get socks and then came back.’ ”
Jackson says her client spent half of last year in jail. “He was on probation when he was arrested. So our real problem was the way that all these systems interact. Probation lodged a hold, and they would not withdraw the hold because of this case, and the prosecution wouldn’t dismiss this case. And then finally [the prosecution] offered him something that would get him out of jail. So he did what a lot of us would—he took a plea of something he did not do.” ■
