The Independent - 04.03.2020


The most recent deployment, in Oxford Circus on 27 February, saw 8,600 faces scanned by cameras mounted on a police van.


Official figures show that the software generated eight alerts of potential matches to a watchlist of almost
7,300 criminals. But seven were incorrect, and five innocent people were questioned by police officers after
being wrongly identified.


One person, a 35-year-old woman wanted for failing to appear at court over a serious assault on an
emergency service worker, was arrested after a correct match.


At a previous deployment, in Stratford on 11 February, 4,600 faces were scanned but no alerts were
generated from a watchlist of more than 5,800 offenders. An attempted deployment at Oxford Circus on 20
February was stopped because of an unspecified technical fault.


The figures were released after the Metropolitan Police commissioner said the force was using facial
recognition in a “proportionate, limited way that stores no biometric data”.


“We believe this has the support of the public and a very strong legal basis,” Dame Cressida Dick said. “The only people who benefit from us not using [it] lawfully and proportionately are the criminals, the rapists, the terrorists and all those who want to harm you, your family and friends.”


Speaking at the Royal United Services Institute for Defence and Security Studies last week, Dame Cressida
said the technology used by her force was not racially biased. But she called on the government to bring in
specific laws on facial recognition so the public could be fully consulted.


Scotland Yard’s trials of facial recognition software, run over eight years, resulted in eight arrests in total and drew controversy after people were scanned without their knowledge and stopped for covering their faces.


Critics have questioned its effectiveness and raised human rights objections, claiming that members of the
public were entering scanning zones before seeing police signs warning that facial recognition was in
operation.


Silkie Carlo, director of Big Brother Watch, said: “It’s alarming to see biometric mass surveillance being
rolled out in London. Never before have citizens been subjected to identity checks without suspicion, let
alone on a mass scale ... the cost to our liberties, let alone the public purse, is unacceptably high.”


Scotland Yard previously said every deployment would be “bespoke” and target lists of wanted offenders or
vulnerable missing people. Assistant Commissioner Nick Ephgrave said facial recognition software “makes
no decisions” alone, but flags potential matches between faces in live footage and images on the police database. Officers then judge whether the flagged person is likely to be the same individual and decide whether to question them to establish their identity.
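

A minimal sketch of that “flag, then human review” step, assuming a simple embedding-and-threshold design (the function names, the 0.6 threshold and the use of cosine similarity are illustrative assumptions; the Met has not published its implementation):

    import numpy as np

    ALERT_THRESHOLD = 0.6  # assumed similarity cut-off for raising an alert

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two face embeddings, in the range [-1, 1]."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def flag_candidates(live_embedding: np.ndarray, watchlist: dict) -> list:
        """Return potential watchlist matches for officers to review.

        The software takes no action itself: it only surfaces candidates,
        ranked by similarity, and a human decides whether to stop anyone.
        """
        alerts = []
        for person_id, reference_embedding in watchlist.items():
            score = cosine_similarity(live_embedding, reference_embedding)
            if score >= ALERT_THRESHOLD:
                alerts.append((person_id, score))
        return sorted(alerts, key=lambda pair: pair[1], reverse=True)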


Mr Ephgrave claimed police had been given a “strong legal mandate” to use facial recognition by a court ruling, in a legal challenge brought against South Wales Police, which found the force had used the technology lawfully.


But the landmark ruling is under appeal and assessed only two specific deployments, and the information commissioner warned that it “should not be seen as a blanket authorisation for police forces to use LFR [live facial recognition] systems in all circumstances”.


Issuing a legal opinion in October, Elizabeth Denham said: “[There is] a high statutory threshold that must be met to justify the use of LFR, and demonstrate accountability, under the UK’s data protection law.” She added: “It will be necessary to show a justification for why the intrusion into the privacy of large numbers of individuals going about their lawful business is proportionate.”


The Metropolitan Police said it could not give journalists its budget for the technology, after The
Independent revealed it spent more than £200,000 on initial trials that resulted in no arrests.
