London police deploy facial recognition tech, stirring privacy fears

Published February 11, 2020
Rights campaigner Silkie Carlo demonstrates in front of a mobile police facial recognition facility outside a shopping centre in London on Tuesday, Feb 11. — AP

London police started using facial recognition cameras on Tuesday to automatically scan for wanted people, as authorities adopt the controversial technology that has raised concerns about increased surveillance and erosion of privacy.

Surveillance cameras mounted on a blue police van monitored people coming out of a shopping centre in Stratford, in east London. Signs warned that police were using the technology to find people wanted for serious crimes. Officers stood nearby, explaining to passers-by how the system works.

It's the first time London's Metropolitan Police Service has used live facial recognition cameras in an operational deployment since carrying out a series of trials that ended last year.

London police are using the technology despite warnings from rights groups, lawmakers and independent experts about a lack of accuracy and bias in the system and the erosion of privacy. Activists fear it's just the start of expanded surveillance.

"We don't accept this. This isn't what you do in a democracy. You don't scan people's faces with cameras. This is something you do in China, not in the UK,” said Silkie Carlo, director of privacy campaign group Big Brother Watch.

Britain has a strong tradition of upholding civil liberties and of not allowing police to arbitrarily stop and identify people, she said. "This technology just sweeps all of that away."

Police Commander Mark McEwan downplayed concerns about the machines being unaccountable. Even if the computer picks someone out of a crowd, the final decision on whether to investigate further is made by an officer on the ground, he said.

"This is a prompt to them that that's somebody we may want to engage with and identify," he said.

London's system uses technology from Japan's NEC to scan faces in the crowd and check whether they match any on a watchlist of 5,000 faces created specifically for Tuesday's operation.

The watchlist images are mainly of people wanted by the police or courts for serious crimes like attempted murder, said McEwan.

London police say that in trials, the technology correctly identified 7 in 10 wanted people who walked past the camera, while the false-alert rate was 1 in 1,000 people. But an independent review found that only eight of 42 matches were verified as correct.

Police are "using the latest most up-to-date algorithm we can get,” McEwan said. "We're content that it has been independently tested around bias and for accuracy. It's the most accurate technology available to us."
