
17 individuals in London arrested after AI facial recognition operation

Last week, in south London, the Metropolitan Police used live facial recognition cameras to help arrest 17 individuals.

The arrests occurred during operations conducted on March 19 and 21 in Croydon and on March 21 in Tooting.

Among those arrested was a 23-year-old man found in possession of two rounds of blank ammunition. This led police to seize ammunition, stolen mobile phones, and cannabis from property linked to him. The facial recognition system flagged him because of an outstanding warrant for his arrest.

Currently, the technology is designed to identify individuals featured on a “bespoke watchlist,” which includes people with outstanding arrest warrants. The Metropolitan Police says the technology enables it to carry out “precision policing.”

This follows a previous tally of 42 arrests made in February using the same technology, though it remains unclear how many of those arrested have been charged, according to inquiries made by BBC News.

Arrests covered a broad spectrum of offenses, including sexual offenses, assault, theft, fraud, burglary, racially aggravated harassment, and breaches of anti-social behavior order conditions.

Following the operation, the police said they provided communities with “information and reassurance” during these surveillance operations.

🧵| Live Facial Recognition tech is a precise community crime fighting tool. Led by intelligence, we place our effort where it’s likely to have the greatest effect. pic.twitter.com/N5bKwzAEI1

Facial recognition in policing is controversial

Last year, members of the UK’s House of Lords and Commons called on the police to reevaluate live facial recognition technology after the policing minister hinted that police forces could gain access to a database of 45 million images from passports.

Michael Birtwistle from the Ada Lovelace Institute argued, “The accuracy and scientific basis of facial recognition technologies is very contested, and their legality is uncertain.”

Civil rights advocacy group Big Brother Watch also highlighted that 89% of UK police facial recognition alerts fail.

Lindsey Chiswick, the Metropolitan Police intelligence director, sought to dispel privacy concerns. She told the BBC, “We don’t keep your data. If there’s no match, your data is immediately and automatically deleted in seconds.” Chiswick also asserted that the technology has been “independently tested” for reliability and bias.

Others dispute that. For instance, Madeleine Stone from Big Brother Watch expressed concerns about AI surveillance, labeling it “Orwellian.”

Stone continued, “Everyone wants dangerous criminals off the streets, but papering over the cracks of a creaking policing system with intrusive and Orwellian surveillance technology is not the answer. Rather than actively pursuing individuals who pose a risk to the public, police officers are relying on chance and hoping that wanted people happen to walk in front of a police camera.”

How the UK police use AI facial recognition

The UK police force began testing facial recognition technologies in 2018, deploying camera-equipped vans to capture footage from public places.

An example of a facial recognition van at an early trial in central London. Source: X

We don’t know exactly how these systems work behind the scenes. A recent Freedom of Information request directed at the Metropolitan Police Service (MPS) sought clarification about exactly how the police use AI technology.

The MPS disclosed that it uses AI technologies such as Live Facial Recognition (LFR) and Retrospective Facial Recognition (RFR) in specific operations.

However, the MPS refused to answer the majority of the inquiry, citing exemptions under the Freedom of Information Act 2000 related to “national security,” “law enforcement,” and “the protection of security bodies.”

Specifically, the MPS argued that divulging details concerning the covert use of facial recognition could compromise law enforcement tactics.

The response states: “Confirming or denying that any information relating to any possible covert practice of Facial Recognition would show criminals what the capability, tactical abilities, and capabilities of the MPS are, allowing them to focus on specific areas of the UK to conduct/undertake their criminal/terrorist activities.”

Lessons from the past

While predictive policing was designed to make communities safer, it has led to some troubling outcomes, including wrongful arrests.

These aren’t just isolated incidents but rather a pattern that reveals a critical flaw in relying too heavily on AI for police work.

Robert McDaniel in Chicago, despite having no violent history, was targeted by police as a possible threat simply because an algorithm placed him on a list.

His story isn’t unique. Across the US, there have been instances where people were wrongly accused and arrested based on faulty facial recognition matches. 

Nijeer Parks’s story is a stark example. Accused of crimes he had nothing to do with, Parks faced jail time and a hefty legal bill, all because of an incorrect match by facial recognition technology.

Facial recognition technology has been repeatedly shown to be less accurate for darker-skinned individuals, particularly black women. While facial recognition for white faces can be accurate in over 90% of cases, accuracy can fall as low as 35% for black faces. Marginalized groups have the most to lose from inaccurate algorithmic policing strategies.

Wrongful arrests aren’t just distressing for those directly involved; they also cast a long shadow over the communities affected.

When people are arrested based on predictions rather than concrete actions, it shakes the very foundation of trust between law enforcement and the public.

Indeed, public trust in the police is at an all-time low in both the UK and the US. AI threatens to erode it further if poorly managed.
