Ethical Firestorm: UK Police Expand Facial Recognition Linked to Battle-Tested Tech in Gaza

Jan 28, 2026

A major controversy has erupted over the UK government's decision to expand police use of live facial recognition (LFR), as investigative reports link the technology to surveillance systems currently deployed by the Israeli military in the Gaza Strip.

The backlash follows the Home Office’s announcement this week that it will increase the number of mobile facial recognition vans from 10 to 50 nationwide, even as the High Court hears a landmark legal challenge against the Metropolitan Police’s use of the technology.

1. The Corsight Connection: From Gaza to Essex
At the center of the storm is Corsight AI, an Israeli surveillance firm headquartered in Tel Aviv. Reports from Byline Times and Action on Armed Violence (AOAV) have confirmed that technology developed by Corsight is being used by UK law enforcement through third-party contractors.

The Gaza Link: Corsight’s "Facial Intelligence" software has reportedly been used by the Israel Defense Forces (IDF) at military checkpoints in Gaza. The system was used to catalogue Palestinian civilians fleeing conflict zones, leading to allegations of "automated apartheid" from groups such as Amnesty International.

UK Adoption: Essex Police has been identified as a primary user of the tech, integrated via the UK contractor Digital Barriers. While the force claims the software is essential for catching serious criminals, critics argue that the tech being used on British streets was "refined" in an active war zone.

The "Fortify" System: Corsight’s flagship system, Fortify, is designed to identify individuals in extreme conditions—low light, partial facial coverage (masks), and from drones—capabilities highly valued in both military and urban policing scenarios.

2. National Expansion Amid High Court Challenge
Despite the ethical outcry, the UK government is doubling down on AI-driven policing. Home Secretary Shabana Mahmood defended the expansion on Monday, January 26, calling it a "transformative tool" for public safety.

The 50-Van Fleet: The government is moving to quintuple its fleet of facial recognition vans. In 2025 alone, the Metropolitan Police scanned over 4 million faces, resulting in approximately 800 arrests.

High Court Legal Action: On January 27, 2026, a legal challenge brought by Big Brother Watch and youth worker Shaun Thompson (who was misidentified by the tech) reached the High Court. The claimants argue that the surveillance is "arbitrary" and turns every public street into a "digital lineup."

Permanent Surveillance: The UK’s first permanent facial recognition cameras went live in Croydon, South London, in late 2025, marking a shift from temporary event-based scanning to constant urban monitoring.

3. The "Bias" Controversy: Hidden Reports Surface
Adding fuel to the fire, leaked documents from the National Physical Laboratory (NPL) in late 2025 revealed that the Home Office was aware of significant racial and gender bias in its systems for over a year.

The Inaccuracy Gap: The NPL report found that "retrospective" searches (comparing a suspect's photo to the police database) were significantly more likely to produce false matches for women, Black people, and those under 40.

Lobbying for Bias: The Guardian reported that police chiefs successfully lobbied to keep using less accurate, biased versions of the software because "higher confidence thresholds" (which reduce bias) were producing "fewer investigative leads."

4. Financial and Reputational Fallout
The link to the war in Gaza has begun to affect the corporate stability of the tech providers involved.

Management Crisis: Reports indicate a "mass exodus" of senior staff at Corsight AI throughout 2025, with industry insiders suggesting that institutional investors are pulling back due to the reputational risk of being associated with military operations in contested territories.

Public Trust: Human rights expert Rasha Abdul Rahim stated that it is "utterly shameful" for the UK to use tools tested on Palestinians to surveil British citizens, warning that it erodes the fundamental principle of policing by consent.

Perspective: While the Met Police insists that biometric data is "automatically deleted" if no match is found, campaigners point to the Strategic Facial Matcher—a new national system that will soon link facial recognition to immigration and custody records, creating what they call a "permanent digital shadow" for every citizen.
