UK Police Prioritize Leads Over Fairness With Biased Tech

A contentious debate is unfolding across the United Kingdom as law enforcement agencies knowingly continue to use facial recognition technology acknowledged to be biased against women and ethnic minorities. Police leadership has successfully argued for keeping the flawed system in service, framing the decision as a necessary trade-off between operational effectiveness and fairness. The choice has ignited fierce public debate, pitting the stated goal of public safety against civil liberties and equal treatment under the law, and raising hard questions about the acceptable cost of technological advancement in policing.

A System of Acknowledged Flaws

The technology at the center of this controversy is a retrospective facial recognition tool that searches the Police National Database (PND), which holds more than 19 million custody photographs. A review commissioned by the Home Office and conducted by the National Physical Laboratory (NPL) documented the system’s flaws. Police leaders were briefed on the NPL’s findings in September 2024, which confirmed that the algorithm produced significantly higher misidentification rates for specific demographic groups: it was far more likely to generate incorrect matches for women, Black and Asian individuals, and anyone aged 40 and under. A subsequent NPL study quantified the disparity, finding that under certain operational settings the false positive rate for Black women could be nearly 100 times higher than for white women, creating a substantial risk of wrongful suspicion in already over-policed communities.
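
To make that disparity concrete: a false positive rate is simply the share of searches involving people who are not in the database that nonetheless return a match, computed separately for each demographic group. The sketch below uses entirely hypothetical counts (the NPL’s underlying test data is not public) to show how a disparity ratio of the kind the study reported is derived.

```python
# Minimal sketch of per-group false positive rates (FPR) and a
# disparity ratio. All figures are hypothetical illustrations;
# the NPL's raw test data is not public.

def false_positive_rate(false_matches: int, searches: int) -> float:
    """Share of probe searches for people NOT in the database
    that still returned a match."""
    return false_matches / searches

# Hypothetical counts per group: (false matches, total probe searches)
groups = {
    "white women": (2, 10_000),
    "Black women": (190, 10_000),
}

rates = {g: false_positive_rate(fm, n) for g, (fm, n) in groups.items()}
for group, rate in rates.items():
    print(f"{group}: FPR = {rate:.4%}")

# The headline figure is the ratio between the two groups' rates:
ratio = rates["Black women"] / rates["white women"]
print(f"disparity ratio: roughly {ratio:.0f}x")
```

With these illustrative numbers the ratio works out to 95, the order of magnitude behind the “nearly 100 times” finding.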

In an initial attempt to address these well-documented issues, the National Police Chiefs’ Council (NPCC) mandated an increase in the confidence threshold required for the system to flag a potential match. Internal NPCC documents confirm the adjustment worked from a fairness standpoint, with one report stating, “The change significantly reduces the impact of bias across protected characteristics of race, age and gender.” The correction came at a steep operational price, however: the share of searches that produced viable investigative leads plummeted from 56% to just 14%. Following complaints from police forces that “a once effective tactic returned results of limited benefit,” the NPCC reversed the decision after just one month, returning the system to its original, lower threshold and knowingly reintroducing the discriminatory outcomes in exchange for a higher volume of potential leads. The reversal plainly prioritized operational utility over the mitigation of racial and gender bias.
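
The mechanism at issue works like any score cutoff: each candidate match carries a confidence score, and only scores above the threshold are flagged for officer review. Raising the cutoff suppresses marginal (and disproportionately erroneous) matches, but it discards genuine leads too. The sketch below uses hypothetical, simulated score distributions to illustrate why the lead rate fell so sharply when the NPCC raised the threshold.

```python
import random

random.seed(0)

# Hypothetical confidence scores for candidate matches from a
# retrospective search. Genuine matches tend to score higher than
# spurious ones, but the two distributions overlap.
genuine = [random.gauss(0.80, 0.08) for _ in range(1000)]
spurious = [random.gauss(0.60, 0.10) for _ in range(1000)]

def flagged(scores, threshold):
    """Count candidate matches at or above the confidence cutoff."""
    return sum(s >= threshold for s in scores)

for threshold in (0.65, 0.85):
    leads = flagged(genuine, threshold)
    false_hits = flagged(spurious, threshold)
    print(f"threshold {threshold:.2f}: {leads} genuine leads kept, "
          f"{false_hits} false matches flagged")
```

Moving the cutoff from 0.65 to 0.85 in this toy model cuts false matches to near zero but also discards roughly three quarters of the genuine leads, the same qualitative trade-off behind the drop from 56% to 14%.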

Justification Versus Condemnation

Government officials and senior police leaders staunchly defend their decision to revert to the biased algorithm, arguing it is an essential tool for maintaining public safety. Chief Constable Amanda Blakeman, the NPCC lead for the PND, framed the choice as a difficult but necessary “balance that must be struck” in order to “best protect the public from those who could cause harm.” Echoing this sentiment, a Home Office spokesperson described the technology as “game-changing” for incarcerating “criminals and rapists,” reinforcing that their primary “priority is protecting the public.” Proponents also point to the critical role of human oversight, asserting that all potential matches are carefully reviewed by trained officers before any action is taken. Policing Minister Sarah Jones has championed the technology, hailing it as the “biggest breakthrough since DNA matching,” suggesting that its benefits, even with acknowledged flaws, outweigh the risks when mitigated by human intervention.

This official justification is starkly contrasted by the views of independent experts and oversight bodies, who see the decision as a reckless disregard for fundamental civil rights. Professor Pete Fussey, a former independent reviewer of the technology, directly challenged the ethical rationale, asking “whether facial recognition only becomes useful if users accept biases in ethnicity and gender.” He dismissed the argument for its continued use as one of “convenience,” which he contends is a “weak argument for overriding fundamental rights” and is “unlikely to withstand legal scrutiny.” An even sharper critique came from Abimbola Johnson, chair of the police race action plan’s independent scrutiny board. She warned that deploying such technologies within a policing landscape already defined by “racial disparities, weak scrutiny and poor data collection” is a formula for compounding existing inequities and betraying institutional commitments to anti-racism.

A Precarious Path Forward

Despite the persistent controversy and vocal opposition, the government has shown no signs of curtailing its reliance on facial recognition. Instead, it has launched a 10-week public consultation on plans for the technology’s even wider deployment. In an apparent effort to address the current system’s performance issues, the Home Office has announced the procurement of a “new algorithm,” a replacement that has purportedly undergone independent testing and been found to have “no statistically significant bias.” The new algorithm is slated for further testing early in the year, followed by a thorough evaluation. The strategy suggests a dual approach: building public support for broader deployment while working to replace the technically and ethically compromised algorithm currently in operation. Critics remain deeply skeptical, however, emphasizing that a technological fix alone is insufficient without robust, independent oversight and a fundamental shift in institutional priorities away from effectiveness at any cost.
