A Knock on the Door That Changed Everything
Alvi Choudhury was working from his family home in Southampton in January when police officers from Thames Valley arrived unannounced, placed him in handcuffs, and transported him to a station nearly 100 miles away. The charge: a burglary in a city he had never set foot in. The evidence: a facial recognition match that, according to Choudhury's legal team, confused him with a suspect who appeared roughly 10 years younger.
Choudhury, a 26-year-old software engineer of South Asian heritage, spent nearly 10 hours in police custody before being released at 2 a.m. with no charges filed. He is now pursuing damages against Thames Valley Police, arguing that the biased technology that led to his arrest represents a systemic failure in how UK law enforcement deploys automated surveillance.
How Facial Recognition Misidentification Happens
Facial recognition systems work by mapping geometric features of a face — the distance between eyes, the shape of a jawline, the contour of cheekbones — and comparing those measurements against databases of known individuals. While the technology has improved dramatically in recent years, independent audits have consistently found higher error rates when identifying people of color, women, and younger individuals.
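The pipeline described above — extract a feature vector from a face, then compare it against a gallery of enrolled faces — can be sketched in a few lines. Everything here is illustrative: the toy four-dimensional embeddings, the suspect names, and the 0.9 similarity threshold are hypothetical stand-ins for a real system's high-dimensional vectors and calibrated thresholds.

```python
import math

# Hypothetical 4-dimensional "embeddings" standing in for the
# high-dimensional feature vectors a real system would extract.
enrolled = {
    "suspect_a": [0.12, 0.85, 0.40, 0.31],
    "suspect_b": [0.90, 0.10, 0.22, 0.35],
}

def cosine_similarity(a, b):
    # Cosine similarity: 1.0 means identical direction, 0.0 unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    # Compare the probe face against every enrolled face and return
    # the closest one, but only if it clears the similarity threshold.
    name, score = max(
        ((n, cosine_similarity(probe, v)) for n, v in gallery.items()),
        key=lambda pair: pair[1],
    )
    return (name, score) if score >= threshold else (None, score)

# Embedding extracted from, say, a CCTV frame (again, made up).
probe = [0.14, 0.82, 0.43, 0.30]
print(best_match(probe, enrolled))
```

The threshold is where the real-world risk lives: set it too low and the system returns confident-looking matches between people who merely resemble each other, which is exactly the failure mode at issue when error rates differ across demographic groups.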
A landmark 2019 study by the National Institute of Standards and Technology (NIST) found that many commercial facial recognition algorithms exhibited error rates 10 to 100 times higher for Black and Asian faces compared to white faces. Despite these findings, police forces across the UK have continued to expand their use of the technology, often with minimal public oversight.
In Choudhury's case, the system apparently failed to distinguish between two men of South Asian descent despite a significant age difference and the fact that they lived in different parts of the country. His lawyers argue this is precisely the kind of failure that civil liberties organizations have warned about for years.