Man misidentified by AI says encounter felt like “stop and search on steroids”
Shaun Thompson wasn’t doing anything unusual that evening in February. Just finishing his shift and heading home near London Bridge Tube station. Then the police stopped him. They said his face matched that of a wanted man. It didn’t. But the damage, he says, had already been done.
What followed wasn’t just a wrongful stop. It hardened into a deep distrust, one that’s now headed for Britain’s High Court in what could become a landmark legal battle over police use of live facial recognition technology, or LFR.
A walk home turns into a nightmare
Thompson, 39, says the experience was nothing short of humiliating. He was wearing his Street Fathers jacket — the same one he wears when working to protect teens from knife crime in Croydon.
He’s used to being on the side trying to keep young men out of trouble, not being mistaken for someone in it.
“I was just walking. That’s it,” he told the BBC. “Next thing I knew, I was surrounded, being spoken to like I’d done something terrible.”
One officer told him he was being stopped because facial recognition tech had flagged him. He didn’t understand what that meant. Then came the questions. The stares. The awkward silence as police realized they had the wrong man.
Surveillance vs. safety: Where’s the line?
The Metropolitan Police say the technology helps them catch criminals fast — the kind of people the public would want off the streets.
They’ve made over 1,000 arrests since January 2024 using LFR. Of those, 773 led to charges or cautions, according to figures published last month. In 2025 alone, 457 arrests were reported with just seven false matches, the Met claims.
So what’s the problem?
Shaun Thompson, along with groups like Big Brother Watch, says those numbers don’t tell the full story: people like him are the collateral damage in a system that doesn’t always get it right.
“Even one mistaken identity is one too many,” said a spokesperson for the campaign group. “This is surveillance on a mass scale, using deeply flawed technology. And it’s happening in real time, on our streets.”
Big Brother Watch backs first major UK legal challenge
Thompson’s case is now heading for a judicial review in January, the first of its kind to directly challenge how LFR is used in public spaces.
The tech itself works like this: cameras scan a crowd. The system picks out faces and checks them against a watchlist — suspects, wanted individuals, persons of interest. If there’s a match, police nearby are alerted and act.
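How does a face end up flagged? The details of the Met’s system aren’t public, but face-matching pipelines commonly reduce each detected face to a numeric “embedding” and compare it against watchlist embeddings using a similarity threshold. The Python sketch below is a minimal illustration of that general approach; every name and number in it is made up for demonstration, and it is not the Met’s code. What it does show is how a near-miss can clear the threshold and trigger an alert:

```python
import numpy as np

# Illustrative watchlist: each entry maps an identity to a face "embedding",
# the fixed-length vector a recognition model produces for a face image.
# All names and numbers here are hypothetical.
WATCHLIST = {
    "wanted_person_A": np.array([0.12, 0.85, 0.33, 0.41]),
    "wanted_person_B": np.array([0.90, 0.10, 0.55, 0.20]),
}

MATCH_THRESHOLD = 0.92  # illustrative; real systems tune this per deployment

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embeddings: 1.0 means pointing the same way."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_face(embedding: np.ndarray) -> str | None:
    """Return the best-matching watchlist identity if it clears the
    threshold, otherwise None (no alert is raised)."""
    best_id, best_score = None, 0.0
    for identity, reference in WATCHLIST.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= MATCH_THRESHOLD else None

# A passer-by whose face is merely similar to wanted_person_A's:
passerby = np.array([0.14, 0.80, 0.35, 0.45])
print(check_face(passerby))  # prints "wanted_person_A" -- a false positive
```

The design choice that matters here is the threshold: set it high and wanted people slip through; set it low and more passers-by get flagged. Every deployment trades one kind of error off against the other.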
But critics argue it’s not foolproof. Algorithms can misread people. Lighting, movement, and facial hair can all throw off a match, and independent testing has repeatedly found higher error rates for darker-skinned faces. The result? Innocent people being pulled aside, questioned, and recorded.
“This feels like stop and search on steroids,” Thompson said bluntly.
The Met doubles down, not backing off
Despite the growing pushback, the Met Police say they’re expanding the use of the tech, not scaling it back.
Just last week, they confirmed plans to double the number of LFR deployments across London. Their argument is simple: it works.
A Met Police spokesperson said the force remains confident that the use of LFR is both “lawful and proportionate,” and that it helps take “dangerous individuals off the streets.” They stress that it’s used only in specific locations and always in conjunction with officers on the ground.
Still, not everyone’s convinced.
Here’s what we know so far, in plain numbers:
| Period | Arrests | Charges/Cautions | False Alerts |
| --- | --- | --- | --- |
| Since Jan 2024 (cumulative) | 1,000+ | 773 | Not disclosed |
| 2025 (Jan–present) | 457 | Not released | 7 |
Public trust and trauma
What really sticks with Thompson isn’t just the embarrassment. It’s the memory. The unease. The feeling of being watched, wrongly accused, and powerless.
“Every time I walk past that spot,” he said, “it comes back.”
He’s not the only one worried. Local community workers say young men of color are especially anxious about facial recognition being used in high-traffic areas — often places they have to pass through every day.
This, some say, isn’t just about tech. It’s about power — who holds it, who’s affected by it, and who’s believed when something goes wrong.
A bitter irony
There’s something almost poetic, in the worst way, about Shaun Thompson being stopped that night. He wasn’t just a random guy walking home. He was out there doing the work — the same work police say they’re trying to support.
He’s been mentoring teens in south London for years. Trying to help them stay out of gangs, away from violence, and off the streets at night.
But in a few seconds, he was the one being scrutinized.
- He wasn’t charged.
- He wasn’t arrested.
- But the moment stuck, like a scar you can’t see.
Now he’s fighting back. Through the courts. Not to make headlines, he says, but to make sure no one else has to explain to their kid why police just yanked them off the sidewalk.