BBC London Home Affairs Correspondent

A man who is bringing a High Court challenge against the Metropolitan Police after live facial recognition technology wrongly identified him as a suspect has described it as “stop and search on steroids”.
Shaun Thompson, 39, was stopped by police in February last year outside London Bridge Tube station.
Privacy campaign group Big Brother Watch said the judicial review, due to be heard in January, was the first legal case of its kind against the “intrusive technology”.
The Met, which announced last week that it would double its live facial recognition (LFR) deployments, said it was removing hundreds of dangerous offenders and remained confident its use was lawful.
LFR maps a person’s unique facial features and matches them against faces on watchlists.
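As a rough illustration of that matching step, the sketch below shows how an embedding-based watchlist comparison can work in general. It is a minimal, hypothetical example: the cosine-similarity measure, the 0.64 threshold and all the names are illustrative assumptions, not details of the Met's deployed system.

```python
# Toy illustration of embedding-based watchlist matching (hypothetical;
# not the Met's system). Real LFR derives embeddings from a neural
# network; random numpy vectors stand in for them here.
import numpy as np

THRESHOLD = 0.64  # assumed similarity threshold, chosen for illustration


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(probe, watchlist):
    """Return (person_id, score) for the best match above THRESHOLD, else None.

    When nothing clears the threshold, a live system would discard the
    probe image straight away rather than retain it.
    """
    best_id, best_score = None, THRESHOLD
    for person_id, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_id is not None else None


# Usage with stand-in data: three watchlist templates and one probe face.
rng = np.random.default_rng(0)
watchlist = {f"suspect-{i}": rng.normal(size=128) for i in range(3)}
probe = rng.normal(size=128)
print(match_against_watchlist(probe, watchlist))  # almost certainly None here
```

In any such system the threshold is a trade-off: set it lower and more wanted people are flagged but false alerts rise; set it higher and the reverse. That balance sits behind the arrest and false-alert figures quoted below.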
Last month, the Met said it had used the technology to make more than 1,000 arrests since January 2024, among them alleged paedophiles, rapists and violent robbers; 773 of those arrests had led to a charge or a caution.
It said since January 2025, there had been 457 arrests and seven false alerts.

But Mr Thompson said his experience of being stopped had been “intimidating” and “aggressive”.
“Every time I come past London Bridge, I think about that moment. Every single time.”
He described how he had been returning home from a shift in Croydon, south London, with the community group Street Fathers, which aims to protect young people from knife crime.
As he passed a white van, he said police approached him and told him he was a wanted man.
“When I asked what I was wanted for, they said, ‘that’s what we’re here to find out’.”
He said officers asked him for his fingerprints, but he refused, and he was let go only after about 30 minutes, once he had shown them a photo of his passport.
Mr Thompson says he is bringing the legal challenge because he is worried about the impact LFR could have on others, particularly if young people are misidentified.
“I want structural change. This is not the way forward. This is like living in Minority Report,” he said, referring to the science fiction film where technology is used to predict crimes before they’re committed.
“This is not the life I know. It’s stop and search on steroids.
“I can only imagine the kind of damage it could do to other people if it’s making mistakes with me, someone who’s doing work with the community.”
‘Intrusive’ or ‘making London safer’?
Big Brother Watch, whose director Silkie Carlo is bringing the legal challenge alongside Mr Thompson, said it was the first time that a misidentification case had come before the High Court.
“It’s a really intrusive new power, absent of any democratic scrutiny,” said Madeleine Stone, senior advocacy officer. “There are no specific laws on the use of facial recognition, they’re really writing their own rules on how they use it.
“Shaun’s legal challenge is such an important opportunity for the government and the police to take stock of how this technology is spreading across London in a really unaccountable fashion.”

A Met Police spokesperson said that the force was unable to provide a full comment on Mr Thompson’s case due to ongoing proceedings, but that it was confident its use of LFR was lawful.
“We continue to engage with our communities to build understanding about how this technology works, providing reassurances that there are rigorous checks and balances in place to protect people’s rights and privacy.”
In July, the Home Office said it would be setting out its plans for the future use of LFR in the coming months, including a legal framework and safeguards.
The Met is planning to double its use of LFR to up to 10 times a week across five days, up from the current four times a week across two days.
Making the announcement last week, Commissioner Sir Mark Rowley told me the technology was “making London safer”.
“There are a lot of wanted offenders out there. This helps us round them up.”
He pointed to a case where a registered sex offender, who’d been banned from being alone with young children, was picked up on LFR cameras in Denmark Hill, south-east London, in January with a six-year-old girl.
“I think most people would expect two things. Is it accurate? Is it fair? We’ve worked really hard on this.”
The Met has said it works within existing human rights and data protection laws, and that its approach has been tested by the National Physical Laboratory to check there is no gender or racial bias.
“If you’re not wanted, that image is deleted straight away,” the commissioner told me.
“The council CCTV you walk past, or shop CCTV, they keep your face for 28 days. We keep your face for less than a second. Unless you’re wanted, in which case, we arrest you.”

The Met has also announced plans to use LFR on the approach to and from this year’s Notting Hill Carnival over the August Bank Holiday, although not within the boundaries of the event.
It says the move will protect Carnival against a “tiny minority of individuals intent on causing serious harm to others” and the watchlist will include wanted suspects, as well as missing people “who may be at risk of criminal or sexual exploitation”.
From September, a pilot scheme is due to begin in Croydon, south London, where fixed cameras will be mounted on street furniture instead of being operated from a mobile van by a team of officers.
The force says the cameras will only be switched on when officers are using LFR in the area.
‘An algorithm and a camera’
“The Met Police has ramped up LFR in an extraordinary fashion over the last couple of years, in a way that raises questions around racial discrimination and the communities they’re targeting,” said Madeleine Stone.
The force has said its use of the technology is data-led and based on crime hotspots, and that it comes alongside boosting neighbourhood policing in those areas.
However, it says it is having to make savings due to “real term funding cuts” and expects to lose around 1,700 officers, PCSOs and staff by the end of the year.
“They’re replacing a police officer, who might know their community, with an algorithm and a camera,” said Ms Stone.
Mr Thompson told me he was concerned about the deployment of LFR in communities where there were already low levels of trust in policing and contrasted this approach with the work being done by Street Fathers.
“We speak to the kids, we make time for the children, we put ourselves in hotspots.”
He said parents regularly called the group asking them to remove weapons they had discovered in their children’s bedrooms.
“We are doing a role where the community trusts us, and not the police. We don’t need machines for that. It’s about human contact.”