Are UK supermarkets the right place for facial recognition systems?
An unprecedented rise in retail crime, described by grocery leaders as a “shoplifting epidemic,” has placed further strain on an industry still recovering from a global pandemic and a cost-of-living crisis. Does facial recognition technology offer a solution, or is it a case of retailers overstepping boundaries?
Recorded shoplifting offences rose by 20% in 2024, to 516,971, marking the highest figure since current police records began in 2003, according to the latest crime report from the Office for National Statistics (ONS).
Meanwhile, crime cost retailers £4.2bn last year. That figure includes £2.2bn in shoplifting losses and a further £1.8bn spent on crime prevention measures, such as CCTV, additional security personnel, anti-theft devices and body-worn cameras.
While these prevention measures are now widely used across the sector, some retailers are going even further to combat crime in stores by testing facial recognition technology.
However, this tech has come under fire from campaign groups who argue that it is an invasion of shopper privacy and instead call for tougher laws. The situation has sparked debate over whether the use of facial recognition in UK supermarkets is justifiable, and whether there are viable alternatives.
Which supermarkets are using facial recognition tech?

Last month, Asda began a two-month trial of live facial recognition technology at five stores in Greater Manchester. The pilot will see the new technology integrated into Asda’s existing CCTV network. It works by scanning facial images and comparing the results to a list of known individuals who have previously engaged in criminal activity on an Asda site.
If a match is found by the automated system, in a process that takes seconds, a member of the Asda head office security team will conduct a remote check and provide feedback to the store in real time.
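In engineering terms, the workflow Asda describes follows a familiar pattern: compare each detected face against a small watchlist, and only escalate an automated match to a human reviewer before the store is told anything. The sketch below illustrates that pattern in Python; the embeddings, distance threshold, watchlist structure and notification functions are illustrative assumptions, not details of Asda’s actual system.

```python
from typing import Callable, Optional
import numpy as np

# Assumed distance threshold; real deployments tune this against false-match rates.
MATCH_THRESHOLD = 0.6


def find_watchlist_match(face_embedding: np.ndarray,
                         watchlist: dict[str, np.ndarray]) -> Optional[str]:
    """Compare one face embedding against a watchlist of reference embeddings.

    Returns the watchlist ID of the closest reference below the threshold,
    or None if nothing is close enough.
    """
    best_id, best_dist = None, float("inf")
    for person_id, ref_embedding in watchlist.items():
        dist = float(np.linalg.norm(face_embedding - ref_embedding))  # Euclidean distance
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id if best_dist < MATCH_THRESHOLD else None


def handle_detection(face_embedding: np.ndarray,
                     watchlist: dict[str, np.ndarray],
                     review_team_confirms: Callable[[str], bool],
                     alert_store: Callable[[str], None]) -> None:
    """Automated match -> remote human review -> real-time feedback to the store."""
    candidate = find_watchlist_match(face_embedding, watchlist)
    if candidate is None:
        return  # no match: in a privacy-respecting design the embedding is discarded here
    # The automated match alone triggers nothing in-store; a human reviewer
    # confirms or rejects it before the store receives any alert.
    if review_team_confirms(candidate):
        alert_store(candidate)
```

The key design point, mirrored in the article, is that the automated comparison is only a filter: the decision to act rests with a person reviewing the flagged match.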
Asda non-food and retail chief commercial officer Liz Evans says: “The rise in shoplifting and threats and violence against shopworkers in recent years is unacceptable and as a responsible retailer we have to look at all options to reduce the number of offences committed in our stores and protect our colleagues.
“We consistently look for new ways to improve the security in our stores and this trial will help us understand if facial recognition technology can reduce the number of incidents and provide greater protection to everybody in our stores.”
Asda is not the first retailer to evaluate a facial recognition system.
In 2020, the Southern Co-op debuted the use of facial recognition for age verification purposes, and Iceland Foods boss Richard Walker said late last year that he would “happily” trial and use facial recognition technology in a bid to cut down on retail crime and protect his workers.
Why are people against it?
The main controversies surrounding the use of this technology in supermarkets concern privacy and potential bias.
According to legal advisor Sprintlaw, under the UK General Data Protection Regulation (UK GDPR), facial recognition data used to identify people is subject to stricter rules and higher protection requirements than most other types of personal data. Collecting it is lawful, however, provided it is done in accordance with those rules.
Asda says its trial uses its existing CCTV system and “fully complies with all data protection regulations”. However, the grocer’s trial was still met with backlash over fears from campaigners that it would encroach on shopper privacy.
According to pressure group Big Brother Watch, the supermarket chain received more than 5,000 complaints about its use of live facial recognition technology. Asda has rejected this, claiming it received only 89 complaints.
However, Big Brother Watch senior advocacy officer Madeleine Stone says: “Thousands of people have used our tool to tell Asda to abandon their facial recognition trial, and the hashtag #StopAsdaSpying has been trending on social media and received widespread national media coverage.
“This should serve as a serious warning to other retailers considering introducing this deeply controversial technology that shoppers don’t want to be spied on.”
Previously, Southern Co-op also faced backlash from members of the public and Big Brother Watch following the pilot of its facial recognition software. This led to complaints being filed to the Information Commissioner’s Office (ICO), which regulates privacy laws in the UK.

At the time, Big Brother Watch labelled the convenience retailer’s initiative as “Orwellian” and “unlawful”.
For retail consultant and ex-John Lewis customer experience director Peter Cross, the backlash surrounding Asda’s trial didn’t come as a major shock.
“Customers are, by and large, reasonable and will understand a retailer’s need to protect both their people and their profits in the midst of what is undeniably a disturbing trend. But whilst technology may very well provide a path to progress, any solution has to be appraised on its impact on the majority over the minority,” he explains.
“I do believe that with care these emerging technologies will have a place in the overall customer experience – but the advantages to the customer will need to be explained with as much gusto as the advantages to the retailer.”
Broadcaster and consumer expert Kate Hardcastle MBE, known as The Customer Whisperer, argues that “Safety shouldn’t come at the cost of trust — shoppers want protection, not surveillance.
“The intention behind facial recognition technology is to act as a deterrent, and on paper, that sounds like progress — but we must tread carefully. While security is essential, particularly to protect front-line workers who have faced a 50% increase in assaults in recent years, any move that risks eroding consumer trust must be handled with absolute transparency,” she explains.
Meanwhile, Big Brother Watch’s Stone describes live facial recognition in shops as “disproportionate and invasive”.
According to the privacy campaigning organisation, more than 3,000 people have been wrongly identified by police facial recognition systems, and research shows that this technology can discriminate against women and people of colour.
“By subjecting people to an airport-style biometric identity check as they go about their weekly shop, retailers using this technology are treating their shoppers like suspects. This technology often makes mistakes, leading to innocent people being wrongly flagged as criminals and blacklisted from shops with no due process,” Stone explains.
Hardcastle agrees that consumers are likely to feel uneasy following headlines about AI bias, false positives, and wrongful identification.
“These concerns aren’t based on conspiracy theories — they’re rooted in real-world examples where technology has failed, often disproportionately impacting marginalised groups. Trust is fragile. Any technology that risks mislabelling, profiling, or making honest customers feel like suspects is a huge red flag.”
In fact, Hardcastle says that if shoppers feel they’re being watched too closely, or unfairly judged, “it may trigger anxiety or deter them altogether”.
“Introducing surveillance-style technology without robust communication, consent, and clarity risks alienating the very people retailers depend on. If this isn’t introduced with empathy and openness, yes — retailers could absolutely lose customers.”
What are the alternatives?

When looking to combat retail crime, Hardcastle notes that other “smart alternatives” are being used by retailers. These include enhanced staff training, improved store layouts to aid visibility, body-worn cameras for staff, and partnerships with local policing initiatives.
She suggests: “There’s also huge untapped potential in using behavioural analytics — understanding patterns of theft through movement data without identifying individuals.
“Importantly, listening to the front-line teams — those working the checkouts and aisles — is vital. They often have the sharpest insights into what works and what doesn’t. Safety doesn’t always require scanning faces — sometimes, it starts by showing yours and being present on the shop floor.”
Daniel Gabay, the CEO of computer vision AI company Trigo, which works with global retail and logistics clients, says the debate around facial recognition in retail “highlights an important tension between technological progress and the right to privacy.”
However, he points out that “innovation doesn’t have to come at the expense of individual rights”.
“With computer vision technology, it’s possible to track shopper movement anonymously – without capturing biometric data or personal identifiers. And that’s how we do it.”
Is retail the right space for facial recognition tech?
For Stone at Big Brother Watch, facial recognition is “dangerously out of control in the UK”.
She believes that retailers using facial recognition should “strip this surveillance technology out of their stores and the government must urgently step in to prevent the unchecked spread of this invasive technology.”
Hardcastle offers a slightly different view. “If the technology can be demonstrably accurate, bias-free, and respectfully integrated into the store environment with clear opt-ins and accountability, it may evolve into a reassuring layer of protection,” she considers.
However, that’s a tall order, she admits. “Shoppers need to feel that this technology is there to protect them — not just the retailer’s bottom line. It comes down to intent and implementation. If supermarkets want long-term buy-in, they must communicate how this tech serves the shopper, not just the store.”
Ultimately, facial recognition brings a whole host of concerns that retailers must thoroughly consider when trialling or fully implementing this technology in stores.
With retail crime rising, and the safety of retail staff under threat, it is understandable that supermarkets are looking into new technology to offer an additional layer of protection. Facial recognition is going through a period of testing and learning, but UK supermarkets will have to weigh up whether jeopardising shopper trust is worth the benefits that it can bring.