Sainsbury's Facial Recognition Error Leaves Shopper Feeling Like a Criminal

A data strategist has described feeling like a "criminal" after being mistakenly identified as an offender by facial recognition software at a Sainsbury's supermarket, leading to his removal from the store and confiscation of his purchases.

Distressing Incident at Elephant and Castle Store

Warren Rajah, 42, from Elephant and Castle in south London, was shopping at his local Sainsbury's branch on 27 January when he was approached by staff members. Employees asked him to leave the premises and confiscated the items he intended to purchase, leaving him "distraught" and confused by the sudden exclusion.

When Mr Rajah questioned the decision, staff reportedly pointed to a sign indicating the store's use of facial recognition technology. It was later revealed that he had been confused with another individual who was present in the store at the same time and was listed as an offender in the system.

"You feel horrible, you feel like a criminal and you don't even understand why," Mr Rajah told the Press Association. "To tell you to leave the store without any explanation gives you the impression that you've done something wrong."

Apology and Compensation Offered

After being removed from the store, Mr Rajah contacted Facewatch, the company providing the facial recognition technology. He sent a copy of his passport and a photo of himself, after which Facewatch confirmed he was not on their database. Sainsbury's subsequently apologised for the incident and offered him a £75 shopping voucher as compensation.

A spokesperson for Sainsbury's stated: "We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store. This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store."

Technology Deployed to Combat Retail Crime

The UK's second-largest supermarket chain has implemented facial recognition technology in seven of its stores as part of efforts to identify shoplifters and address a significant rise in retail crime in recent years. According to Sainsbury's website, the system boasts a "99.98% accuracy rate", and every alert is reviewed by trained colleagues before any action is taken.

The system issues alerts based on reports of criminal behaviour submitted by the store or by other nearby retailers using Facewatch. Mr Rajah said he now has "no interest" in shopping at Sainsbury's and wants to raise public awareness of the use of facial recognition technology in retail environments.

"It's borderline fascistic as well, how can you just have something done to you and not have an understanding? How can you be excluded from a space and not have an understanding or an explanation?" he questioned.

Human Error Blamed for Misidentification

A Facewatch spokesperson explained: "We're sorry to hear about Mr Rajah's experience and understand why it would have been upsetting. This incident arose from a case of human error in store, where a member of staff approached the wrong customer."

The spokesperson added that their data protection team followed standard procedures to confirm Mr Rajah's identity, verifying that he was not on their database and had not been subject to any alerts generated by Facewatch. They noted that when someone makes a subject access request, the data is not stored or used for other purposes and is deleted after identity verification.

Privacy Advocates Express Concern

Jasleen Chaggar of Big Brother Watch warned: "The idea that we are all just one facial recognition mistake away from being falsely accused of a crime or ejected from a store without any explanation is deeply chilling."

She continued: "To add insult to injury, innocent people seeking remedy must jump through hoops and hand over even more personal data just to discover what they're accused of. In the vast majority of cases, they are offered little more than an apology when companies are finally forced to admit the tech got it wrong."

Ms Chaggar revealed that her organisation "regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance."

Regulatory Guidance on Facial Recognition Use

The Information Commissioner's Office (ICO) commented on the broader implications: "Facial recognition technology can help retailers detect and prevent crime and has clear benefits in the public interest. However, its use must comply with data protection law."

The ICO emphasised that "retailers should carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process." They noted this is particularly important when personal information is used in situations that can have serious impacts on individuals.

The incident highlights growing concerns about the implementation of surveillance technologies in everyday settings and the potential consequences of errors in systems designed to enhance security.