Sainsbury's Facial Recognition System Wrongly Identifies Innocent Shopper as Offender
Warren Rajah, a 42-year-old data strategist from Elephant and Castle in south London, has described feeling like a "criminal" after being mistakenly identified by facial recognition technology at his local Sainsbury's store. The incident occurred on January 27 when store staff approached Mr Rajah, demanded he leave the premises, and confiscated his shopping items without providing an immediate explanation.
Distressing Store Ejection Following Technology Error
According to Mr Rajah's account, when he questioned why he was being removed from the store while visibly distressed, staff members simply pointed to a sign indicating the supermarket used facial recognition systems. He was later informed that the Facewatch technology had incorrectly flagged him as matching an offender who was actually present in the store at the same time.
"You feel horrible, you feel like a criminal and you don't even understand why," Mr Rajah told the Press Association. "To tell you to leave the store without any explanation gives you the impression that you've done something wrong. If you speak to anyone in the public, that is what they will tell you, when you've been forced and excluded from an environment, you automatically think you've done something wrong, especially with security. That's just a normal human response."
Sainsbury's Response and Compensation Offer
Following the incident, Mr Rajah contacted Facewatch directly, providing a copy of his passport and a photograph to verify his identity. The company confirmed he was not listed in their database. Sainsbury's subsequently issued a formal apology and offered Mr Rajah a £75 shopping voucher as compensation for the distressing experience.
A Sainsbury's spokesperson stated: "We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store. This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store."
Facial Recognition Deployment and Accuracy Claims
The UK's second-largest supermarket chain has deployed Facewatch technology across seven stores as part of broader efforts to combat rising retail crime and identify shoplifters. Sainsbury's website claims the system maintains a "99.98% accuracy rate", with every alert reviewed by trained staff before any action is taken.
The system generates alerts based on criminal behaviour data submitted by the store itself or by other nearby retailers using Facewatch. Mr Rajah said he now has strong reservations about continuing to shop at Sainsbury's and wants greater public awareness of facial recognition deployment in retail environments.
"It's borderline fascistic as well, how can you just have something done to you and not have an understanding? How can you be excluded from a space and not have an understanding or an explanation?" Mr Rajah questioned.
Facewatch and Regulatory Perspectives
A Facewatch spokesperson acknowledged the incident, stating: "We're sorry to hear about Mr Rajah's experience and understand why it would have been upsetting. This incident arose from a case of human error in store, where a member of staff approached the wrong customer. Our data protection team followed the usual process to confirm his identity and verified that he was not on our database and had not been subject to any alerts generated by Facewatch."
The company added that when individuals submit subject access requests, their data is not stored or used for other purposes and is deleted after identity verification.
Privacy Advocates Express Serious Concerns
Jasleen Chaggar of Big Brother Watch highlighted broader implications: "The idea that we are all just one facial recognition mistake away from being falsely accused of a crime or ejected from a store without any explanation is deeply chilling. To add insult to injury, innocent people seeking remedy must jump through hoops and hand over even more personal data just to discover what they're accused of. In the vast majority of cases, they are offered little more than an apology when companies are finally forced to admit the tech got it wrong."
Ms Chaggar noted that her organisation "regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance."
Information Commissioner's Office Guidance
The Information Commissioner's Office (ICO) provided measured commentary on the technology's use: "Facial recognition technology can help retailers detect and prevent crime and has clear benefits in the public interest. However, its use must comply with data protection law. Retailers should carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process. This is especially important where personal information is used in situations which can have a serious impact on a person."
The incident raises significant questions about the balance between retail security measures and customers' privacy rights, particularly as biometric surveillance becomes more prevalent across the retail sector.