Sainsbury's Facial Recognition Error Brands Innocent Shopper as Criminal
A data strategist has described feeling like a "criminal" after Sainsbury's staff mistakenly identified him as an offender using facial recognition software and abruptly removed him from the store. Warren Rajah, 42, from Elephant and Castle in south London, was shopping in his local branch on 27 January when he was approached by employees, asked to leave immediately, and had his purchases confiscated without explanation.
Distressing Experience Without Explanation
A visibly "distraught" Mr Rajah questioned the decision, with staff reportedly pointing to a sign indicating the store's use of facial recognition technology. It later emerged he had been confused with another individual who was listed as an offender in the system and was coincidentally present in the store at the same time. Sainsbury's has since apologised to Mr Rajah, stating there was no fault with the Facewatch technology itself, which is currently deployed in seven of its stores nationwide.
On being misidentified, Mr Rajah told the Press Association: "You feel horrible, you feel like a criminal and you don't even understand why." He elaborated further, saying: "To tell you to leave the store without any explanation gives you the impression that you've done something wrong. If you speak to anyone in the public, that is what they will tell you, when you've been forced and excluded from an environment, you automatically think you've done something wrong, especially with security. That's just a normal human response."
Aftermath and Corporate Response
Mr Rajah explained that after being removed from the store he contacted Facewatch directly, which told him he was not on its database after he sent a copy of his passport and a photograph of himself. Sainsbury's later apologised and offered him a £75 shopping voucher as compensation for the distressing incident. A spokesperson for the firm stated: "We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store. This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store."
The UK's second-largest supermarket chain has defended the technology as part of its efforts to identify shoplifters and curb a sharp rise in retail crime in recent years. Its website claims the system has a "99.98% accuracy rate and every alert is reviewed by trained colleagues before any action is taken." Sainsbury's clarified that the system issues an alert based on reports of criminal behaviour submitted by the store or by other nearby retailers using Facewatch.
Growing Concerns About Biometric Surveillance
However, Mr Rajah said that he now has "no interest" in shopping at Sainsbury's and wants people to be aware of facial recognition technology being used in stores. He expressed strong concerns, stating: "It's borderline fascistic as well, how can you just have something done to you and not have an understanding? How can you be excluded from a space and not have an understanding or an explanation?"
A Facewatch spokesperson responded: "We're sorry to hear about Mr Rajah's experience and understand why it would have been upsetting. This incident arose from a case of human error in store, where a member of staff approached the wrong customer. Our data protection team followed the usual process to confirm his identity and verified that he was not on our database and had not been subject to any alerts generated by Facewatch." The company added that data submitted with a subject access request is used only to confirm the individual's identity, is not stored or used for any other purpose, and is deleted once identity has been verified.
Rights Groups Call for Stronger Safeguards
Jasleen Chaggar of Big Brother Watch warned: "The idea that we are all just one facial recognition mistake away from being falsely accused of a crime or ejected from a store without any explanation is deeply chilling. To add insult to injury, innocent people seeking remedy must jump through hoops and hand over even more personal data just to discover what they're accused of. In the vast majority of cases, they are offered little more than an apology when companies are finally forced to admit the tech got it wrong."
Ms Chaggar revealed that the organisation "regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance." The Information Commissioner's Office (ICO) commented: "Facial recognition technology can help retailers detect and prevent crime and has clear benefits in the public interest. However, its use must comply with data protection law. Retailers should carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process. This is especially important where personal information is used in situations which can have a serious impact on a person."