Tennessee Grandmother Wrongfully Jailed After AI Facial Recognition Error in Fraud Investigation
Angela Lipps, a 50-year-old grandmother from Tennessee, is rebuilding her life after an artificial intelligence (AI) facial recognition system mistakenly identified her as a suspect in a North Dakota bank fraud case. According to reports from InForum, a south-east North Dakota news outlet, Lipps spent nearly six months in jail because of the error, despite never having visited the state or committed the crimes.
Arrest and Extradition Process
In July, US marshals arrested Lipps at her Tennessee home while she was babysitting four children. She recounted being taken away at gunpoint and booked into a county jail as a fugitive from justice from North Dakota. Lipps, a mother of three and grandmother of five who has lived most of her life in north central Tennessee, said she had never been on an airplane until authorities flew her to North Dakota last year to face charges.
She remained in a Tennessee jail for nearly four months without bail while awaiting extradition, charged with four counts of unauthorized use of personal identifying information and four counts of theft. Authorities in North Dakota did not transport her from Tennessee until the end of October, 108 days after her arrest, and she appeared in a North Dakota courtroom the next day.
Investigation and AI Error Details
Fargo police records obtained by WDAY News reveal that detectives investigating bank fraud cases in April and May 2025 reviewed surveillance video of a woman using a fake US army ID to withdraw tens of thousands of dollars. Officers allegedly used facial recognition software to identify the suspect as Lipps, with a detective reportedly writing in court documents that she appeared to match based on facial features, body type, and hairstyle.
Lipps told WDAY News that no one from the Fargo police department contacted her before the arrest. Her attorney, Jay Greenwood, commented, “If the only thing you have is facial recognition, I might want to dig a little deeper.”
Release and Aftermath
Lipps was released on Christmas Eve after Greenwood obtained her bank records and presented them to investigators, showing she was more than 1,200 miles away in Tennessee at the time of the alleged fraud in Fargo. However, Fargo police did not pay for her trip home, leaving her stranded. Local defense attorneys helped cover a hotel room and food on Christmas Eve and Christmas Day, and the non-profit F5 Project assisted in her return to Tennessee.
Back home, Lipps said the experience has had lasting consequences: while jailed and unable to pay bills, she lost her home, car, and dog. She also noted that no one from the Fargo police department has apologized.
Broader Context of AI Errors in Law Enforcement
This case is not isolated. In October, an AI system reportedly mistook a Baltimore high school student’s bag of Doritos for a firearm, leading police to approach and search him; they found nothing. Earlier this year, police in the UK arrested a man for a burglary in a city he had never visited after automated facial recognition software confused him with another person of south Asian heritage, matching him with footage of a suspect 100 miles away.
These events underscore ongoing concerns about the reliability and ethical use of AI in criminal investigations, and have prompted calls for stricter verification and accountability measures to prevent wrongful arrests and the lasting damage they inflict on people’s lives.
