Angela Lipps was babysitting four kids at her Tennessee home last July when U.S. Marshals showed up with guns drawn.
She'd never been to North Dakota. Never been on an airplane. But Fargo police had run facial recognition software against surveillance footage of a woman committing bank fraud — and the algorithm said it was her.
108 Days Without a Phone Call
Here's what kills me about this case: Fargo police charged Lipps with four counts of identity theft and four counts of theft based on a facial recognition match and a detective's eyeball comparison of her social media photos. That's it.
Nobody called her. Nobody checked if she'd ever been within a thousand miles of Fargo. Nobody pulled her bank records, which would have immediately shown she was buying cigarettes and ordering Uber Eats in Tennessee at the exact times the fraud was happening 1,200 miles away.
Instead, she sat in a Tennessee jail for 108 days as a fugitive — held without bail — before North Dakota even bothered to pick her up. Her lawyer, Jay Greenwood, put it bluntly: "If the only thing you have is facial recognition, I might want to dig a little deeper."
The first time police actually interviewed her was December 19th. She'd been locked up for over five months.
Five days later, on Christmas Eve, the case was dismissed.
Released Into a Blizzard With Summer Clothes
Lipps walked out of Cass County Jail on December 24th wearing the summer clothes she'd been arrested in five months earlier. Fargo. Winter. No coat.
The police department didn't cover her trip home. Local defense attorneys chipped in for a hotel room on Christmas Eve and Christmas Day. A nonprofit called the F5 Project eventually drove her to Chicago so she could get back to Tennessee.
While she was locked up, unable to pay bills, she lost her home. Lost her car. Lost her dog.
No one from Fargo PD has apologized.
When a reporter asked the outgoing police chief about the case at his retirement press conference, he responded: "We are not here to talk about that today."
This Keeps Happening
Lipps isn't an isolated case. She's part of a pattern that's been building for years:
Robert Williams (2020) — Detroit police arrested him in front of his wife and two young daughters based on a facial recognition match to blurry surveillance footage from a watch theft. He spent 30 hours in a dirty, overcrowded cell. The match was wrong. He sued, and the ACLU helped secure a landmark settlement.
Porcha Woodruff (2023) — Eight months pregnant, arrested by Detroit police for carjacking based on a facial recognition hit. Interrogated for 11 hours at the detention center. The prosecutor dismissed the case a month later for insufficient evidence.
Randal Reid (2022) — Arrested in Georgia for a theft in Louisiana, a state he'd never visited. Spent six days in jail over Thanksgiving. The cause? A facial recognition match.
UK man (2026) — Arrested for a burglary in a city he'd never visited after automated facial recognition confused him with another person of South Asian heritage. The actual suspect was 100 miles away at the time.
Every single one of these people was innocent. Most were Black. NIST testing shows false positive rates below 0.5% for leading algorithms, but those rates are measurably worse for darker skin tones.
A 0.5% error rate sounds tiny until you realize these systems are scanning millions of faces. And when the "error" means someone loses six months of their life, their home, and their dog, the math stops being abstract.
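To make the base-rate problem concrete, here's the arithmetic with illustrative numbers — the gallery size and per-comparison rate below are assumptions for the sketch, not figures from any specific deployment:

```python
# Base-rate sketch with assumed, illustrative numbers -- not from a real system.
gallery_size = 1_000_000          # faces in the database a probe image is searched against
false_positive_rate = 0.005       # 0.5% chance any single comparison wrongly "matches"

expected_false_matches = gallery_size * false_positive_rate
print(f"Expected false matches per search: {expected_false_matches:,.0f}")
# -> Expected false matches per search: 5,000
```

Even a per-comparison rate that sounds excellent produces thousands of wrong candidates once the search space is large — which is exactly why a match can only ever be a lead, never an identification.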
The Technology Isn't the Whole Problem
I want to be precise here because the discourse around facial recognition tends to swing between "ban it all" and "the tech works fine, humans are the problem."
Both miss the point.
Facial recognition is a lead-generation tool. It spits out possible matches. It's not supposed to be the entire investigation — every major vendor says this, NIST says this, even police departments' own policies say this.
But look at what actually happened in Lipps' case: the algorithm returned a match, a detective eyeballed it against social media photos, and that was enough for an arrest warrant. No interview. No alibi check. No bank records. Nothing.
The technology failed, yes. But the bigger failure was a system that treated an algorithm's output as probable cause and then left a woman to rot in jail for five months before doing the basic police work that would have cleared her in an afternoon.
What Needs to Change
I don't think banning facial recognition outright is realistic — it's already too embedded in law enforcement. But the current situation, where there's almost no accountability when it goes wrong, is indefensible.
A few things that should be obvious by now:
Facial recognition matches should never be the sole basis for an arrest. Full stop. Treat it like a tip from an anonymous source — worth investigating, not worth acting on alone. Some jurisdictions already require this, but enforcement is spotty.
There need to be consequences when departments skip basic verification. Lipps' bank records would have cleared her immediately. The fact that nobody checked for five months isn't a technology problem. It's a negligence problem.
Defendants need to know when facial recognition was used. In many jurisdictions, police aren't required to disclose that they used facial recognition to identify a suspect. If you don't know the basis for your arrest, you can't challenge it.
Independent audits of facial recognition accuracy, broken down by demographics. NIST does this at the algorithm level, but departments need to track their own hit rates and false positive rates in real investigations.
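As a sketch of what that department-level tracking could look like — the record format and every number here are hypothetical, invented purely for illustration:

```python
# Hypothetical audit tally: per-group false-positive rates from real-investigation
# outcomes. The records below are invented illustrative data, not real statistics.
from collections import defaultdict

# Each record: (demographic group, whether the match was later confirmed correct)
records = [
    ("A", True), ("A", False), ("A", True),
    ("B", False), ("B", False), ("B", True),
]

totals = defaultdict(lambda: [0, 0])  # group -> [matches_run, false_positives]
for group, confirmed in records:
    totals[group][0] += 1
    if not confirmed:
        totals[group][1] += 1

for group, (runs, fps) in sorted(totals.items()):
    print(f"group {group}: {runs} matches, false-positive rate {fps / runs:.0%}")
```

The point isn't the code — it's that this bookkeeping is trivially cheap, and a department that runs facial recognition searches without it has simply chosen not to know its own error rate.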
Angela Lipps is back home in Tennessee now, rebuilding from nothing. The person who actually committed the bank fraud in Fargo? Still out there. No arrests have been made.
The algorithm got it wrong. The system made it catastrophic.
Sources
- AI error jails innocent grandmother for months in Fargo fraud case — InForum/WDAY original investigation
- Tennessee grandmother jailed after AI facial recognition error — The Guardian
- Williams v. City of Detroit — ACLU case page on Robert Williams' wrongful arrest
- When Artificial Intelligence Gets It Wrong — Innocence Project on Porcha Woodruff's case
- NIST Face Recognition Vendor Test — ongoing accuracy benchmarks