Wednesday, January 15, 2025

Arrested by AI: Police Ignore Standards After Facial Recognition Matches

A Washington Post investigation into police use of facial recognition software found that law enforcement agencies across the nation are using the artificial intelligence tools in a way they were never intended to be used: as a shortcut to finding and arresting suspects without other evidence.


Arrested by AI: Police ignore standards after facial recognition matches. Confident in unproven facial recognition technology, sometimes investigators skip steps; at least eight Americans have been wrongfully arrested. www.washingtonpost.com/business/int ...


-- David Rosenthal (@davidros.bsky.social) January 13, 2025 at 5:27 PM

Comments

More: After two men brutally assaulted a security guard on a desolate train platform on the outskirts of St. Louis, county transit police detective Matthew Shute struggled to identify the culprits. He studied grainy surveillance videos, canvassed homeless shelters and repeatedly called the victim of the attack, who said he remembered almost nothing because of a brain injury from the beating.

Months later, Shute tried one more option.

Shute uploaded a still image from the blurry video of the incident to a facial recognition program, which uses artificial intelligence to scour the mug shots of hundreds of thousands of people arrested in the St. Louis area. Despite the poor quality of the image, the software spat out the names and photos of several people deemed to resemble one of the attackers, whose face was hooded by a winter coat and partially obscured by a surgical mask.
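The candidate-ranking step described above is, in most facial recognition systems, a nearest-neighbor search over face embeddings. The following is an illustrative Python sketch of that idea only; it is not the software used in St. Louis, and the function name `rank_candidates`, its parameters, and the embeddings are all hypothetical.

```python
import numpy as np

def rank_candidates(probe, gallery, top_k=5):
    """Rank gallery mug-shot embeddings by cosine similarity to a probe.

    probe:   1-D face embedding computed from the surveillance still
    gallery: dict mapping a person ID to that person's mug-shot embedding
    Returns the top_k (id, similarity) pairs, most similar first.
    """
    ids = list(gallery.keys())
    mat = np.stack([gallery[i] for i in ids])
    # Normalize rows and the probe so the dot product is cosine similarity.
    mat = mat / np.linalg.norm(mat, axis=1, keepdims=True)
    p = probe / np.linalg.norm(probe)
    sims = mat @ p
    # Highest-similarity candidates first.
    order = np.argsort(sims)[::-1][:top_k]
    return [(ids[i], float(sims[i])) for i in order]
```

Note the property at the heart of the article: the search always returns its k closest faces, whether or not the actual culprit is in the gallery at all. The output is a ranked set of lookalikes, which is why policies treat it as an investigative lead rather than evidence.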

Though the city's facial recognition policy warns officers that the results of the technology are "nonscientific" and "should not be used as the sole basis for any decision," Shute proceeded to build a case against one of the AI-generated results: Christopher Gatlin, a 29-year-old father of four who had no apparent ties to the crime scene nor a history of violent offenses, as Shute would later acknowledge.

Arrested and jailed for a crime he says he didn't commit, Gatlin would need more than two years to clear his name.

Gatlin is one of at least eight people wrongfully arrested in the United States after being identified through facial recognition. Six cases were previously reported in media outlets. Two wrongful arrests (those of Gatlin and Jason Vernau, a Miami resident) have not been previously reported.

All of the cases were eventually dismissed. Police probably could have eliminated most of the people as suspects before their arrest through basic police work, such as checking alibis, comparing tattoos, or, in one case, following DNA and fingerprint evidence left at the scene.

#1 | Posted by qcp at 2025-01-14 08:51 AM

Of course they do. If you're not held accountable you can do this.

#2 | Posted by fresno500 at 2025-01-14 04:33 PM

We have fingerprints for a reason.

The whole thing started because two prison inmates looked identical and the COs needed to be sure who was who.

Need a pop culture reference?

Ok, Larry David and Bernie Sanders.

I doubt AI could tell them apart if they were in motion.

#3 | Posted by Tor at 2025-01-15 12:05 PM

"In all the known cases of wrongful arrest because of facial recognition, police arrested someone without independently connecting the person to the crime."

Eyewitness identification is notoriously unreliable. Picking a suspect out of a show-up or lineup is NEVER enough for an arrest, and use of AI just makes it more likely to happen. The photo lineup was also done in a manner that led the witness (see video), which is never permissible. There is no excuse for this kind of shoddy investigative work. There is also no excuse for the judge who signed the arrest warrant based on such weak evidence, or the prosecutor who filed the charges. Lots of heads should roll on this one.

www.msn.com

#4 | Posted by Miranda7 at 2025-01-15 12:11 PM

"We have fingerprints for a reason"

To aid in gripping wet surfaces?

#5 | Posted by truthhurts at 2025-01-15 12:15 PM

I'd sooner trust a prop that tests food toxicity.

Stick it in mashed potatoes or Vance's couch cushions - "Safe."

Is there a correlation between using facial recognition and the ability to invent or modify data? Like bodycam footage? And r--e kit evidence.

#6 | Posted by redlightrobot at 2025-01-15 01:51 PM

All of this can be avoided by not being poor and not being the wrong color.

Choose wisely.

#7 | Posted by Angrydad at 2025-01-15 02:26 PM
