New Coalition Calls to End ‘Racist’ A.I. Research Claiming to Match Faces to Criminal Behavior

‘Criminality cannot be predicted. Full stop.’

Dave Gershgorn
Published in OneZero
Jun 23, 2020


Photo: NurPhoto/Getty Images

More than 500 experts on artificial intelligence, technology, and sociology have signed a letter addressed to a major academic publisher asking to halt the publication of any research that uses machine learning to predict whether someone might commit a crime.

The letter was written in response to a May 5 press release from Harrisburg University in Pennsylvania, which stated that two professors and a PhD student had created software that could predict whether someone was likely to commit a crime based on nothing more than a picture of their face. The publisher, Springer Nature, was slated to include that paper in an upcoming book series.

Harrisburg’s press release claimed that the new software is 80% accurate and completely unbiased, meaning it supposedly contained no statistical predisposition to predict someone was more likely to be a criminal based on their race or gender. It also pitched the software as a tool for law enforcement.

The university, which did not immediately respond to a request for comment, took down its press release on May 6 and said it would post an update after the research was published. Springer Nature told organizers on Monday that the paper was eventually rejected during the review process.

But the experts looking to stop this research, who have named their group the Coalition for Critical Technology, say the paper’s goal to predict criminality was only the latest in a series of similarly unscientific efforts that continue to be laundered through mainstream academia. While the science behind these studies has been debunked, papers attempting to draw a correlation between someone’s face and their behavior continue to be published.

The coalition’s broader aim is to send a message to publishers that the very idea of using machine learning to predict criminality is not scientifically sound and should not be peer reviewed or published in the future.

“Black women and Black scholars and practitioners have led the critique on this work over the last 50 years,” Theodora Dryer, a research scientist at NYU who helped organize…

Dave Gershgorn is a Senior Writer at OneZero covering surveillance, facial recognition, DIY tech, and artificial intelligence. Previously: Quartz, Popular Science, and The New York Times.