The Scariest Thing About DeepNude Wasn’t the Software

Though the creator pulled the app, a deeply sexist culture persists

Lux Alptraum
Published in OneZero · Jul 16, 2019 · 5 min read


A screenshot taken on June 28, 2019 of the Twitter account and logo of the DeepNudeApp — an application that allegedly used A.I. to undress women. The creator of the app shut it down after a social media uproar over its impact. Credit: AFP/Getty

At the end of June, Motherboard reported on a new app called DeepNude, which promised — “with a single click” — to transform a clothed photo of any woman into a convincing nude image using machine learning. In the weeks since, the app has been pulled by its creator and removed from GitHub, though open-source copies have surfaced there in recent days.

Most of the coverage of DeepNude has focused on the specific dangers posed by its technical advances. “DeepNude is an evolution of that technology that is easier to use and faster to create than deepfakes,” wrote Samantha Cole in Motherboard’s initial report on the app. “DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies.” With its promise of single-click undressing of any woman, the app made it easier than ever to manufacture naked photos — and, by extension, to use those fake nudes to harass, extort, and publicly shame women everywhere.

But even following the app’s removal, there’s a lingering problem with DeepNude that goes beyond its technical advances and ease of use. It’s something older and deeper, something far more…

Lux Alptraum

OneZero columnist, Peabody-nominated producer, and the author of Faking It: The Lies Women Tell About Sex — And the Truths They Reveal. http://luxalptraum.com