The Scariest Thing About DeepNude Wasn’t the Software
Though the creator pulled the app, a deeply sexist culture persists
At the end of June, Motherboard reported on a new app called DeepNude, which promised, "with a single click," to transform a clothed photo of any woman into a convincing nude image using machine learning. In the weeks since that report, the app has been pulled by its creator and removed from GitHub, though open-source copies have surfaced there in recent days.
Most of the coverage of DeepNude has focused on the specific dangers posed by its technical advances. "DeepNude is an evolution of that technology that is easier to use and faster to create than deepfakes," wrote Samantha Cole in Motherboard's initial report on the app. "DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women's bodies." With its promise of single-click undressing, DeepNude made it easier than ever to manufacture fake nudes of any woman and, by extension, to use those images to harass, extort, and publicly shame women everywhere.
But even after the app's removal, there's a lingering problem with DeepNude, one that goes beyond its technical advances and ease of use. It's something older and deeper, something far more…