GENERAL INTELLIGENCE

Men Wear Suits, Women Wear Bikinis: Image Generating Algorithms Learn Biases ‘Automatically’

The algorithms also picked up on racial biases linking Black people to weapons

Dave Gershgorn
Published in OneZero · 4 min read · Jan 29, 2021
Photo illustration; Image source: 3DSculptor/Getty Images

OneZero’s General Intelligence is a roundup of the most important artificial intelligence and facial recognition news of the week.

Bias in artificial intelligence is notoriously problematic. Facial recognition algorithms have been found to misidentify people with darker skin more frequently; other algorithms have flagged poor people as high-risk and less eligible for public services, and have recommended resources to healthier white people before offering the same resources to sicker Black people.

Now, new research from Carnegie Mellon and George Washington University shows that popular computer vision algorithms “automatically” learn similar racial, gender, and intersectional biases when trained on a widely used image dataset. The algorithms studied, OpenAI’s iGPT and Google’s SimCLRv2, are general purpose and can be adapted to nearly any task, from image generation to object recognition. The dataset, ImageNet, is arguably the most influential in computer vision and kicked off the A.I. boom we’re currently experiencing.

To see how the algorithms associated male and female faces with body types and clothing, the researchers gave a model the image of a person’s head and asked it to generate a low-resolution image of the rest of the body. The process works a lot like the auto-complete function on your phone, just with images.
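(For the curious, here’s a minimal sketch of the autoregressive “image auto-complete” idea behind iGPT, which quantizes colors into a small palette and predicts an image one pixel at a time. The next_pixel_probs function below is a hypothetical stand-in for a trained model, not OpenAI’s actual code; it exists only so the sketch runs end to end.)

```python
import numpy as np

PALETTE_SIZE = 512   # iGPT quantizes colors into a small fixed palette
IMAGE_SIDE = 32      # completions are low resolution, e.g. 32x32 pixels

rng = np.random.default_rng(0)

def next_pixel_probs(context: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained autoregressive model.
    A real model would condition on `context` (the pixels so far);
    here we return an arbitrary distribution so the sketch runs."""
    logits = rng.normal(size=PALETTE_SIZE)
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

def complete_image(head_pixels: np.ndarray) -> np.ndarray:
    """Given the top rows of an image (the face), sample the remaining
    pixels one at a time, in raster order, from the model."""
    pixels = list(head_pixels)
    total = IMAGE_SIDE * IMAGE_SIDE
    while len(pixels) < total:
        probs = next_pixel_probs(np.array(pixels))
        pixels.append(rng.choice(PALETTE_SIZE, p=probs))
    return np.array(pixels).reshape(IMAGE_SIDE, IMAGE_SIDE)

# Feed in the top half (the "head") and let the model fill in the body.
head = rng.integers(0, PALETTE_SIZE, size=IMAGE_SIDE * (IMAGE_SIDE // 2))
completed = complete_image(head)
print(completed.shape)  # (32, 32)
```

Any biases in the training data surface at that sampling step: if the model has learned that certain faces tend to co-occur with certain clothing, those pixels become more probable completions.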

The original paper featured an example with New York Rep. Alexandria Ocasio-Cortez. Drawing on what it had learned about women from the ImageNet dataset it was trained on, the algorithm paired her face with the visual traits it most strongly associated with women. This resulted in images of the congresswoman in bikinis and revealing shirts.

The result fell in line with the paper’s other tests, according to Ryan Steed, a Carnegie Mellon PhD student and co-author of the paper. About 40% of the time, male faces were completed with suits or business attire. More than 50% of the time, women were completed with more sexualized images of bikinis or…


Dave Gershgorn is a senior writer at OneZero covering surveillance, facial recognition, DIY tech, and artificial intelligence. Previously: Qz, PopSci, and NYTimes.