I/O

The Problem With Instagram’s Plan to Reduce ‘Sexually Suggestive’ Content

Earlier this month, Instagram debuted a new content moderation policy focused on “reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines” — or, to put it more bluntly, making it harder for users to promote and find content deemed by the platform to be violent, graphic, or sexually suggestive.

While the new policy may help reduce the spread of disturbing content within the app, it also has some users worried that they’ll find their posts, and potentially even their entire accounts, hidden away due to an overly broad application of the nebulous label “sexually suggestive.” While the policy won’t scrub content from the platform altogether, it will prevent certain imagery from surfacing in the app’s popular “Explore” tab, once called the “realest place on the web.”

Will Ruben, Instagram’s product lead overseeing Discovery, told TechCrunch that these new guidelines will be implemented by machine learning algorithms. Human content moderators will reportedly train the system to identify posts that push the boundaries of good taste.
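Instagram hasn’t said how that training pipeline actually works, but a minimal sketch helps make the idea concrete. In the hypothetical Python below, everything is an assumption for illustration: the embeddings stand in for whatever image features the system extracts, the synthetic labels stand in for moderator judgments, and the eligible_for_explore helper and its threshold are invented.

```python
# A minimal, hypothetical sketch of the training loop described above
# (not Instagram's actual system). The embeddings, labels, and threshold
# are all invented stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for image embeddings from some vision model: 1,000 posts, 128 dims.
embeddings = rng.normal(size=(1000, 128))

# Stand-in for moderator judgments: 1 = "borderline / sexually suggestive."
moderator_labels = rng.integers(0, 2, size=1000)

# The model learns whatever notion of "suggestive" the labels happen to encode.
clf = LogisticRegression(max_iter=1000).fit(embeddings, moderator_labels)

def eligible_for_explore(post_embedding, threshold=0.5):
    """Posts scoring above the threshold stay on the platform but are
    excluded from Explore, per the policy described above."""
    score = clf.predict_proba(post_embedding.reshape(1, -1))[0, 1]
    return score < threshold

print(eligible_for_explore(rng.normal(size=128)))
```

Even in this toy form, the classifier can only ever be as coherent as the labels it is trained on, which is exactly where experts see trouble.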

Experts reached by OneZero voiced a number of concerns about the update, which seems destined to disproportionately affect women, especially body-positive influencers, women of color, and trans activists.

Suresh Venkatasubramanian, a professor at the University of Utah’s School of Computing, says there are multiple ways this sort of system could go wrong. Establishing consistent labels that an algorithm can learn and enforce is already a tricky task; it becomes far trickier when the label is something as vague and contested as “sexually suggestive.” Different moderators may come to the project with wildly different ideas of what, exactly, “sexually suggestive” means, creating far more confusion than clarity.

“Is [the question] merely, ‘Is it suggestive or not?’ Is there more nuance than this?” Venkatasubramanian asks.

Venkatasubramanian points out that something like the MPAA’s film rating system offers context for why a film might receive a PG-13 versus an R. Absent such explicit guidance, the algorithm will likely be informed by an aggregate of potentially conflicting standards set by the moderation team, producing moderation that either casts an overly wide net or applies its standard inconsistently. We’ve already seen this with content moderation algorithms that flagged photos of Burt Reynolds’ famous Cosmopolitan centerfold as pornography or failed to recognize live videos of the New Zealand mosque shooting as violent content.
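To see how quickly that inconsistency compounds, consider a small hypothetical: two moderators label the same five posts, and we score their agreement with Cohen’s kappa, which discounts the agreement two labelers would reach by pure chance. The posts and labels below are invented.

```python
# Hypothetical example: two moderators label the same five posts.
# 1 = flagged as "sexually suggestive," 0 = left alone.
from sklearn.metrics import cohen_kappa_score

moderator_a = [1, 0, 0, 1, 1]
moderator_b = [1, 1, 0, 0, 1]

raw_agreement = sum(a == b for a, b in zip(moderator_a, moderator_b)) / len(moderator_a)
kappa = cohen_kappa_score(moderator_a, moderator_b)

print(f"raw agreement: {raw_agreement:.0%}")  # 60%
print(f"Cohen's kappa: {kappa:.2f}")          # ~0.17, barely better than chance
```

Here a raw agreement of 60 percent, which might sound workable, collapses to a kappa of roughly 0.17, barely better than flipping coins. A model trained on pooled labels like these inherits the noise.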

There are already some signs that Instagram’s moderation algorithm is going to be shaped by a biased idea of indecency—one that sets a far lower bar for women than for men. TechCrunch obtained sample photos illustrating the moderation effort, including two images that could be considered sexually suggestive. One featured a fully clothed, headless man grabbing at his crotch; the other, a woman in lingerie seated on a bed. The implication: men are deemed sexually suggestive by virtue of their actions, while women gain that status simply by having a body that someone else deems sexually attractive.

Instagram declined to offer OneZero more information about how its algorithms would be trained. A spokesperson says, “We’re not sharing that level of information at the moment, because we don’t want people to game the system. We want to be transparent with the community about these guidelines, and we’re thinking through the best way to do that.”

It’s not just erotically themed accounts that stand to be affected by this policy. Anna Lauren Hoffmann, an assistant professor with the Information School at the University of Washington, explains that there’s “a legacy of conflating women’s bodies… with deviance, or lewdness, or distraction, or, directly, inappropriateness,” one that Instagram’s content guidelines appear to be drawing from. Notably, it’s a legacy that does not affect all women equally. As Hoffmann reminds me, our cultural ideas of appropriate appearance and behavior tend to “have particular consequences for particular kinds of women,” such as women of color, disabled women, trans women, fat women, and other women whose bodies are frequently labeled “inappropriate” merely for having the audacity to exist.

Laura Delarato, a body-image activist and creator of the newsletter 1-800-HEYLAURA, which offers readers a space to explore their feelings about sex and their bodies, uses Instagram to provide body-positive inspiration to her thousands of fans. She has found that, as a fat woman, she’s subjected to stricter scrutiny than her skinny or male counterparts.

“Men can just walk seamlessly into this Instagram world without bias, because their bodies are already deemed as the norm,” Delarato says, noting that someone like plus-size male model Zach Miko can pose half-naked without being deemed sexually suggestive, in part due to Instagram’s long-controversial stance that male nipples are more socially appropriate than their female counterparts. In contrast, women — and particularly marginalized women — are presumed to exist primarily for male consumption, their images sexualized simply by virtue of their existence.

“We have this internal understanding that women are ornaments,” Delarato says. The idea of women exposing skin for reasons beyond titillation is unfathomable to many people, even as women regularly post scantily clad photos as a way to put out a message of body positivity, normalize being a cancer survivor, document their fitness achievements, or even celebrate a bond with an infant. And Instagram — a company that, Hoffmann reminds me, is ultimately overseen by a man who launched his tech career with a website devoted to ranking the appearances of Harvard women — seems ill-prepared to incorporate such a nuanced view of women, and women’s bodies, into its new moderation policy.

Social media sites like Instagram launched with the promise of giving a platform to everyone, from every background, enabling individuals to express themselves and marginalized voices to be heard. And yet, with its new content policy, Instagram seems set to strip nuance, intention, and cultural context from its users’ posts, subjecting everyone to a single, biased definition of “sexually suggestive.” It’s a strategy that dehumanizes users, robbing them of their ability to assign their own meaning and message to the content they post — and one that is likely to cause the most harm to some of the platform’s already marginalized users.

OneZero columnist, Peabody-nominated producer, and the author of Faking It: The Lies Women Tell About Sex — And the Truths They Reveal. http://luxalptraum.com
