In the mid-1990s, when the internet was still relatively new, Congress passed a 26-word provision as part of the Communications Decency Act that has shaped social media and online platforms ever since.
The provision, known as Section 230, says that internet companies are not liable for their users’ content. It means that Twitter can’t be sued for users’ defamatory tweets, you can’t be sued for the comments that someone else leaves on your blog, and whatever people post to Facebook is on them, not Facebook.
For decades, this premise has been a fundamental component of how the internet is run. But in recent years, Section 230 has been under fire from those who believe it protects tech companies at the expense of their users.
One of the most common arguments for amending Section 230 is that it enables revenge porn. The staff of C.A. Goldberg, a victims’ rights law firm known for its aggressive work on behalf of victims of revenge porn, called the prevailing interpretation of the provision “so expansive, it protects websites and social media companies that allow users to upload your naked pictures without permission,” a protection which, the firm argues, encourages tech platforms to do nothing in response to revenge porn.
C.A. Goldberg isn’t the only law firm to take this stance: in a series of lawsuits in California, Georgia, Missouri, and Texas, personal injury lawyer Annie McAdams has been challenging the notion that platforms like Facebook shouldn’t be held accountable for the abuses committed by their users. In a piece about her crusade published in the New York Times, McAdams argues that the protections afforded by Section 230 are akin to making it impossible to sue a lawnmower retailer whose product causes bodily harm. While neither C.A. Goldberg nor McAdams expects to wholly overturn Section 230, both hope the government will rewrite it — or that the courts will interpret it in a way that feels more friendly to abuse victims.
Yet focusing solely on Section 230’s role in enabling abuse on the internet misses other ways it affects women and other marginalized groups. Despite the provision’s role in making it harder to hold platforms accountable for revenge porn, it also allows companies to moderate the content on their platforms without fear of legal reprisal should they fall short — which they inevitably do. Some fear that repealing or amending it might actually make things worse for women, queer people, and other marginalized folks who rely on the internet for promotion, organization, and community.
“People frequently think about Section 230 solely in terms of how it protects platforms from liability,” says Corynne McSherry, legal director for the Electronic Frontier Foundation (EFF). But, she notes, Section 230 also creates a space for platforms to attempt to take a nuanced stance towards moderation. Prior to Section 230, a platform that attempted to moderate and create a healthy community could face legal liability if they failed in that mission in a way that they wouldn’t if they’d refused to do any moderation at all. “Without Section 230 protections,” McSherry explains, “you’d have two incentives: you’d have an incentive for platforms to do nothing… or, alternatively, to over censor” — a strategy which, more often than not, results in the censorship of the marginalized, rather than the powerful. As an EFF representative noted in the New York Times, policies as seemingly mundane as the requirement that users register their accounts under their legal names are often unevenly enforced, with indigenous and non-Western names more likely to be flagged as fake — even when they are real. And no matter how strict social media sites might get about hate speech and harassment, no one seems willing to enforce those rules when it comes to President Donald Trump’s Twitter account.
It’s not difficult for the pro-Section 230 crowd to find an example that backs up their argument. In April 2018, Congress passed SESTA-FOSTA, an amendment to Section 230 that allows platforms to be prosecuted when sex trafficking is discovered on their services. In theory, SESTA-FOSTA is a narrow law that only impacts platforms that are profiting from exploitation and abuse; its proponents promoted it as a way to take on sex traffickers and the platforms that enable their abuses.
In practice, the law hasn’t had that promised impact on trafficking: instead, it’s led to broad, sweeping content restrictions that have eliminated online safety resources for sex workers and, many sex workers rights advocates argue, increased, rather than decreased, the risk of trafficking by pushing sex workers further underground and making them more vulnerable. Citing reports of grave harm, Democratic Representative Ro Khanna of California recently introduced a bill calling for further study of the impact and effects of the law.
McSherry fears that a similar carve-out aimed at revenge porn could also have unintended effects. Rather than encouraging platforms like Twitter and Facebook to take responsibility for the harms caused by their users, a modification of Section 230 might instead lead to more crackdowns on sexual content in general. Given tech companies’ long history of aversion to nuanced content moderation, it’s not hard to imagine that, faced with the threat of litigation, they’d be less likely to create a nuanced content moderation system that respects the rights of abuse victims while allowing consensual content to be freely posted, and more likely to enact a wholesale ban of anything that could possibly result in accusations of revenge porn. That could include completely legal sexual content that’s been consensually shared by its subject, as well as educational content and content intended to celebrate and uplift marginalized sexual communities.
And if Section 230 were not merely amended further, but revoked entirely, the consequences could be dire — and it’s the people attempting to call out sexist abuses, rather than those committing them, who could suffer the most harm. For the past year, EFF has been closely monitoring a defamation lawsuit against Moira Donegan, the creator of the Shitty Media Men List, an anonymously crowdsourced list of abusive men in the media industry. Although the outcome of Donegan’s case has yet to be decided, it’s reasonable to argue that it is Section 230 that protects Donegan herself from being punished for the statements of her anonymous contributors — and without Section 230 in place, other people might be less willing, or able, to create similar resources that allow victims to safely come forward about their experiences of abuse.
There’s no question that the internet as we know it is rife with abuse, and that something needs to be done to upend the status quo and stem the spread of harassment and nonconsensually shared content. But removing a keystone like Section 230 cannot be done without serious consequences — and unfortunately, it’s far more likely that the vulnerable, marginalized people currently in need of protection are the ones who will suffer, rather than benefit, if the provision is eliminated.
“As a feminist, I think it’s extremely important for [marginalized people] to be able to find fellow travelers and form community online,” says McSherry. “If we get rid of Section 230, we’re going to be severely impeding our ability to form all of those kinds of communities and make connections and access connection. And I think that would be a terrible, terrible cost.”