The One Rule of Content Moderation That Every Platform Follows

For YouTube, Facebook and the rest, if a decision becomes too controversial, change it

Will Oremus
OneZero



Facing pressure to ban a hatemonger, a tech company like Facebook, Google, or Twitter initially demurs, saying that while some may find his speech (or her speech, but usually his) objectionable, it doesn’t violate the platform’s rules. After a torrent of outrage, the tech company changes its mind and takes some form of action. Activists claim victory, conservatives cry censorship, and eventually, the controversy dies down — until the next time.

It’s a cycle that we witnessed last year with Facebook and the conspiracy theorist Alex Jones, the host of Infowars. The social network first declined to take action against Jones’ pages, saying it would be “contrary to the basic principles of free speech.” After three weeks of public pressure, Facebook changed its stance, saying, “We believe in giving people a voice, but we also want everyone using Facebook to feel safe.”

It played out again last week with Google and Steven Crowder. Crowder is a right-wing comedian who had used his YouTube channel to lob homophobic taunts at the Vox writer Carlos Maza. On June 4, Google-owned YouTube indicated that it had carefully reviewed Maza’s complaints and found…
