Pattern Matching
What Pornhub and Peloton Have in Common With Facebook
An online speech expert explains why no online platform will be spared from content-moderation controversy

For years, a battle of ideas has raged over the limits of online speech, focused largely on Facebook and Twitter, and to a lesser extent YouTube. Innately resistant to the messy and expensive work of policing users’ speech, those vast platforms have grudgingly enlarged their moderation workforces, expanded their content policies, and toughened their enforcement in response to media backlashes, congressional hearings, regulatory threats, advertiser boycotts, and revolts from their own employees. Professional racists such as Milo Yiannopoulos, conspiracy theorists such as Alex Jones, and even grassroots movements such as QAnon have all been booted from major platforms for policy violations, though often only after significant backlash. Public officials such as Donald Trump now find their posts fact-checked or hidden. Misinformation about Covid-19 and voting is being throttled or taken down.
These battles are far from over, and may never be. But for all the attention paid to a few high-profile platforms, countless other online forums have largely avoided scrutiny of their content policies — so far.
That will change, says Evelyn Douek, a law lecturer at Harvard who researches online speech. And it will change in a hurry, she believes. She has developed a working theory of online speech that explains why, and that offers some clues about what platforms should be doing about harmful expression — ideally before they find themselves in the headlines for the wrong reasons.
The Pattern
Content moderation controversies aren’t just for social networks anymore.
- On Friday, New York Times columnist Nick Kristof published an exposé about Pornhub, the massive pornography site that has sought to position itself as a more or less respectable internet giant with a mischievous streak. His reporting makes the case that Pornhub systematically profits from heinous sexual crimes by hosting and monetizing user-posted videos without sufficiently careful oversight. From the story: