‘The Inevitable Life Cycle’ of Platforms Like Facebook and Pornhub
‘If you’re going to have users generating content, you’re going to have users generating harmful content’
--
Last week, Evelyn Douek, a Harvard lecturer who studies online speech, spoke to OneZero’s Will Oremus about a rising tide of moderation controversies across platforms as diverse as Peloton, Pornhub, Facebook, and YouTube.
Peloton has hosted QAnon hashtags; Pornhub has a deeply troubling child pornography issue; and there’s not space enough in this post to enumerate the issues with those other two.
“If you’re going to have users generating content, you’re going to have users generating harmful content,” Douek said.
The responses to these problems have become predictable and, in some ways, ineffectual, she said. Oremus paraphrased a four-point theory Douek has developed to describe the “inevitable life cycle of a user-generated content platform”:
- Someone, often an activist or journalist, finds abhorrent content on Platform X.
- It becomes a scandal, and pressure builds as more examples are uncovered.
- The platform eventually takes action, but does so in a haphazard and reactive way, geared toward solving the PR problem rather than the underlying issues.
- Sooner or later, its inconsistent approach to moderation sparks a backlash of its own.
But the predictability of content moderation problems doesn’t mean they can be ignored, of course. Douek suggests a couple of approaches for a healthier online ecosystem: greater transparency from platforms, and longer-term policymaking that platforms can enforce consistently rather than reactively.
For more on this, read the latest edition of Pattern Matching, OneZero’s free weekly newsletter about the most compelling stories in tech, by following the link below: