How Content Moderation Turned Into a Game of Hot Potato

App stores and cloud hosting platforms want the right to ban content without the responsibility of moderation

Eric Ravenscraft
OneZero


Photo: Smith Collection/Gado/Getty Images

When app stores and cloud hosting platforms banned Parler earlier this month after the self-described “free speech” social network failed to moderate calls for violence, they set a new precedent. Previously, the conventional wisdom was that developers bore the responsibility of policing an app’s community. After all, the developer is in the best position to know what its users need, what they’re up to, and how to build the specific moderation tools that work best for its community.

But with the Parler bans, the companies that hosted the app (its app store distributors, its web host, and its domain registrar) asserted a say over whether the community within the app was being moderated effectively.

While major social networks like YouTube and Facebook have at least attempted to spell out exactly what content is allowed, these host companies do almost no moderation of their own and publish few standards for the apps they carry, even as they reserve the right to ban their users.

Apple gave Parler an ultimatum: Come back with a plan to moderate your community or be removed from the App Store. This policy might explain why Facebook, which has also been used to…


Eric Ravenscraft is a freelance writer from Atlanta covering tech, media, and geek culture for Medium, The New York Times, and more.