Pattern Matching

How Facebook Can Prevent Its Next Deadly ‘Operational Mistake’

If Zuckerberg cares at all about the platform’s impact, he should stop outsourcing content moderation now

Will Oremus
Published in OneZero · 9 min read · Sep 5, 2020
Photo: Graeme Jennings/Getty Images

Welcome back to Pattern Matching, OneZero’s weekly newsletter that puts the week’s most compelling tech stories in context.

On the morning of August 25, a self-proclaimed militia group on Facebook called Kenosha Guard put out a public call for people to “take up arms” and defend the city of Kenosha, Wisconsin, from “evil thugs” — that is, people protesting the police shooting of Jacob Blake, an unarmed Black man, two days earlier. Responses rolled in throughout the day, including ones like, “I fully plan to kill looters and rioters tonight,” according to an investigation published Thursday by BuzzFeed News.

Horrified Facebook users flagged the event to the company at least 455 times, BuzzFeed reported — and Facebook moderators replied that it didn’t violate the platform’s rules. They left it up, and that evening, two protesters were shot and killed after clashes with militia-style groups of armed young men. (It isn’t yet clear whether the gunman was inspired by the Facebook group specifically or heard about the event through some other channel.)

Facebook CEO Mark Zuckerberg called the moderation decision an “operational mistake” made by contractors. The Kenosha Guard page and event did violate Facebook’s rules, he said, and should have been taken down.

The Pattern

There’s no longer any excuse for outsourcing content moderation.

  • Zuckerberg is right that there’s a serious problem with Facebook’s content moderation processes. But it isn’t a mistake. It’s a policy decision Facebook made long ago, at the highest levels: to outsource the vast, traumatic, nigh-impossible task of moderating a platform used by some 2.7 billion people around the world. By delegating the critical work of content moderation to third-party contractors, Facebook has abdicated its responsibility to both its users and society at large.
  • Content moderation is hard. It’s hard philosophically to make fair and consistent judgments about…