Facebook’s Technocratic Reports Hide Its Failures on Abuse

These reports obscure a torrent of hate speech and other toxic content

Chris Gilliard
OneZero


Facebook’s moderation reports mask urgent problems with language written by and for the tech elite. Photo illustration; image sources: Chip Somodevilla, William Whitehurst/Getty Images

If you’ve never read a Facebook “Community Standards Enforcement Report,” you probably haven’t witnessed the technocratic language used to obscure the harm done by the platform. Bluntly, it’s appalling.

The report offers statistics showing how much more effective the company is getting at removing material that violates its community standards. The most recent edition, covering April through June, was published earlier this month. These numbers are deceptive because Facebook’s team often speaks of solving a problem it directly created. It would be as if a meatpacker were intentionally poisoning its meat while also releasing stats about how much safer its meat was getting.

Facebook hosts, recommends, and amplifies hate as part of its business model while also touting its success at taking that hate down. One need only look at the “Kenosha Guard” militia group that may have spurred the double murder of protesters earlier this week to see the impact of this business model in real life. The technocratic language Facebook uses to describe its takedown numbers serves to hide the fact that behind such metrics are very real human beings who are targeted and traumatized by material that only…


Dr. Chris Gilliard is a writer, professor, and speaker. His scholarship concentrates on digital privacy and the intersections of race, class, and technology.