The Facebook Oversight Board Is Making the Most of Its Limited Power

Whether you call it a Facebook PR scheme or the Supreme Court for social media, the Facebook Oversight Board is moving beyond its initial remit. Now things get interesting.

The hot takes have poured in following the Facebook Oversight Board’s decision on Donald Trump. On Wednesday, the “independent” board — made up of third parties selected by Facebook — announced it would uphold Facebook’s ban of the former president while asking Facebook to come up with something less arbitrary than an “indefinite” suspension. People called the board a threat to democracy, a Facebook branding campaign, an insufficient check on Facebook’s power, and something more powerful than the United Nations.

In reality, the board is a feeble institution funded and designed by Facebook — not a boogeyman upon which we should project all our fears — but it’s starting to assert itself in some very interesting ways. Facebook originally tasked the board with reviewing content moderation decisions — the least consequential of all choices Facebook makes — but it’s now moving into content policy and pressing Facebook on product design choices, which matter much more.

The board’s Trump decision captures its growing determination to expand beyond its remit. On January 7, 2021, Facebook suspended Trump indefinitely, claiming his posts condoned the Capitol riot and could lead to more violence. Facebook then referred the decision to the board, asking it to review the ban. Instead of rubber-stamping this decision or overturning it, the board came back to Facebook and essentially said, “We can’t find an indefinite suspension anywhere in your rule book, so make some rules and come back to us.” In doing so, it pointed out that some of Facebook’s most important content moderation decisions are made on the whims of its leadership, and it told the company to write better policy to prevent that from happening so often.

“In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities,” the board wrote. “The board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.”

The board wasn’t simply turning the Trump decision back to Zuckerberg, as many pundits declared. It was going beyond its capacity as a content moderation review board and demanding Facebook write new policy, moving it closer to the company’s fundamentals. “The board,” Stanford law professor Nate Persily said, “has expanded the range of powers it possesses.”

There are three layers of decisions that shape social media companies’ fate. The least consequential, yet most talked about, are content moderation decisions. These choices are one-off, don’t change the underlying nature of the products that produced them, and ultimately matter little in the long run. (I’ve described these as the “Outputs” vs. “the Machine.”) More consequential are content policy decisions. These choices can alter how a platform operates, producing lasting change. Content policies are opaque, platform-generated, and poor targets for public outrage, so they’re rarely discussed. Most consequential of all are product design decisions: the choices about how the platform itself is built, which shape everything the other two layers must respond to.

The Facebook Oversight Board, however, saw an opportunity to take aim at content policy. It understood that Facebook is invested in its long-term viability, and it bet that the company would prefer to address its policy concerns over delegitimizing the board entirely. So it played chicken with Facebook. If Facebook refused to go along with the board’s demand to make indefinite suspensions less arbitrary, it would risk undermining the whole project.

“Regular users,” Julie Owono, a Facebook Oversight Board member, told me, “should expect transparency, they should expect clarity in the rules, they should expect that the rules are applied the same way to everyone and not in an arbitrary way.”

The board then took it a step further. It decided to push Facebook on its most consequential decisions: how it builds its products. In its report, the board noted that it asked Facebook how its design decisions may have contributed to the January 6 riot. Facebook did not answer the question. It’s not obligated to answer. But the fact that the board is asking these questions from within — using the legitimacy that Facebook itself bestowed upon it — means something. The board, only a few months into its young life, is clearly not interested in playing within the parameters Facebook set up for it.

For all the talk of how Facebook is a victim of groupthink, filled with techno-optimists, and unable to see beyond its own version of reality, there is a world in which this board cuts through the self-mythology and injects the company with a set of critical voices. “They need the guidance, and we’re here to provide it,” Owono said. These are the voices Facebook so desperately needs.

Further reading:

“Facebook Oversight Board Member Julie Owono Takes Us Inside the Trump Decision” (Big Technology Podcast)

“Facebook’s Made-Up Court Is Better Than No Court at All” (The Atlantic)

“Should We Trust the Facebook Oversight Board?” (Big Technology)

News briefs:

“Desperation and Dead Ends on Social Media in India” (Bloomberg)

With medical supplies scarce in India, people are turning to social media platforms to secure oxygen and other critical treatments for family members. In this devastating piece, Bloomberg’s Saritha Rai describes what it’s like on the ground, noting not only how the platforms are failing, but how people are desperately trying to make them work better. “Volunteer groups and well-meaning citizens have taken a shot at bringing order to the social media maelstrom, so far with zero success,” she writes. “But perhaps Big Tech’s algorithms could come to the rescue by scrubbing bad information and surfacing verified leads.”

“The Untold Story of How Jeff Bezos Beat the Tabloids” (Bloomberg)

Brad Stone has a new book coming out next week about Amazon, and he’ll be joining Big Technology Podcast to discuss it next Wednesday. I’ve devoured the book ahead of the interview, and highly recommend this excerpt. (Yes, it’s a Bloomberg twofer in News Briefs this week.) This portion of the book covers how the sultry messages between Bezos and his girlfriend, Lauren Sanchez, made their way to the National Enquirer, and how Bezos masterfully played the ensuing media cycle. I was particularly intrigued by how the narrative that Saudi Arabia was behind the leak — something Bezos himself plays up — falls apart with a bit of scrutiny.

This week on ‘Big Technology Podcast’: Why ex-Google ads boss Sridhar Ramaswamy is building an ads-free search engine

Sridhar Ramaswamy is CEO of Neeva, an ads-free search engine he helped found after running Google’s ads and commerce business. Ramaswamy spent 17 years inside Google and eventually grew disillusioned with its business. Now he’s building an alternative with $77.5 million in funding. In this conversation, we discuss his evolving view on advertising, what decoupling search from ads allows from a product standpoint, and how the current antitrust environment is opening Google up to competition.

To subscribe to the podcast and hear the interview for yourself, you can check it out on Apple, Spotify, or wherever you get your podcasts.

Let’s chat

I’d love to hear from you. Please drop your tips, questions, etc., in the comments below.

Silicon Valley-based journalist covering Big Tech and society. Subscribe to my newsletter here: https://bigtechnology.substack.com.
