Why I’m Joining Facebook’s Oversight Board

Former Guardian editor Alan Rusbridger on why the social media giant ‘needs independent, external oversight’

Alan Rusbridger
OneZero
May 6, 2020


Photo: NurPhoto/Getty Images

Alan Rusbridger was the editor in chief of The Guardian from 1995 to 2015. He is Principal of Lady Margaret Hall; Chair of the Reuters Institute for the Study of Journalism; and the author of Breaking News: The Remaking of Journalism and Why It Matters Now.

Almost exactly a year ago, back in the days when near strangers could strike up random conversations in Italian bars, I found myself learning about a new initiative on which Facebook was embarking — a kind of independent Supreme Court to help the company rule on the deluge of moral, ethical, editorial, and legal challenges it was facing.

We were in Perugia, the medieval hilltop Umbrian city where each year hundreds of journalists, technologists, and academics gather (or did, until Covid temporarily silenced those conversations) to talk all day — and sometimes much of the night — about the issues we all had in common.

In the bar of the Brufani Hotel late one evening, I asked lots of questions about this Facebook Oversight Board, an idea Mark Zuckerberg had announced the previous November. It seemed a promising move by a company which was exasperating and alienating so many people by its apparent unwillingness, or inability, to get to grips with the torrent of lousy, malign content it was enabling and amplifying. As well as all the good stuff.

But the devil would lie in the detail — and there wasn’t much of that a year ago.

Facebook’s operating principle is to move fast and break things. In creating this Oversight Board, the company has proceeded extremely slowly and tiptoed over eggshells.

I am told there were more than 2,000 conversations to put together a board that was suitably global, diverse, eclectic, independent, and experienced. There have been charters written; bylaws drafted; a separate trust established; ring-fenced funding put in place; numerous hypothetical “trials” run. A big, bold idea like this can’t afford to fail.

Today, the company has unveiled the names of the first 20 members of a Board which should eventually grow to around twice that size. I’m one of them. The range and caliber of my new colleagues is impressive — a fascinating mix of people from all over the world who have spent their professional lives thinking about free speech, human rights, journalism, law, ethics, and technology.

Why have I agreed to join?

The global Covid-19 crisis we’re currently living through exemplifies the mortal dangers of a world of information chaos. Societies and communities can’t function unless there is some consensus around facts and truth. And the coronavirus is, in some ways, merely a dress rehearsal for the even greater challenges of climate change.

At the same time, there is a crisis of free expression — with oligarchs, populist leaders, and some corporations trying to delegitimize and repress the voices of those who would challenge them. Finally, there is a crisis of journalism: both in the economic model which sustains it and in the generally low levels of trust much of it enjoys.

Facebook sits at the heart of these interlocking crises — and it’s not hard to see why it’s tied itself in knots trying to solve even some of them.

Facebook is an entity that defies description. It is a friend of the otherwise voiceless — but also an enabler of darkness. It brings harmony to some, discord to many. It promotes order and amplifies anarchy. It employs many brilliant engineers but has — too slowly — recognized that the multiple challenges it faces involve the realms of philosophy, ethics, journalism, religion, geography, and human rights. And it makes a whole lot of money, and a whole lot of enemies, while doing this.

To address this, it needs independent, external oversight.

Government-led regulation of free speech is nearly always problematic. Regulating such an entity — which operates in all but a handful of countries in the world — is extremely complex. Add in the scale of the platform — with more than 2 billion monthly users — and it’s little wonder that there have been few quick fixes. The current regimes in Hungary, Russia, Poland, Pakistan, Brazil, or Turkey — to name but a few — would dearly love to “regulate” Facebook. We may have some sense of what would be lost.

The idea of the alternative — some form of independent, external oversight — apparently grew out of multiple conversations and a thousand op-eds. One such discussion, in January 2018, involved a Harvard Law Professor, Noah Feldman, who had struck up a dialogue with Mark Zuckerberg. Both men agreed that, whoever should be making some hugely consequential decisions about the information which half the connected people on the planet were plugged into, it probably shouldn’t be Mark Zuckerberg.

In the eyes of some, the fruit of those deliberations — the Oversight Board — is one of the most significant projects of the digital age, “a pivotal moment” in the words of Evelyn Douek, a young scholar at Harvard, “when new constitutional forms can emerge that will shape the future of online discourse.”

Others are unconvinced. Some, inevitably, will see it as a fig leaf.

Another Harvard academic, Dipayan Ghosh, believes the Oversight Board’s powers are too narrowly drawn. He thinks the Board’s authority should be expanded from content takedowns to the more critical concerns at the heart of the company itself. “We need oversight of the company’s data practices to promote consumer and citizen privacy,” he has written, adding: “oversight of the company’s strategic acquisitions and data governance to protect against anticompetitive practice; and oversight of the company’s algorithmic decision-making to protect against bias.”

In short, why stop there?

Feldman anticipated that — if the idea worked — there would, of course, be calls for the Board’s influence to be expanded. In an interview with Fast Company’s Mark Sullivan in August 2019, he said: “My own advice to [Zuckerberg] has always been — and I think he gets this — let’s launch this one very innovative thing, let’s make sure it actually works, and then if it actually works; and if it’s truly independent and truly legitimate; and if it’s seen that way, then we can add more content to its scope.”

That’s a lot of “ifs.” But he added: “The whole point, the point that Mark gets, is that Mark shouldn’t decide that! It shouldn’t be up to Mark.”

Proof, puddings, and all that. Facebook has committed to implementing the Board’s decisions — so how diligently it does that will be seen as one early test of the fig leaf question.

There are, one should grant, not many media companies which would suggest, and seriously endow, an independent board of people to rule on significant aspects of their content — and which would promise both full transparency of the process and implementation of its rulings. It’s entirely reasonable to be skeptical, while also holding out hope that this new entity might have a real effect.

For 20 years I edited a fast-growing and dramatically changing news organization — The Guardian. Some of the challenges Facebook is grappling with are familiar, albeit on a vastly different scale. Others are issues that no one has ever, in history, had to think about.

Will it work? Let’s see. There is, in my view, no excuse for not trying. The balancing of free expression with the need for a better-organized public square is one of the most urgent causes I can imagine.
