The End of Social Media’s ‘View From Nowhere’

Trump’s extremism is forcing tech companies to abandon the pretense of political objectivity — for now

There’s an idea in media criticism known as the “view from nowhere.” Popularized by Jay Rosen, a journalism professor at New York University, the phrase takes aim at the ethos of political agnosticism that news outlets have historically cultivated. He argues that “both sides” reporting, which treats competing viewpoints or arguments as equally valid, does a disservice to the truth. Journalism about the “climate debate,” which used to give industry shills equal airtime alongside climate scientists, is a famous example.

Social networks, perhaps Facebook most of all, have long embraced their version of the view from nowhere. CEO Mark Zuckerberg has repeatedly defended the social network as “a platform for all ideas.”

But the deplatforming of Donald Trump, following last week’s violent insurrection at the U.S. Capitol, marks a departure from that approach. So does Facebook’s hiring on Monday of Roy Austin, a civil rights attorney who served in the Obama administration, for a newly created position at the company: vice president of civil rights. And when I remarked on that hire on Twitter — arguing that Facebook could have made such a move five years sooner had it not been so concerned with appearing neutral — a Facebook executive said something I’ve never heard from the company’s top brass.

“We’re not neutral,” Adam Mosseri, head of Instagram, tweeted in reply. “No platform is neutral, we all have values and those values influence the decisions we make. We try and be apolitical, but that’s increasingly difficult, particularly in the U.S. where people are more and more polarized.”

One could spend a lot of energy parsing the distinction between “neutral” and “apolitical.” But it doesn’t take a linguist or critical theorist to see that Mosseri was articulating a very different view than Zuckerberg has espoused in the past. And it just might herald a new willingness by social networks’ leaders to stop using neutrality as an excuse for inaction.

In early 2016, when Facebook’s “Trending Topics” news curators were accused of being biased against conservative outlets, Zuckerberg vociferously denied the charge and went on an apology tour to mollify conservative leaders. He said the company had “rigorous guidelines that do not permit the prioritization of one viewpoint over another or the suppression of political perspectives.”

Then, in 2018, Zuckerberg announced a push to prioritize “high-quality” news in users’ feeds after Facebook was criticized for amplifying political misinformation. But he went out of his way to emphasize that this would be done without Facebook making any value judgments of its own — or even consulting outside experts, who might lack “objectivity.” Instead, the company would establish publishers’ trustworthiness by surveying its users. “We decided that having the community determine which sources are broadly trusted would be most objective,” Zuckerberg explained. (Tellingly, when it turned out that “objective” approach would end up punishing conservative publishers such as the Daily Wire, Facebook reportedly tweaked its formula to limit the backlash.)

A platform for all ideas. No prioritizing one viewpoint over another. Objectivity above all. Historically, those have been the bulwarks of Facebook’s self-conception as a platform, or at least its messaging, even as it has reluctantly taken a more active role in moderation.

From a business standpoint, that makes perfect sense. Human judgment is expensive. Taking a moral stand can alienate the people who disagree with you. Better, Facebook figured, to embrace “free speech” as its primary value, and thereby excuse itself from responsibility for the content it amplifies.

From a societal standpoint, however, the moral agnosticism of Facebook and other social networks has been costly. While Facebook professed to be a platform for all ideas, the dynamics of its news feed ensured that the ideas that flourished would be the ones most likely to induce clicks, likes, and comments; the ones that could be understood in the space of a single headline; the ones that played on people’s fears and biases; and the ones that gave people simple answers to complex problems.

A platform for all ideas implies a platform with no standards. In practice, it means becoming a haven for ideas that have been rejected elsewhere. That can include valuable political and social dissent, but it inevitably also includes misinformation, hate speech, and baseless conspiracy theories. Facebook’s view from nowhere helped to deliver us Donald Trump, QAnon, and last week’s violent insurrection at the U.S. Capitol.

In reality, Facebook has always had its red lines, such as pornography and terrorism. Recently, it has been drawing more of them, taking action against Covid misinformation and false information about elections. This fall, it finally cracked down on QAnon, though not without a matching crackdown on left-wing groups to provide political cover.

Until now, however, it has been careful to draw those lines only on issues that most conservatives and liberals could agree on, to avoid getting in trouble with either political party. I suspect that’s what Mosseri meant when he said the company tries to be “apolitical.” It’s why Facebook bent over backward for four years to mollify Trump, crafting tortuous exemptions to its policies to accommodate his lies, while insisting that it was really about the newsworthiness of political figures’ posts.

To ban Trump, Facebook not only had to bite the bullet of breaking with the American right; it had to go back on its own rules and precedents — the ones it had created to accommodate him. In other words, it had to take a moral stand, even a political one.

Three years ago, Tarleton Gillespie, a principal researcher at Microsoft Research who studies platform moderation, wrote against the “myth of the neutral platform” in his book Custodians of the Internet. When I asked him Tuesday if he thinks the Trump ban means that platforms are finally embracing a view from somewhere, he was cautious.

“This feels like a moment that, depending on the fallout, might release platforms from needing to defend themselves as neutral,” Gillespie told me. “But, we’ve seen these breaks before. It could have happened after Gamergate, after Trump’s election, after Alex Jones, after Christchurch.”

So why didn’t it?

“There’s so much pressure back towards that position of neutrality, from the way U.S. information policy privileges it, to the way charges of bias from the right force them to hold to it,” Gillespie said. “In the long run, I think the platforms will only let go of the claim of neutrality when they can settle on a new, equally defensible principle. Some have suggested fairness, some have suggested accountability, but so far these haven’t yet been sturdy enough.”

Exactly what values social networks ought to embody is a hard question. How they could plausibly do so in practice might be a harder question still. But if Mosseri’s tweet is an indication, it’s at least a question that some of social media’s leaders are finally prepared to start asking.
