Wikipedia Shows How to Handle Political Polarization

Even when editors disagree sharply, the site’s firm rules enable them to produce accurate entries

Is there an upside to political polarization? A cursory glance at the state of social media would suggest not: Twitter is a cesspool of abuse, Facebook a repository of viral misinformation, and YouTube a broadcaster of conspiracy theories and vicious trolling.

Given the lamentable state of political discourse online, one might expect that Wikipedia — a crowdsourced encyclopedia to which anyone can contribute and edit information — would suffer from similar epistemic rot. But research recently published in the journal Nature Human Behaviour suggests just the opposite: on pages where Wikipedia editors are ideologically polarized, the quality and accuracy of their output actually improve.

The study, entitled “The Wisdom of Polarized Crowds,” comes out of the University of Chicago’s Knowledge Lab, which investigates how groups of people come to know things together. James Evans, the lab’s director and a professor of sociology, told me that part of this research project involves assessing the extent to which individual political commitments contribute to a group’s ability to produce knowledge.

Evans has observed that while many studies show how diversity of socioeconomic background, gender, and race can lead to better group outcomes, the conventional wisdom is that this isn’t the case for extreme political diversity, which is often characterized as polarization. He and a team of three researchers wanted to test this presupposition and assess whether political diversity could sometimes produce better perspectives, in much the same way demographic diversity can.

To do so, they turned their attention to how knowledge is produced on Wikipedia. On every Wikipedia entry there is a “talk page,” the backend where a community of editors try to persuade other editors to accept their contributions. Evans and his team measured the political composition of more than 200,000 of these editing communities, assessing to what extent the editors were split between conservative and liberal ideologies.

They found that when an editing community is politically polarized, the depth and accuracy of the information significantly improves, and inversely, as editing communities become ideologically homogeneous, the quality of the page dramatically declines.

Feng Shi, a co-author of the paper and data scientist at the University of North Carolina at Chapel Hill, believes Wikipedia’s capacity to harness diversity is due to the platform’s firm editing guidelines, which demand neutrality, proper citation, and respectful discourse. “Editing on contested topics is like arguing in a court of law,” Shi says. “You have to be tough, have endurance, but most importantly, have a strong commitment to the rules.”

Shi adds that on contentious Wikipedia pages with ideologically diverse editors, these guidelines are fastidiously maintained, with each competing faction using them to hold opponents to account. On Donald Trump’s page, for instance, the editors have collectively agreed that if you add a new fact, and someone corrects it, you can only revert that correction once every 24 hours, in order to prevent spiraling “edit wars.” (As a recent Slate feature made clear, the debates over Trump’s page can get incredibly heated — but imagine how much worse it might be without Wikipedia’s firm rules.)

Justin Knapp, a prolific Wikipedia editor with more than 2 million contributions, agrees that Wikipedia’s robust bureaucracy is crucial to cultivating a space for meaningful disagreement. One of the pages Knapp contributes to concerns the 40-year-long conflict in the Western Sahara, between the Polisario Front, a rebel national liberation movement of the nomadic Sahrawi people, and the Kingdom of Morocco. Knapp is sympathetic to the political cause of the Polisario Front, which he views as an indigenous people fighting for self-determination and decolonization. He often runs up against editors who hold the opposite position. But Knapp says that even when political disagreements are fierce, or seemingly unresolvable, the clear set of editing principles — to cite facts properly, to present information in a neutral voice — allows editors to resolve disputes without devolving into toxic arguments.

More than any specific editing guideline, Knapp believes that it is Wikipedia’s sense of collective purpose that allows it to work so well. “Wikipedia has a serious collective goal and that is to create an encyclopedia,” Knapp says. “Because of the shared mission, Wikipedia editors generally have an overlap in their set of values. And these values generally override a disagreement on a particular issue.”

For Evans, this spirit of collectivity contrasts dramatically with the segregated “echo chambers” of social media, where each dissenting voice can create its own version of reality. “If you disagree with what the Wikipedia page says about climate change, you can’t just create a new, alternative one,” he says. “Because there is a single page on each topic, you have to go to the talk page and try to convince the other editors of your position.”

If Wikipedia’s editorial guidelines and collective spirit foster political heterogeneity, the platform comes up short on other diversity measures. A 2018 survey found that 90% of Wikipedia editors are men, the vast majority of whom are young, college educated, white, and living in the global north. This impacts Wikipedia’s content, which skews in favor of topics popular with this demographic: Western philosophy, European history, computer games. Lots and lots of computer games.

In response to this homogeneity, the Wikimedia Foundation — the nonprofit that operates and supports Wikipedia and its global community of volunteers — has rolled out diversity initiatives, such as editing training sessions for women across the globe, as well as Edit-a-thons, which started as part of a 2013 strategy aimed at leveling the site’s content and contributor gender imbalances.

But Rachel Wexelbaum, a Wikipedia editor who promotes LGBTQ+ contributions on the platform, says that demographic homogeneity still acts as a barrier for women and LGBTQ+ folk who want to join the conversation on contentious pages. “While we might be able to add new content about our own community interests, when we try to edit or add something to a popular page, we are often met with aggression from existing editors,” she says. “It begins with an argument in the talk pages, but then it can quickly become flat out rudeness. Sometimes it even turns into harassment and abuse.”

Sydney Poore, the trust and safety specialist at Wikimedia, says she is aware of harassment that takes place in the talk pages, and in response, her team has developed a number of strategies to curtail it, including muting features and blocking functions that allow page administrators to stop a user from contributing to a page where they have behaved contrary to the spirit of “respect and civility.”

While common sense might suggest that the most intense abuse occurs on pages that are ideologically divisive, the Knowledge Lab found that it’s just the opposite. Using an automated text-analysis tool to analyze the talk pages of hundreds of thousands of Wikipedia pages, the researchers discovered that polarized editorial teams actually lead to more civil conversation and that, conversely, politically homogeneous editing communities more regularly turn toxic.

Evans explains that this is because on ideologically uniform pages, a group of editors will sometimes “gang up” on an editor with a dissenting view. But when there is a population of editors that is balanced, yet polarized, each group holds the other to account. “The balanced diversity essentially self-disciplines the editorial process,” Evans says. “This means the conversations go on longer, the debates become more specific, and the rules are appealed to more often.”

For Evans, the broader implication of these findings is that diversity not only improves the output of knowledge production, like a Wikipedia page, but is integral to the process itself. “To come to know things together there has to be a belief that each perspective really cannot contain the whole,” he says. “And to get to that place, we have to believe that diversity really does matter.”
