Wikipedia Shows How to Handle Political Polarization

Even when editors disagree sharply, the site’s firm rules enable them to produce accurate entries

Oscar Schwartz
OneZero

--


Is there an upside to political polarization? A cursory glance at the state of social media would suggest not: Twitter is a cesspool of abuse, Facebook a repository of viral misinformation, and YouTube a broadcaster of conspiracy theories and vicious trolling.

Given the lamentable state of political discourse online, one might expect that Wikipedia — a crowdsourced encyclopedia that anyone can contribute to and edit — would suffer from similar epistemic rot. But research recently published in the journal Nature Human Behaviour suggests just the opposite: on pages where Wikipedia editors are ideologically polarized, the quality and accuracy of their output actually improves.

The study, entitled “The Wisdom of Polarized Crowds,” comes out of the University of Chicago’s Knowledge Lab, which investigates how groups of people come to know things together. James Evans, the lab’s director and a professor of sociology, told me that part of this research project involves assessing the extent to which individual political commitments shape a group’s ability to produce knowledge.

They found that when an…
