Pattern Matching

In Defense of Politics on Facebook

Social networks would love to show users less political content. Here’s why that’s a problem.

Will Oremus
8 min read · Feb 13, 2021


Photo: SAUL LOEB/AFP via Getty Images

Political posts on Facebook and other social networks are often divisive, misleading, or just plain false. Social platforms including Facebook and YouTube have played a role in radicalizing people and have facilitated the organization of radical groups, including hate groups, some of which have committed real-world violence.

These developments have been implicated in the decline of democratic institutions in the U.S. and abroad, and there is reason to believe that social networks have not merely played passive host to them but have actively fueled them, with feed-ranking and recommendation algorithms that systematically amplify sensational claims and outrage bait over nuance and balanced reporting.

So if Facebook and other social networks could find a way to show people less political content, that would be a good thing, right?

Unfortunately, it isn’t that simple.

The Pattern

Why social networks can’t put the politics genie back in the bottle.

Facebook announced this week that it will begin running tests in which it reduces the