Russian Trolls Aren’t Actually Persuading Americans on Twitter, Study Finds
New research highlights a surprising barrier to hacking our democracy: filter bubbles
We know that Russian agents have been using social media to try to influence other countries’ politics since at least 2013. We know they’ve successfully posed as Americans to post divisive propaganda on Facebook, Twitter, and Instagram, and we know they’ve generated significant engagement on all three platforms. They may have even managed to stage fake political rallies that real Americans attended.
But did they actually change people’s minds? A new study suggests that, at least on Twitter, the answer is no. And while there are limitations to the study’s methods, the authors offer a compelling theory of why that might be the case: The people most likely to interact with Russian trolls are the ones who were already the most entrenched in their partisan views.
The study, led by researchers from Duke University and published on Monday in the Proceedings of the National Academy of Sciences, is the first to directly measure how tweets from Russian agents affected the political views of the Americans who encountered them. The researchers gave a panel of U.S. Twitter users a survey on their political attitudes in October 2017, then asked them the same questions again a month later. Next they looked at which of those users had interacted with accounts controlled by Russia’s Internet Research Agency, or IRA, in between taking the two surveys. They found that those who encountered IRA tweets showed no discernible change in their political opinions, attitudes, or degree of political engagement as a result.
That finding is noteworthy in itself, because while much has been written on the scale, reach, and tactics of Russian bots and trolls seeking to interfere in U.S. politics, there has been little to no peer-reviewed research quantifying their actual impact. That’s notoriously hard to measure, leaving previous studies to grasp at murkier metrics like the number of likes or retweets Russian posts received. The authors were able to do so only because they had already been running a survey of Twitter users’ political views for other reasons, and because Twitter last year published an archive of foreign information operations on its platform. They found that not only did users not change their ideology in response to IRA tweets, they also showed no change in their degree of partisanship, political engagement, or anger at the other side.
There are important caveats, however. The number of users in the panel who were identified as interacting with IRA accounts for the first time within the brief relevant time window was small — just 44 individuals, versus 1,106 who didn’t. Those 44 included some who may have engaged with just a single tweet. It shouldn’t come as much of a surprise that one brief exposure to a Russian bot doesn’t substantially alter one’s political views. It’s also worth noting that the sample wasn’t statistically representative of the U.S. electorate: The users surveyed were self-identified Republicans and Democrats who use Twitter at least three times a week and were willing to share their handle as part of the study. The panel intentionally recruited both “strong” and “weak” partisans from each side so their views could be compared, but left out independents. And of course, Twitter isn’t the same as Facebook or Instagram: It’s possible that Russian accounts had more impact on other platforms, for various reasons.
“The type of people who are most likely to interact with political influence campaigns are the least likely to be influenced by them.”
Still, the research points out something important about how social media users interact with foreign propaganda. In addition to measuring how respondents’ views changed, the authors looked at which categories of survey respondents were most likely to have interacted with IRA accounts in the first place. The trait that turned out to be the best predictor of interaction with Russian trolls on Twitter, by far, was the strength of a user’s pre-existing echo chamber. That is, Twitter users who followed political figures almost entirely from their own party were the ones most likely to engage with IRA agents on the platform.
This suggests that Russian agents’ real-world impact on U.S. politics via social media might be more limited than you’d think just by looking at metrics such as the number of followers, retweets, and likes they rack up. And indeed, their impact on political discourse might be limited by the same mechanism that helps their posts find traction: the algorithms and filter bubbles that ensure people see political content they were already inclined to agree with.
In short, “the type of people who are most likely to interact with political influence campaigns are the least likely to be influenced by them,” said Christopher Bail, the paper’s lead author, who is a professor of sociology at Duke and the director of its Polarization Lab. That doesn’t mean the tweets had zero effect, but they failed to move the needle on any of six survey questions designed to measure different ways that one’s politics could change.
In a phone interview, Bail also highlighted two other reasons Russian influence campaigns might struggle to make a measurable impact. One is that their content makes up only a tiny fraction of most users’ feeds, if any. Another is that previous research has shown that political persuasion in general tends to be challenging, and the effects of political advertising hard to detect. “We know that most attempts to influence people’s politics fail,” Bail said. “Put in that light, why should we expect Russians to have influence if the most sophisticated American campaigns can’t move voters?”
None of that means we should dismiss the impact of social media influence operations altogether, Bail added. He said the study should be viewed as a first step, one whose findings apply to a specific group of agents targeting users of a specific platform in a specific country at a specific point in time.
It’s conceivable that the effects of Russian interference on Facebook, a much larger platform, were quite different. For instance, there is anecdotal evidence that Russian-controlled Facebook groups managed to organize protests and counter-protests that were attended by real Americans. Russian operations on Twitter in 2016, before they became public knowledge, may also have been more potent than those in the 2017 period that the PNAS study examined.
Andy Guess, an assistant professor of politics and public affairs at Princeton University who has studied social media misinformation across multiple platforms, told OneZero he finds the study important nonetheless. “This is the best evidence yet on the impact of engagement with IRA accounts on Twitter,” he said. “Since the people most likely to interact with these accounts were embedded in more politically homogeneous networks and more interested in politics already, it makes sense that the researchers were not able to document meaningful effects on individuals’ political attitudes and behavior.” Guess added that a study he co-authored found similarly inconclusive evidence that fake news on Facebook affected voting behavior.
There are all kinds of ways that Russian agents and others might yet be distorting political discourse in the United States via social media. But this is a good reminder that the discourse on social media was plenty distorted to begin with — and that the easiest Americans to reach are the ones whose minds are the least likely to change.