How a Crisis Researcher Makes Sense of Covid-19 Misinformation

Collective sensemaking during times of uncertainty and anxiety

Photo: Robert Nickelsberg/Getty Images

Many of us are struggling to absorb the news about the novel coronavirus, the virus that causes Covid-19. Lives have been lost. People are gravely ill. Others have been quarantined for weeks. The disease appears to be spreading in numerous countries, including here in the United States. We are facing what has now been labeled a global pandemic. And many of us are trying to figure out what actions to take to protect ourselves, our families, and our communities.

Our information feeds, from television sets, internet searches, and social media, provide continuous updates about the unfolding crisis — some of them accurate, some of them seemingly less so. Though crisis events like this one have always been times when rumors and misinformation spread, the problem seems especially acute now, with the rise of the internet, the widespread use of social media, and the pervasive politicization of just about everything.

Tedros Adhanom Ghebreyesus, director-general of the World Health Organization (WHO), noted in a February 15 address that we are fighting not only an epidemic but also what he called an infodemic. And indeed, numerous cases of false information about the virus are already spreading online — sometimes intentionally, sometimes not. But perhaps worse than that, it is increasingly difficult for us to figure out which information we should trust.

As a person trying to understand the situation and make the best decisions for my family and my community, I am struggling with this myself. I have been reflecting upon the Covid-19 crisis and the parallel infodemic from two very different perspectives: as a researcher of crisis informatics and as a person living in an affected area (Seattle) with family members in vulnerable groups. There’s a tension between these different perspectives, which motivated me to write this post.


Let me give you some background. For more than a decade, I have been studying and conducting research in crisis informatics, first as a PhD student at the University of Colorado working with Leysia Palen, whose research program helped to establish this field of study, and now as a faculty member at the University of Washington with an amazing team of students and colleagues.

Crisis informatics is the study of how information flows during crisis events, especially how information flows across what we call "technology-mediated" environments, like the internet and social media. It is also the study of human behavior — in other words, how people respond to crisis events. It builds from previous research in the sociology of disaster, which reaches back to the 1960s. And it integrates insight from the psychology of rumor, another field with a long history.

Many of the lessons from these related fields are relevant to conversations we are all having right now about Covid-19.

Crisis events such as natural disasters, industrial accidents, terrorist attacks, and emergent pandemics are often times of high uncertainty — about what is happening and what we should do about it. In these cases, there are often information voids — things we just don’t know yet. And the “facts” of the situation are dynamic. In other words, they change as new information forces us to update our understanding of what is going on.

This uncertainty feeds anxiety — about the personal and collective impacts of the event, as well as about what actions we should take. What should we do? Should we travel? Should we go to work? Should we visit our parents or grandparents? Should we move them out of the elder living facility and into our home? If so, when? Is now too early? Will tomorrow be too late?

When information is uncertain and anxiety is high, the natural response for people is to try to “resolve” that uncertainty and anxiety. In other words, to figure out what is going on and what they should do about it. And so we attempt to come together — either in physical spaces or using communication tools like our phones and now our social media platforms — to “make sense” of the situation. We gather information and try to piece together an understanding, often coming up with, and sharing, our own theories of causes, impacts, and best strategies for responding. And these theories inform the decisions we make and the actions we take. Researchers talk about this activity as “collective sensemaking” and consider it a natural part of the human response to disaster — with informational and psychological benefits. I imagine most of us are participating in collective sensemaking right now.

However, the sensemaking process can also produce rumors, including rumors that turn out to be true and rumors that turn out to be false. False rumors (or misinformation) are dangerous, because they can cause people to make the wrong decisions, including decisions that endanger themselves or others.

Historically, the biggest challenge for communities experiencing a crisis event was often a lack of information, especially information from official sources. In that void, people would share information with their families, friends, and neighbors to try to make the best decisions. In the connected era, the problem isn’t a lack of information but an overabundance of it — and the challenge of figuring out which information to trust and which to dismiss.

This challenge increases when we lose our trust in “official” sources, such as the government agencies charged with managing the response. That is why it is so critical for those agencies to share the best information available at the time (from experts), to be consistent, and to avoid the appearance of being politically partisan.

When elected leaders share dubious information and contradict their own agencies and scientists, they lower our trust in official response agencies and reduce our ability to identify the best information available at the time. They also increase uncertainty and anxiety, potentially pushing people toward actions that harm themselves or others.

Another thing researchers have long known about crisis events is that a few people will use those events to exploit affected people and communities. In our early research (2009–2012), we saw a few cases of people intentionally spreading misinformation to solicit donations to a fake organization or just to get attention and gain some social media followers. Over time, these exploitative behaviors have grown. We now see the intentional spread of disinformation for financial and/or political gain during every crisis event in larger and larger volumes. And Covid-19 is no different.


In times of high information uncertainty and anxiety, we are particularly vulnerable to disinformation, which can take root within the collective sensemaking process. And as “participants” in online information environments, as we all are right now, this means we can end up both absorbing and spreading it.

Taking these lessons from crisis informatics into account, I offer a few recommendations.

First, I ask us as information participants to tune in to how our anxiety fuels information-seeking and information-sharing activities that may make us susceptible to spreading false rumors and/or disinformation. This can mean slowing down. It can mean doing a better job of vetting our sources — perhaps using the SIFT technique being pioneered by my colleague Mike Caulfield. It can also mean choosing not to share content that we’re just not sure about. And it can even mean stepping away from our feeds when we realize they aren’t helping us resolve the anxiety and uncertainty and are just amplifying them. We can think about this as the “hand-washing” for the infodemic accompanying the pandemic.

Second, I recommend to crisis communicators that they rely on the growing knowledge of experts (for example, medical professionals and epidemiologists) and work to remain consistent across their agency or agencies. It is also important that they effectively communicate the inherent uncertainty of the event and help people understand that the “facts” may change over time as we learn more.

Finally, I implore political leaders and political communicators to reflect upon how they may be contributing to the problem — by spreading misinformation and disinformation and by casting doubt on the science and recommendations of experts within our response agencies. This may have detrimental effects, both immediate and over time, on individual and collective responses to the crisis event.

Covid-19 is a public health crisis. Responding to it may require every one of us to take specific actions to protect ourselves, our loved ones, our neighbors, our communities, and society at large. To inform those actions, it’s critical that we can find, and recognize, good information we can trust.

Associate Professor of Human Centered Design & Engineering at UW. Researcher of crisis informatics and online rumors. Aging athlete. Army brat.
