This is an email from Pattern Matching, a newsletter by OneZero.
The Flaw in Facebook’s Vaccine Plan
The company’s campaign to encourage vaccination is fighting against the dynamics of its own platform.
In the most idealistic view of Facebook’s mission, this is the sort of moment it was built for.
With Covid-19 killing thousands of people every day, humanity is in a race to vaccinate enough of the global population to curb the pandemic — ideally before it evolves in ways that make it even harder to contain. One obstacle, of course, is vaccine availability. But another is “vaccine hesitancy”: people who are afraid or unwilling to get vaccinated when they have the chance.
Facebook has built a network of nearly 3 billion people across its platforms, and has the ability to influence the information they receive about vaccines, including how and where to get them. We know that Facebook can unilaterally make people happier or sadder, or increase voter turnout when it wants to. It stands to reason that it can also affect vaccination rates. In other words, Facebook would seem to have the power, at least in theory, to help the world’s governments and scientists win this race.
Facebook is generally reluctant to take explicit moral stands or throw the weight of its products behind a given agenda. It wants to be a platform for everyone, and imposing its own values risks alienating those who don’t share them.
This week, however, CEO Mark Zuckerberg took just such a stand, announcing that Facebook will launch a global vaccination campaign. It includes, among other things, partnering with VaccineFinder to show users where and when vaccines are available in their area, and give them links to make an appointment. The tool will be integrated into Facebook’s “Covid Information Center” module, which it is putting directly into people’s news feeds — some of the most valuable digital real estate on the entire internet. It builds on Facebook’s existing efforts to limit Covid-19 misinformation on its platform and connect users with authoritative sources.
But before we cheer “Facebook to the rescue!”, it’s worth looking more closely at Facebook’s net impact on the scientific information sphere — and the subtler ways that its products may be inadvertently undermining the same campaign it’s publicly championing.
Facebook tries to tackle vaccine hesitancy.
Let’s look first at what Facebook is doing to try to help. In addition to implementing the VaccineFinder tool, which other tech companies are also integrating into their products, Zuckerberg announced this week that Facebook will:
- Put labels on posts that mention Covid-19 vaccines, with links to vaccine information from the World Health Organization
- Bring the Covid Information Center to Instagram
- Expand support for WhatsApp chatbots that help governments around the world reach people with vaccination info
- Step up efforts to reduce Covid-19 and vaccine misinformation on Facebook, such as temporarily reducing the reach of users who have repeatedly violated its vaccine misinformation policies
- Make some aggregated data on vaccination trends available to public officials
This is on top of the company’s existing programs, which include its February announcement that it would begin removing posts that make false vaccine claims. In general, Facebook and other social platforms have been relatively aggressive about policing Covid-19 misinformation throughout the pandemic. Along with misinformation about voting, scientific misinformation seems to be a realm in which platforms are willing to draw a line because they can plausibly defend that stance as objective rather than partisan or ideological.
But announcing plans to crack down on misinformation should not be confused with successfully stamping it out — or even containing it. My OneZero colleague Sarah Emerson reported earlier this month that anti-vax groups continue to thrive on Facebook in spite of the ban. Australia’s ABC recently found the same Down Under, including anti-vax groups hawking T-shirts on the platform. And USA Today reported that Spanish-language vaccine falsehoods and conspiracy theories remain a persistent problem on Facebook and other networks. These are not isolated examples. And they mirror the difficulties Facebook had last year in limiting misinformation about the virus itself.
That doesn’t necessarily mean Facebook isn’t taking the misinformation fight seriously. But it does suggest that the problem runs deeper than its moderation tools are equipped to handle. In short, the dynamics of social media platforms are inherently conducive to misinformation, and particularly to falsehoods and conspiracies that play on people’s deep-seated fears.
Fear of vaccines isn’t new, but research shows that social media has been exacerbating those fears for years now — well before Covid-19. “What was previously a fringe opinion is becoming a transnational movement,” David Broniatowski, a professor at George Washington University, told The Lancet Digital Health in 2019. (The context at the time was a drop in childhood vaccination rates for diseases such as measles.) “The unique thing about social media is that they allow messages to propagate very quickly and for communities to form.”
The good news is that Facebook seems to recognize the problem. This week, the Washington Post’s Elizabeth Dwoskin reported that Facebook is conducting “a vast behind-the-scenes study of doubts expressed by U.S. users about vaccines.” As part of the study, it’s developing machine-learning software to identify posts and comments that express “vaccine hesitancy,” which it tags “VH” in its internal systems. Many of these are comments that don’t break Facebook’s misinformation rules, but may nonetheless be causing harm by spreading unfounded doubts within online communities.
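To make the idea of tagging posts concrete: the sketch below is a purely illustrative toy, not Facebook’s actual system. Facebook reportedly uses machine-learning software; this naive keyword approach, with an invented phrase list and function name, only shows the general shape of the task — scoring a post and attaching a “VH” flag.

```python
# Illustrative only: a toy stand-in for flagging "vaccine hesitancy" ("VH")
# posts. Facebook's real system is a machine-learning classifier whose
# details are not public; the phrase list and function below are invented.

HESITANT_PHRASES = [
    "not sure about the vaccine",
    "worried about side effects",
    "don't trust the vaccine",
]

def flag_vh(post: str) -> bool:
    """Return True if the post contains a phrase suggesting hesitancy."""
    text = post.lower()
    return any(phrase in text for phrase in HESITANT_PHRASES)

posts = [
    "Got my second dose today!",
    "Honestly I'm worried about side effects of the shot.",
]
print([flag_vh(p) for p in posts])  # [False, True]
```

Note the core difficulty the article describes: the second post breaks no misinformation rule, yet a system like this would tag it — which is why a binary “VH or not” label can sweep up good-faith concerns along with falsehoods.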
Among the study’s early findings, the Post reported, is that vaccine hesitancy on Facebook is heavily concentrated in certain corners of the social network. And within those corners, a small fraction of users are responsible for its spread. In addition, there is significant overlap between those spreaders of vaccine hesitancy and spreaders of QAnon-related conspiracy content. (I wrote in a previous Pattern Matching about why Facebook has struggled to quash QAnon.)
So far, however, Facebook’s study appears to be mostly reinforcing what public health and misinformation researchers have been pointing to for years. Facebook’s vaccine hesitancy problem stems from the very nature of its networks, which serve to counter traditional media and scientific authority by amplifying messages from unreliable sources to receptive audiences within demographic and ideological echo chambers — particularly when those messages are scary, divisive, or conspiratorial. In other words, Facebook’s vaccine hesitancy problem has the same roots as its hate speech, political misinformation, and polarization problems.
“Repeatedly, and across contexts, researchers are finding that a small number of dedicated disinformers are doing a significant amount of damage to the health of discourse, democracy, and society at large,” tweeted Kate Starbird, associate professor at the University of Washington and an expert in online misinformation, in response to Dwoskin’s story. In a direct message with OneZero, Starbird elaborated on this point. “We reported on this (across platform) in the election 2020 context (around ‘repeat offenders’ or ‘repeat spreaders’) and we’re seeing the same in the anti-mask discourse on Twitter. … We also found a similar dynamic in the disinformation campaign targeting the White Helmets in Syria. A handful of accounts along with a few media outlets created most of the content and drove most of the spread.”
For Facebook, identifying those superspreaders seems like it would be a good start toward addressing the problem. But it’s unfortunate that the company is just starting now when it could have done this work years earlier and been ready to meet the moment when the pandemic arrived.
This will not be a simple problem to solve. Emerson wrote recently in OneZero about the blurry line between misleading anti-vaccine content and good-faith discussion of people’s concerns — the latter of which can be quite valuable when accompanied by evidence-based moderation. So while Facebook’s interest in vaccine hesitancy is welcome, the binary labeling of content as either “VH” or not in its study suggests that its initial approach may lack nuance.
Finally, there’s reason to worry that Facebook will balk at confronting the deeper roots of its misinformation problem even if this study helps to uncover them. In a fascinating, deeply reported MIT Tech Review story last week, Karen Hao showed how Facebook tends to ignore, discourage, or subvert the work of its own researchers when they find that its problems are linked to the same algorithms and products that power the platform’s growth and engagement. (This problem is not unique to Facebook, by the way.)
In the case of vaccine hesitancy, that could mean Facebook ends up applying Band-Aids rather than making structural changes. For instance, it might trumpet plans to serve pop-ups with links to the CDC in groups it identifies as being hubs of vaccine hesitancy — while leaving intact the underlying algorithms and recommendation systems that treat someone’s newfound interest in, say, vaccine-skeptical content as a signal to nudge them into like-minded groups and down conspiratorial rabbit holes.
To truly address its role in misinformation, Facebook would have to reassess its entire value structure, prioritizing truth, thoughtfulness, and nuance over engagement. Short of that, the company could at least take the suggestion of venture capitalist Hunter Walk, who argued in OneZero this week that tech companies’ ethics teams need to be empowered to make recommendations that would hurt engagement, if necessary.
It’s good that Facebook is looking for ways to support vaccination campaigns at a critical moment in history. It’s just hard to shake the feeling that we’d be better off, at least from a vaccination standpoint, if Facebook didn’t exist in the first place.
Under-the-radar trends, stories, and random anecdotes worth your time.
- Speaking of the challenges of content moderation, YouTube this week removed a racist video by conservative provocateur Steven Crowder — but not because it was racist. Rather, the video was removed for violating YouTube’s Covid-19 misinformation policies. The company told me that a segment mocking Black farmers and trafficking in ugly racial stereotypes, while “offensive,” didn’t run afoul of its hate speech policies. I wrote about what that tells us about the value judgments platforms are and aren’t willing to make.
- Substack is facing its first big backlash over the deals it offers to lure high-profile writers, and which writers it is offering them to. At least two well-known Substack writers have left the platform, with one blasting it as a “scam” that only works financially for the authors the company recruits directly — a roster it refuses to disclose. TechCrunch’s Anthony Ha has a good summary of the criticisms and Substack’s responses, including this lengthy defense of Substack Pro from co-founder Hamish McKenzie.
- More than half of all U.S. advertising dollars now go to just three companies: Facebook, Google, and Amazon, according to new data from the ad agency GroupM. Thanks to the pandemic, the “triopoly” of internet giants went from dominating the digital ad market to dominating the entire ad market. The Wall Street Journal’s Keach Hagey and Suzanne Vranica broke down how it happened, with an interesting look at advertisers’ decision-making.
- There’s finally a decent, slightly more environmentally responsible way to replace dead AirPods. I wrote in-depth in 2019 about why they’re so hard to reuse or recycle, building on reporting by Motherboard’s Caroline Haskins that called them an environmental tragedy. OneZero editor Damon Beres, who is probably to blame for any portions of this newsletter you dislike, wrote this week about a startup called PodSwap that uses “precision robotics” to perform battery swaps on dead AirPods. In a profile of the company, iFixit’s Kevin Purdy compared it to “a SodaStream canister exchange, but for headphones that might otherwise be headed to a landfill.”
Tech Reads of the Week
— Kashmir Hill, New York Times Magazine
— Leah Nylen, Politico
— Felix Salmon, Wealthsimple
— Emily Birnbaum and Issie Lapowsky, Protocol