Anti-Vax Groups Still Thrive on Facebook, Despite Content Ban
Tens of thousands of Facebook users participate in groups that spread misinformation about vaccines — some with an odd religious bent
A month after Facebook said it would expand efforts to scrub its platforms of vaccine misinformation, false narratives about the Covid-19 vaccine are still flourishing in public and private Facebook communities.
OneZero found dozens of anti-vax groups, public and private, some of which have tens of thousands of users. The sheer abundance of anti-vax material in Facebook groups suggests that the company’s current tools and strategies aren’t enough to tackle even surface-level vaccine misinformation.
Facebook has not released a progress update since its blog post last month, and it did not immediately respond to a request for comment.
Facebook’s renewed pledge to eradicate vaccine hoaxes and misinformation was informed by a January ruling from the company’s oversight board, an internally funded third-party review body. In its ruling, the board critiqued the application of Facebook’s vaccine policies as “inappropriately vague,” and pressed the company to create new community standards on health misinformation. According to the New York Times, Facebook responded by declaring it would focus on the spread of harmful vaccine content across groups and pages, which have long posed a pernicious moderation challenge for the platform.
Despite this vow, dangerous conspiracy theories continue to bubble up in vaccine-related Facebook communities.
In a private group of 27,000 members called “MTHFR Connections: Tongue Ties, Autism, V@xynes, Leaky Gut,” (which used “Vaccines” rather than “V@xynes” in its title until last month) users have shared thoroughly debunked misinformation about Covid-19 vaccines, such as the false claim that they alter human DNA.
In a post this weekend, one member falsely claimed that Covid-19 vaccines issued by the Food and Drug Administration (FDA) for emergency use last year were unsafe. The user shared an image of a billboard in New York’s Times Square last month, which encourages people to report vaccine reactions to a website operated by the FDA, Centers for Disease Control and Prevention, and the Department of Health and Human Services. Though the website is meant for legitimate reporting of “adverse vaccine events,” the digital billboard was erected by an anti-vax parents group known for pushing unfounded claims, such as the false assertion that vaccines cause sudden infant death syndrome, or SIDS.
Other groups identified by OneZero are dedicated to the topic of vaccine experiences, a complicated subject not covered by Facebook’s existing guidelines. As OneZero reported last month, allowing people to talk about vaccine experiences can actually help vaccine acceptance, and remains a crucial component of public health efforts. However, if not properly moderated — a task often left to unpaid and untrained group administrators — these groups can easily be preyed upon by anti-vaxxers.
In a group called “Covid-19 vaccine adverse reaction testimonials,” one member bizarrely accused the pharmaceutical company AstraZeneca of “facilitating Satanic ritual” through its vaccine. Another user claimed a vaccine recipient could no longer “feel God” after being vaccinated.
Other attempts to spread anti-vax material are subtler, but still effective. In the group “Covid Vaccine Experiencers,” one member asked people to share their experiences of either getting vaccinated or considering it. In the comments, users posted numerous links to pseudoscience and anti-vax propaganda websites. One user shared a meme that said “The side effects are your body’s way of telling you that it’s being poisoned.” Another said of the vaccine, “Not doing it they’ll have to force it with a gun to my head!”
In the past, Facebook has touted algorithmic solutions for curbing harmful content in private groups. “Increasingly, we can use AI and machine learning to proactively detect bad content before anyone reports it,” the company wrote in a 2019 blog post. But the bulk of moderation is performed by group admins and moderators who, apart from tools that allow them to flag and remove keywords, must dedicate a significant amount of time to running these communities, especially as they scale.
Nevertheless, the stakes have never been higher for Facebook, as humanity enters year two of the pandemic and countries push full steam ahead with vaccination rollout campaigns. Already, the physical consequences of Facebook misinformation are materializing. A January protest against vaccinations at Dodger Stadium in Los Angeles, for example, was organized on Facebook.
Facebook is now in talks with President Biden’s White House to assist with federal efforts to combat vaccine misinformation, according to Reuters.