Anti-Vax Groups Thrive on Facebook as Nationwide Coronavirus Vaccinations Begin
One mother asked whether a snakebite kit could be used to remove the vaccine from the body
On Monday, a member of an anti-vax Facebook group with 13,000 participants shared a video by the World Doctors Alliance, a controversial group of health professionals that pushes conspiracy theories and vaccine misinformation and has falsely denied the existence of the pandemic. At one point in the video, a man who identified himself as a medical doctor and homeopath in Belgium said, “There are strong indications it could make you a controllable puppet,” referring to newly developed Covid-19 vaccines.
His false claim asserts that lipid nanoparticles in the Pfizer-BioNTech vaccine contain tiny robots that “might possibly change your DNA.” This discredited conspiracy theory has been floating around Facebook since November and is one of many hoaxes propagated by members of the private Facebook group, which was created in 2016 and describes itself as “opposed to deadly vaccinations.” It and other Facebook groups OneZero identified have been incubating dangerous medical misinformation for years and continue to promote anti-vax beliefs on the platform as the United States begins its monumental Covid-19 vaccination rollout.
OneZero easily located several blatantly anti-vax groups on Facebook this week, the largest of which has roughly 50,000 members. Some bear names like “Anti vaccine” and “Vaccines exposed”; others are dedicated to alleged vaccine injuries or to the “parent’s choice” movement, which holds that parents should be able to refuse vaccines for their children. Anti-vax ideology has also infiltrated other spaces on the platform, such as wellness and birthing groups, making it an insidiously difficult topic to moderate, Rolling Stone recently reported.
Facebook has struggled to eradicate vaccine misinformation for years. On December 3, the platform announced that it “will start removing false claims about these vaccines that have been debunked by public health experts on Facebook and Instagram.” This could include misinformation about vaccine safety, what goes into a vaccine (one myth suggests vaccines contain microchips), and suggestions that populations are being administered the vaccine without their consent. In October, the company said it would also reject ads “that discourage people from getting a vaccine.”
“We will not be able to start enforcing these policies overnight. Since it’s early and facts about COVID-19 vaccines will continue to evolve, we will regularly update the claims we remove based on guidance from public health authorities as they learn more,” Facebook added this month.
Facebook did not immediately respond to OneZero’s questions about how it’s enforcing these new policies in groups, which clearly present a moderation challenge for the company. The platform largely relies on community policing in groups (as well as some A.I. detection tools), but it’s unclear how successful those measures can be when these spaces are managed by anti-vax adherents.
“How in the world do you get the junk out of your body,” a mother in one of these groups wrote last week. She was seeking advice for removing the Covid-19 vaccine after it’s been injected and wondered whether a snakebite kit could be used.
In a different group, also last week, one man posted a video falsely claiming that 87,000 doctors and nurses had opposed the Covid-19 vaccine. The video has been discredited and removed from YouTube for violating its terms of service, but it lives on at video hosting platforms favored by the far right, such as BitChute, from which it continues to be shared to Facebook.
Last year, Facebook similarly promised to prioritize anti-vax moderation as some places in the United States experienced measles outbreaks, spurred in part by vaccine opposition. The company said it would no longer allow vaccine misinformation to be promoted through ads or recommendations and would demote anti-vax content in search results. This January, BuzzFeed News discovered anti-vax ads still running on the platform. And while Mark Zuckerberg has personally stated that “vaccinations work,” Facebook has never gone as far as issuing a blanket ban on claims that they don’t.
Though it’s understandably hard for Facebook to catch every piece of misinformation, its fractured vaccine policies make it difficult to understand when and why the platform takes action. Facebook’s hesitancy to remove certain anti-vax content rankled lawmakers last February, who implored the company to interrogate its response to the harmful issue. And the consequences of Facebook’s inability to fully contain these conspiracy theories have been documented by researchers and journalists throughout the pandemic.
Regardless of Facebook’s latest efforts, anti-vax communities are already preparing to be deplatformed. Conversations in many of these groups include talks about migrating to Parler or MeWe, which have become safe havens for far-right and extremist movements.
“Facebook is cracking down on vaccine groups and deleting them,” a member of one of these groups wrote in early December. “We need a back up plan so we can all stay connected. Admins: Is this group on Parler yet?! We need to start making the switch to make sure we don’t leave anyone in the group behind.”
“With the new threat of censorship on many social media platforms there is a concern that this page will get pulled down… I urge you to please take 5 minutes and make either a MeWe or Parler account and follow this group,” wrote one member of a different group.
On Wednesday, Twitter announced that it will remove tweets promoting “[f]alse claims which have been widely debunked about the adverse impacts or effects of receiving vaccinations.” This policy begins next week. In 2021, Twitter will begin labeling tweets “that advance unsubstantiated rumors, disputed claims, as well as incomplete or out-of-context information about vaccines.”