Anti-Vaccine Messaging Is Well-Connected on Social Media

A new social network map shows a well-connected anti-vaccine movement, now intertwined with coronavirus conspiracy theories.
Image: In this visual representation of a network of vaccine-related pages on Facebook, blue represents pages expressing pro-vaccine sentiments, red represents pages expressing anti-vaccine sentiments, and green represents pages that are interested in vaccines but don't lean in either direction. (Credit: N. Velasquez and N.F. Johnson)

Marcus Woo, Contributor

(Inside Science) -- A video dubbed "Plandemic," which brought together unsubstantiated and debunked claims and conspiracy theories about the coronavirus and featured a discredited virologist aligned with the anti-vaccine movement, gathered millions of views last week.

Social media platforms have since removed the video for violating their misinformation policies, but the 26-minute clip highlights one way that the anti-vaccine movement is feeding into the recent surge of misinformation and disinformation swirling around COVID-19.

Now a new study further underscores the challenges of curbing misinformation, showing that efforts to promote accurate medical information about vaccines on Facebook are failing to reach wide audiences. If people were to resist taking a future coronavirus vaccine that's shown to be safe and effective, it could threaten the ability to corral the pandemic.

"You may have the best pandemic response planned, but if you don't have the trust and communication ability to make people want to comply with the response activity or take your vaccines, you're back to square one," said Tara Kirk Sell, who studies public health communication and misinformation at Johns Hopkins University in Baltimore.

The new analysis, published today in the journal Nature, provides the first map of how three types of online communities interact: those who promote accurate information about vaccines, those who are against vaccines, and those who are interested in vaccines but don't obviously lean in either direction, whom the researchers termed undecided. Such a map is crucial for effectively promoting accurate information about vaccines, said Neil Johnson, a physicist at George Washington University who led the research.

The map is "fantastic to see," said Renée DiResta, a researcher at the Stanford Internet Observatory who studies misinformation. "For a very long time, we didn't have visibility into that kind of Facebook data."

Other studies, including her own, have mainly focused on Twitter, whose roughly 275 million monthly users are far fewer than Facebook's estimated 2.6 billion.

To create the map, the researchers surveyed more than 1,000 Facebook pages -- public pages that represent organizations, causes, public figures or communities -- that frequently discuss vaccines, classifying them as pro-vaccine, anti-vaccine or undecided. Pages are connected when one page likes another, forming a vast network. On the map of the network, each connection acts like a spring, so a group of pages linked to one another has many springs pulling them closer together. Less well-connected pages are linked by only a few springs, so they remain farther apart on the map. Altogether, the researchers said, the network of pages includes nearly 100 million Facebook users.
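The "springs" in that description correspond to what network scientists call a force-directed, or spring, layout. As a rough illustration only -- the page names, stances and "likes" below are invented, not the study's data -- the following sketch uses the networkx library to build a tiny page network and position it with such a layout, so that densely linked pages land close together on the map.

```python
# A minimal sketch of the spring-layout idea using the networkx library.
# The page names, stances and "likes" are invented for illustration;
# they are not the researchers' data.
import networkx as nx

G = nx.Graph()

# Each node is a Facebook-style page tagged with its stance.
pages = {
    "Vaccine Science Now": "pro",
    "Public Health Facts": "pro",
    "Stop Mandatory Shots": "anti",
    "Natural Immunity Talk": "anti",
    "Truth About Vaccines": "anti",
    "Parenting Q&A": "undecided",
    "Local Moms Group": "undecided",
}
for name, stance in pages.items():
    G.add_node(name, stance=stance)

# An edge means one page has "liked" the other.
G.add_edges_from([
    ("Stop Mandatory Shots", "Natural Immunity Talk"),
    ("Stop Mandatory Shots", "Truth About Vaccines"),
    ("Natural Immunity Talk", "Parenting Q&A"),
    ("Truth About Vaccines", "Local Moms Group"),
    ("Parenting Q&A", "Local Moms Group"),
    ("Vaccine Science Now", "Public Health Facts"),
])

# Force-directed ("spring") layout: each edge acts like a spring pulling
# connected pages together, so well-linked clusters sit close on the map
# while poorly connected pages drift toward the edges.
positions = nx.spring_layout(G, seed=42)

for name, (x, y) in sorted(positions.items()):
    print(f"{pages[name]:>9}  {name:<22}  x={x:+.2f}  y={y:+.2f}")
```

In this toy run, the two pro-vaccine pages, linked only to each other, settle away from the main cluster, while the anti-vaccine and undecided pages pull together -- a miniature version of the pattern the researchers describe.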

To Johnson's surprise, he said, pro-vaccine pages were not well connected, relegated away from the network's central hub, where anti-vaccine pages mingled with undecided pages. Although anti-vaccine pages represent only 4.2 million total users, compared to 6.9 million pro-vaccine users, there are more than twice as many of them as there are pro-vaccine pages: 317 compared to 124. More pages mean more opportunities to link to -- and influence -- undecided pages and their followers.

But the isolation of pro-vaccine advocates is similar to what DiResta has seen in Twitter data, she said. And the difficulty in reaching people is a challenge that public health communicators have long been grappling with, Sell said. Although a recent survey from the Pew Research Center shows that a large majority of Americans agree that the benefits of the MMR vaccine outweigh any risks, even a small unvaccinated minority can influence public health, as experts say achieving herd immunity against measles requires that more than 93% of people be vaccinated.
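That 93% figure is consistent with the textbook herd-immunity threshold, which the article does not spell out. As a rough sanity check, using the standard formula and a commonly cited measles basic reproduction number of about 15 (an assumed value; published measles estimates typically range from roughly 12 to 18):

```latex
% Textbook herd-immunity threshold (not stated in the article itself):
% p_c is the minimum immune fraction, R_0 the basic reproduction number.
p_c = 1 - \frac{1}{R_0}
% Assuming a commonly cited measles value R_0 \approx 15:
p_c = 1 - \tfrac{1}{15} \approx 0.933 \quad (\text{about } 93\%)
```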

Contrary to what some might think, the undecided pages are not passive participants waiting to be convinced one way or another, Johnson said. They "were the ones making a lot of the links," he said. "They're looking for information." And much of that information seems to be coming from anti-vaccine pages.

Indeed, according to the researchers' mathematical model, anti-vaccine pages will dominate the network of pages in a decade. To stem the tide, however, the researchers suggest that this kind of map can help public health advocates target individual pages or groups of nearby pages. Facebook itself could also adjust its algorithms to deprioritize posts that contain misinformation and originate from pages with an anti-vaccine bent.
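The researchers' model is more elaborate than this, but the flavor of such a projection can be sketched with a toy extrapolation. The snippet below is not the Nature paper's model: it simply grows each side's follower total at an invented, constant annual rate (both growth factors are assumptions) and reports when the anti-vaccine total would overtake the pro-vaccine total, starting from the 4.2 million and 6.9 million figures cited above.

```python
# A toy extrapolation, NOT the model from the Nature paper: it assumes
# constant annual growth factors (invented for illustration) for each
# side's total followers and finds the crossover year.
anti_total, pro_total = 4.2e6, 6.9e6    # starting totals cited in the article
anti_growth, pro_growth = 1.12, 1.05    # hypothetical yearly growth factors

years = 0
while anti_total <= pro_total and years < 50:
    anti_total *= anti_growth
    pro_total *= pro_growth
    years += 1

print(f"With these assumed rates, anti-vaccine followers overtake "
      f"pro-vaccine followers after about {years} years.")
```

With these particular made-up rates the crossover lands within roughly a decade; the point is only to show what a "dominates in a decade" projection looks like in its simplest form.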

Social media platforms now try to rein in misinformation by focusing on certain keywords, DiResta said. But this kind of map could give platforms another strategy by targeting the connections and dynamics of a network.

While the analysis focused on vaccinations, the implications go far beyond that, Johnson said. The analysis was done in September 2019, before the pandemic, but its findings are especially concerning to public health experts now, as anti-vaccine activists have been a growing presence at protests calling for an end to statewide stay-at-home orders -- a mixture of messages that could undermine efforts to slow the pandemic.

Anti-vaccine activists are reframing their message to appeal to the belief system of a particular audience, DiResta said. For example, she said, the "Plandemic" video is rooted in the anti-vaccine movement but recast as a conspiracy theory aimed at those who want to reopen the country.

Such reframing is also consistent with what Johnson and his colleagues found. Anti-vaccine pages often discuss a variety of topics and narratives. "They have a lot of flavors," Johnson said, making it easier for undecided people to find an anti-vaccine message that resonates with them. In another paper published this week in the journal IEEE Access, the researchers found that anti-vaccine Facebook pages also discussed many flavors of COVID-19 topics.

One flavor can be hate. Racism against Asians and Asian Americans related to COVID-19, for example, has already spilled into real-world violence. And white supremacists are exploiting the pandemic to promote their cause. In a forthcoming study submitted to a journal, Johnson and his team tracked how misinformation, hate and other malicious content surrounding COVID-19 spreads from one social network platform to another.

"This explains exactly why no platform can take care of hate or misinformation by itself, because it leaks out across platforms," Johnson said.

With so much that's uncertain and unknown about the coronavirus, the public is hungry for information, even if it's unverified, Sell said. "There's an unknown that can be an opportunity for this false information to come in and really solidify itself in the minds of people."

(Editor's Note: This story was republished on December 23, after Inside Science learned that the text had been inadvertently deleted earlier this year.)

Author Bio

Marcus Woo is a freelance science writer based in the San Francisco Bay Area who has written for Wired, BBC Earth, BBC Future, National Geographic, New Scientist, Slate, Discover, and other outlets.