

How Covid misinformation stayed one step ahead of Facebook

Study: When Facebook removes vaccine misinformation, anti-vaxxers quickly regroup.

A new study tracked how anti-vaccine pages and groups responded to Meta’s efforts to take more aggressive moderation action against Covid-19 misinformation.
Jakub Porzycki/NurPhoto via Getty Images
A.W. Ohlheiser is a senior technology reporter at Vox, writing about the impact of technology on humans and society. They have also covered online culture and misinformation at the Washington Post, Slate, and the Columbia Journalism Review, among other places. They have an MA in religious studies and journalism from NYU.

The work of trying to minimize the influence of harmful misinformation is both exhausting and essential. Big pushes, like the one Meta undertook in late 2020 to begin removing more misinformation about Covid-19 vaccines while promoting content from authoritative public health and scientific sources, always seem to come too late, undertaken only in response to public or institutional pressure. And they require a sustained effort that platforms don’t always seem willing to maintain. A question has always lingered in the background of these big public moments when major platforms get tough on online harms: Did these efforts actually work?

A new study, published this week in Science Advances, argues that Meta’s Covid-19 policies may not have been effective. Though Meta’s decision to remove more content did reduce the overall volume of anti-vaccine content on Facebook, the study found that engagement may have “shifted, rather than decreased” outright.

Using data from CrowdTangle, the researchers tracked content from a number of public pages and groups that posted about vaccines, sorted into “pro” and “anti” vaccine sources. Their data, they said, indicates that anti-vaccine influencers know how to dodge enforcement at every level of Facebook’s infrastructure. Followers can keep accessing their content because those influencers exploit both Facebook’s built-in amplification of content users might want to engage with and the vast, inter-platform networks of communities, influencers, and tactics that the anti-vaccine movement has built online over time.

Health misinformation on social media needs to keep moving to stay alive, dodging platform enforcement by changing keywords and adapting euphemisms, or funneling the believers and the curious into newer groups or platforms where their posts are less likely to be removed. Anti-vaccine influencers are skilled at this because they’ve had a lot of practice. By the time Meta began rolling out more robust policies in 2021 that were designed to minimize the influence of misinformation about Covid-19 vaccines, anti-vaccine communities had been building their strategies to remain visible there for years.

Anti-vaccine content had the agility to outpace policy shifts on multiple levels, the paper argues. Public anti-vaccine pages can build connections with each other, and are sometimes run by the same influencers. When one disappears, other connected groups can simply step in and continue to post. That structure can also help members of banned groups find the next, newer version of that space, or link outward to platforms that are more accepting of conspiracy theory-laden content. And finally, individual members of these communities understand the importance of engagement. Anti-vaccine influencers ask for likes and shares in order to maximize their visibility on Facebook, and the believers seem to respond accordingly.

“There’s a broader ecosystem and there’s demand for this content,” said David Broniatowski, one of the study’s authors and an associate professor of engineering at George Washington University. “You can remove specific articles or posts or instances of the content. But the fact that you didn’t see a change in the engagement for the content that remained [on Facebook] goes to show the fact that people are out there and they want this stuff.”

When people do start to seek out anti-vaccine content in the wake of a moderation takedown, the study argues, they might also find themselves pulled into more extreme spaces. As Twitter, YouTube, and Facebook began taking down more anti-vaccine content, researchers saw an increase in links to alternative social media platforms like BitChute, Rumble, and Gab, which are popular with far-right and white supremacist users who might face account bans on mainstream social media sites.

Some of the research here echoes key points that trackers of anti-vaccine and conspiracy theory spaces have raised in the past: Networked misinformation, including misinformation about Covid-19 vaccines, doesn’t exist in isolation. It’s interconnected with and fed by different conspiracy theories, omni-conspiracy theories like QAnon, and political movements. Addressing it will take more than takedowns and account bans. Speaking about QAnon a couple of years ago, Renee DiResta, research manager at the Stanford Internet Observatory and an expert in online disinformation, told me that meaningfully addressing the webs of conspiracy-laden misinformation across social media would require “rethinking the entire information ecosystem.”

Although Broniatowski said he did not advocate for any specific policy recommendations to more effectively combat misinformation, he suggested that one possible avenue for addressing Meta’s infrastructure would be to treat Facebook’s architecture more like a building, governed by science- and safety-informed codes, as opposed to an open mic night with a code of conduct for performers. “We think about [misinformation] as the content, but we don’t necessarily think about it as the infrastructure or the system as a whole,” he said.

“I do think that you can get together people from the platforms, people from these civil society organizations, people from the various different government entities that are involved in some way with these sorts of harms, with observing these sorts of harms, and have a consensus-building civil discussion regarding what is it that we’re gonna do in order to make this a safe and enjoyable experience,” Broniatowski said.

The researchers noted that their data provides a limited snapshot of this ecosystem, capturing only public spaces on Facebook that have strong affiliations with a set of vaccine-related keywords. Excluded are the many private and hidden groups that form a core gathering space on Facebook for anti-vaccine and alternative medicine followers, along with public-facing pages that have adapted coded language for discussing these topics in order to avoid being flagged by Meta’s moderating systems. And while the study does track links out to other platforms, it does not capture what users are finding once they end up outside of Facebook.

The study covered the 16 months from November 2020 through February 2022, and the fight against misinformation about Covid-19 vaccines has continued to evolve since then. Facebook remains an important gathering space for the believers and promoters of health misinformation, but other platforms like TikTok have become more popular for reaching new audiences. In June, Facebook discarded some of its rules prohibiting Covid-19 misinformation in some regions, including the US. And Meta may be trying a new approach on its latest platform: Earlier this week, the Washington Post reported that Meta-owned Threads was intentionally blocking some pandemic-related search terms entirely, including “covid-19,” “long covid,” “vaccines,” and “vaccination.”
