Nearly two months into the largest vaccine rollout in U.S. history, Instagram continued to prominently feature anti-vaccination accounts in its search results, while Facebook groups railing against vaccines remained easy to find.
Facebook has long struggled with addressing anti-vaxxer content. Late last year, it established new rules to tackle COVID-19 vaccine misinformation, after pledging two years ago to reduce the spread of anti-vaxxer content. But misleading and fearmongering content about the COVID-19 vaccines, as well as outright misinformation, continues to spread on the platform at a time when the stakes couldn't be higher: misinformation about the vaccine can mean life or death.
Four of the top 10 search results for "vaccine" on Facebook-owned Instagram were for anti-vaccination accounts, including "vaccinetruth," "vaccinefreedom," "antivaxxknowthefacts," and "cv19vaccinereactions," according to a series of searches conducted by CNN Business from multiple Instagram accounts beginning two weeks ago.
Shortly after, Instagram updated its search interface on mobile devices to showcase three credible results, including the CDC's account, followed by a "See More Results" prompt. Users who click on that option are then shown a number of anti-vaccination accounts, in what is arguably the digital equivalent of shoving a bedroom's mess under the bed.
Some of those accounts have amassed sizable followings, raising the question of whether Instagram suggesting them as a top result for users simply seeking out vaccine information helped them grow an audience. The "cv19vaccinereactions" account, which is devoted to documenting claims of adverse reactions to the vaccine, boasts more than 77,000 followers. The account often shares unsubstantiated reports and insinuates unproven links between people getting the COVID-19 vaccine and major health events, including a stroke or a miscarriage.
The fact that some of this anti-vaxx content continues to hide in plain sight on the platforms highlights a controversial distinction in Facebook's approach: A company spokesperson says Facebook distinguishes between vaccine misinformation specifically, which it does crack down on, and posts that express a more general anti-vaccine sentiment, which it allows on the platform.
In December, Facebook said it would remove claims about coronavirus vaccines that have been debunked by public health officials, including baseless conspiracy theories that they contain microchips. Previously, Facebook's policies banned misinformation about COVID-19 that "contributes to the risk of imminent violence or physical harm."
Public health experts have said they fear misinformation about COVID-19 vaccines and anti-vaccination content generally on social media could lead to people declining to get the shot. "If they're scared away by falsehoods perpetuated through social media, we'll have a real problem of getting out of this pandemic," said Dr. L.J. Tan, chief strategy officer of the Immunization Action Coalition (IAC).
Joe Osborne, a Facebook spokesperson, said the company has been working to "reduce the number of people who see false information" about vaccines and it's trying to do "more to address other misleading vaccine content that falls outside of these policies."
Osborne added that the company removes claims about the COVID-19 vaccine that have been debunked by public health experts and adds labels and reduces the distribution of other misinformation determined to be false by its third-party fact-checking partners.
When a measles outbreak swept across the U.S. nearly two years ago, Facebook moved to curb vaccine misinformation by limiting such content's reach on its platforms, but stopped short of banning it completely. In March 2019, Facebook said it would "reduce the ranking of groups and Pages that spread misinformation about vaccinations" by not including them in recommendations or in predictions when users type into the search bar. But two months later, Instagram was still serving up posts from anti-vaccination accounts and anti-vaccination hashtags to anyone searching for the word "vaccines."
While Facebook removed a group dedicated to anti-vaccine content in November 2020, CNN Business found that more than 20 anti-vaxxer groups remain on the platform, with membership ranging from a few hundred to tens of thousands of users. (The company said the group it removed in November was flagged for violating its policy on recidivism -- which stops group administrators from creating another group similar to one the company removed -- as well as for violating other company policies.)
When searching for the word "vaccine" on Facebook's groups feature last week, three of the top 20 results surfaced by the platform led to groups promoting anti-vaccine content, including groups called "Say No COVID-19 Vaccine," "COVID-19 Vaccine Injury Stories" and "Vaccine Talk: A Forum for Both Pro and Anti Vaxxers" -- which has more than 50,000 members.

The list fluctuates. A few days later, none of these groups were in the top 20, but results 18 through 20 pointed to groups discussing side effects of the vaccine or adverse reactions. Scrolling down further, it was easy to find other anti-vaxxer groups in the search results, including one titled "Unvaccinated and Thriving," which makes widely and consistently debunked claims in its description linking vaccines with autism and other disorders and diseases.

It's unclear what powers Facebook's search recommendations and why the results change day to day. Facebook did not offer a clear explanation after repeated requests for comment.
Dr. Wafaa El-Sadr, a professor of epidemiology and medicine at Columbia University's Mailman School of Public Health, called vaccine misinformation on social media "very dangerous" and said it could have "dire consequences."
"We are in a race with the virus," she said. "We need everyone who's eligible for the vaccines to get vaccinated as soon as possible."
One public Facebook group, which has more than 58,000 members, is devoted to posts about supposed "vaccine injuries and reactions." Several recent posts on the group's page include links that have been marked as "false information" by Facebook's independent fact checkers or have a label saying "Missing Context. Independent fact-checkers say this information could mislead people." One link that was shared -- and flagged by independent fact-checkers -- claimed that 53 people died in Gibraltar because of the COVID-19 vaccine. Despite the warning labels, members of the group continue to engage with these links, express their doubts about Facebook's fact checkers and share unsubstantiated stories or theories about vaccines being dangerous.
"A story doesn't have to be accurate to change minds. That's what we're fighting against right now," IAC's Tan said. "In the age of the internet, science is not the most compelling story."
Columbia's El-Sadr cautioned people to be wary of anecdotes or individual stories they read in such Facebook groups -- which may or may not be true or have any link to the vaccine.
"The vast majority of people thus far have had completely uneventful vaccinations," she said. "We need to keep reminding people of this. These vaccines have had a very safe profile and are incredibly effective."