In October, the organization launched a similar project to peel back the curtain on anti-vaccine activism online. With funding from the New York State Health Foundation, PGP purchases data from Zignal Labs—petabytes of public posts on Facebook, Twitter, YouTube, blogs, web forums, and other websites. PGP’s team of social network analysts parses the data and monitors it using a dashboard that identifies power users, conversational themes, and emerging misinformation hot spots.
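PGP hasn’t published how its dashboard flags “power users,” but a common starting point in social network analysis is to rank accounts by how often others amplify them. The sketch below is a toy illustration of that idea only—the function name, the interaction data, and the scoring rule are all hypothetical, not VECTR’s actual method:

```python
from collections import Counter

def top_power_users(interactions, k=3):
    """Rank accounts by how often other accounts amplify them.

    `interactions` is a list of (source, target) pairs meaning
    `source` reshared or replied to content from `target`.
    Accounts whose posts are amplified most often rank highest.
    """
    amplified = Counter(target for _, target in interactions)
    return amplified.most_common(k)

# Toy data: who reshared whom
interactions = [
    ("a", "hub1"), ("b", "hub1"), ("c", "hub1"),
    ("a", "hub2"), ("d", "hub2"),
    ("e", "f"),
]

print(top_power_users(interactions, k=2))
# [('hub1', 3), ('hub2', 2)]
```

Real systems would weight this by audience size and cross-platform reach, but even simple amplification counts tend to surface the handful of hubs that drive most of a conversation.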
PGP had been in talks with BIO before the pandemic hit about leveraging this tool, which they call VECTR, for a national pro-vaccine campaign. But the uptick in vaccine opposition driven by coronavirus conspiracies lent the effort a new urgency. In a press conference last month, Smyser said that social media messaging urging Americans to reject immunizations has tripled since the pandemic began—from 5,400 mentions per day to 14,400. VECTR revealed that this spike has been driven by increasing connections between a core group of about 200 leaders of the anti-vaccine movement—mostly concerned parents—and conspiracy-peddling cults like QAnon, as well as the conservative operators behind anti-lockdown protests. “All of a sudden we’re seeing a lot of cause-stacking going on,” says Smyser.
To combat this maelstrom of misinformation, PGP plans to recruit people who think vaccines are a vital public good and who have a track record of activism—doing things like signing petitions or attending demonstrations. Those people will sign up to receive information about how to combat vaccine-related misinformation when they see it. When a particular piece of misinformation is either about to be seen by millions of people, or has just reached that point, alerts will go out from PGP telling these volunteers what to look for and recommending language for counter-posting. The idea is to create an army of debunkers who can quickly be mobilized into action. Having enough people is paramount; there’s safety in numbers when posting on a topic that’s likely to spawn a vigorous online battle.
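The trigger described above—alerting when a post has reached a mass audience, or is about to—amounts to a simple threshold check on current and projected reach. This is a minimal sketch of that logic; the threshold, the linear projection, and the function itself are illustrative assumptions, not details PGP has disclosed:

```python
ALERT_THRESHOLD = 1_000_000  # hypothetical audience cutoff

def should_alert(views_so_far, views_per_hour, hours_ahead=24):
    """Fire an alert if a post has reached the threshold audience,
    or is projected (by naive linear extrapolation) to reach it
    within the look-ahead window."""
    projected = views_so_far + views_per_hour * hours_ahead
    return views_so_far >= ALERT_THRESHOLD or projected >= ALERT_THRESHOLD

print(should_alert(50_000, 45_000))  # True: projected 1,130,000 views
print(should_alert(50_000, 10_000))  # False: projected 290,000 views
```

In practice, viral spread is closer to exponential than linear in its early phase, so a production system would model growth rate rather than extrapolate a flat hourly average—but the "about to be seen by millions" condition reduces to a forecast crossing a threshold either way.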
A common tactic of vaccine opponents is to “swarm” people who speak up for immunization programs, such as public officials, physicians, or scientists. During a swarm, a few power users will send legions of followers to attack that person on different social media platforms, bombarding their profiles with an avalanche of consistent anti-vaccine messaging. If that doesn’t do the trick, intimidation campaigns can be escalated—editing Wikipedia pages into smear screeds, burying doctors’ reviews on Google and other sites under downvotes, doxxing doctors and scientists, and even issuing death threats. Smyser’s idea is for Stronger followers to “show some love” to individuals who come under attack, flooding their pages and social media feeds with evidence-based vaccine information, writing positive reviews, and reporting abusive users.
Similar tactics have been taken up by a group of doctors, nurses, and other health care workers, called Shots Heard Round the World, which The New York Times profiled in March. But Smyser wants to go even bigger, recruiting thousands of everyday people who have some free time now and then to join the digital scrum. “In the anti-vaccine world we see about 129 people generating talking points for hundreds of thousands of people,” says Smyser. “These people call their legislators. They call up public health officials to harass them. We want to train up an army to use the same kinds of tools, but with facts and science on their side.”
But will it actually work? That will come down to the specifics.
It’s taken a while for researchers to come to consensus on how effective debunking actually is. In large part that’s because of a phenomenon called the “backfire effect”: the notion that attempts to correct a misperception cause people to dig in their heels and become more entrenched in their beliefs. This term gained notoriety from an influential 2010 study called “When Corrections Fail,” led by University of Michigan public health policy researcher Brendan Nyhan. More recent studies have called this finding into question, and Nyhan himself has written that the backfire effect is probably quite rare and that, in general, debunking “can make people’s beliefs in specific claims more accurate.”