How ‘prebunking’ can fight fast-moving vaccine lies

Vaccines have played a vital role in slowing rates of new infection, hospitalization and death due to COVID-19. But misinformation threatens that progress by undermining confidence in the vaccines and the science behind them.

Fact-checking and debunking can combat false news spread on social media, but they are simply not enough. Correcting falsehoods leaves a relatively small number of experts and researchers to tackle a mountain of misinformation. And once a lie is out on the internet, it’s often impossible to contain.

“Sowing doubt is much easier than resolving doubt,” said political scientist Adam Berinsky, who directs the Political Experiments Research Lab at Massachusetts Institute of Technology.

Rather than chasing lies and trying to persuade people who have already been convinced by them, researchers and public health officials are exploring ways to get ahead of those lies through a social psychology tool called prebunking. During the 2020 presidential election, Twitter even gave it a shot.

Social scientists are testing how prebunking can be used to keep people from believing misinformation, but questions remain about how effective this tool can be and whether it could have widespread reach.

What is prebunking?

The idea of prebunking can be traced to social psychologist William McGuire’s research during the 1960s. Inspired by how vaccines teach the body to fend off disease, McGuire argued that attitude inoculation, also called the theory of psychological immunization, could build up a person’s resistance to a persuasive but flawed idea by exposing them to weaker versions of that same idea that they learn to disarm.

More than five decades later, another social psychologist named Sander van der Linden stumbled across McGuire’s research in a library while exploring ways to snuff out misinformation about climate change. For years, recycled myths easily spread through social and mainstream media, fueling climate denial, said van der Linden, who directs the Cambridge Social Decision-Making Laboratory at the University of Cambridge in the United Kingdom.

Often, these ideas returned despite being disproved, van der Linden said, because they bore one or more traits that help lies gain traction: polarizing messages designed to pit people against each other, fake experts amplifying a false message, emotional appeals and conspiracy theories.

The internet’s rise has only multiplied the ways that misinformation can spread like a viral pathogen, van der Linden said. Indeed, research suggests false news spreads six times faster than the truth. False information also enjoys a competitive advantage: It’s easier to produce and is designed for sensationalism and maximum sharing. Fact-checkers, by contrast, must find evidence to disprove a flawed claim, then write up and share their arguments in articles, on social media and through television and radio appearances. And those who encountered the original false information may never see it debunked, and so may never learn the truth. The work never ends.

Rather than squander limited resources on endless rounds of whack-a-mole, van der Linden said McGuire’s ideas about neutralizing bad information before it spreads “seemed even more relevant now,” in the digital age.

Joined by Jon Roozenbeek, a social psychologist at the University of Cambridge, van der Linden designed games to prime people’s ability to spot misinformation by offering a glimpse into what motivates someone to spread falsehoods in the first place. In a web-based game called Bad News, a player pretends to be a “disinformation and fake news tycoon” who is invited to smear the government or mainstream media on Twitter. Each tweet earns more followers and grows the player’s influence.

The game coaches a player to build their “fake news empire” by creating a fake official Twitter account for themselves, impersonating a famous person, and launching a news site or blog — all of which are tactics used by real-life motivated individuals to misinform people.

Money and power often drive these bad actors, van der Linden said. His hope is that the game will make people aware of those motivations, so they become more cautious when they consume news and more discerning about which messengers they trust.

According to a 2020 study published through the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, the Bad News simulation of fake tweets significantly reduced people’s vulnerability to impersonation, conspiracy and discrediting information.

Vaccinating the mind against vaccine myths

Roughly half of all Americans ages 12 or older — 140.4 million people — are fully vaccinated against the virus, according to the latest data from the Centers for Disease Control and Prevention. President Joe Biden has said he wants at least 70 percent of U.S. adults to have received at least one dose of a vaccine by July 4.

As often happens in the United States, polarized politics may obstruct further progress. According to a May 17 PBS NewsHour/NPR/Marist poll, 41 percent of Republicans said they do not plan to get vaccinated against the coronavirus. By comparison, 4 percent of Democrats felt the same way. Overall, 24 percent of Americans said they had no intention of rolling up their sleeves.

Partisan politics are not the only problem. Nine out of 10 adults struggle to understand information about health when it is unfamiliar, complex or riddled with jargon, according to the CDC. The coronavirus pandemic has forced many people to take a real-time crash course on public health and epidemiology, ranging from evolving research about the need for face masks to interpreting case data and vaccine trial results.

Uncertainty about the pandemic has inspired a proliferation of false or misleading information about vaccine safety and efficacy on the internet. People can easily and quickly design and scatter clever messages and funny memes on social media, which gain momentum while spreading doubt about life-saving vaccines. Public health experts at the CDC are exploring prebunking’s potential as one way to reach people before conspiracy theories do.

While games tailored to the coronavirus pandemic remain in early phases, they are being considered as part of a broader strategy to boost people’s overall health literacy, said Neetu Adab, a behavioral scientist on the CDC’s Demand for Immunization Team.

It would not be the first time the federal government has resorted to such methods to get ahead of misinformation. During the 2020 election, van der Linden and Roozenbeek developed a game called Harmony Square for the Department of Homeland Security to demonstrate what motivates people who drive polarization.

“We’re all trying to find the things that work,” Adab said.

To counter false claims about vaccines, the World Health Organization commissioned van der Linden to develop GoViral!, a game built on a premise similar to Bad News but tethered to the coronavirus pandemic. When a player enters the game, they are encouraged to “walk a mile in the shoes of a manipulator to get to know their tactics from the inside” and “see it as ruining the magician’s trick so that you don’t fall for it next time around.” Going down a simulated social media rabbit hole, players learn how filter bubbles create echo chambers of false information and how to manipulate negative emotions to stoke outrage and build influence.

In an environment with “good information hygiene,” van der Linden said, conspiracy theories and misinformation are less likely to take hold in people’s imaginations, and people are more likely to make informed decisions that protect public health.

“We want to achieve some sort of herd immunity where enough people are psychologically vaccinated that misinformation won’t have a chance to spread and infect people,” he said.

But the game’s effect may taper over time. Van der Linden pointed to research suggesting that people begin to let their guard down on evaluating social media posts a few weeks after playing the game if they haven’t played again or otherwise received a cognitive booster shot, such as being reminded about the need to watch out for misleading information.

Filling the misinformation toolbox

Critics warn that prebunking may — by introducing questions about, for example, vaccine safety and climate change — sow the very ideas it is supposed to uproot. Van der Linden acknowledges the validity of that concern but counters that current options to fight misinformation, such as fact-checking and debunking, are not risk-free, either. Research suggests that misinformed people are the least likely to benefit from fact-checking (especially when they are confident that they are right), and that trying to prove a belief wrong can invite some people to become more deeply entrenched in their side of an issue. Fact-checking has become politicized, Roozenbeek said.

Plus, those efforts require far more time than it takes for a person to crank out disinformation about vaccine safety. While the race to protect the truth can feel like a losing one, the pandemic has shown the costs of sitting idle and letting lies go unchecked. Social scientists like van der Linden and Roozenbeek say new ideas about how to guard truth in the public interest are still worth exploring and testing.

“If we didn’t prebunk, what would we do?” he asked.

Prebunking through GoViral! allows players to see the world through the eyes of misinformers, but it does not go so far as to convert them into bad actors, van der Linden said. With that newfound awareness, van der Linden and Roozenbeek hope, prebunking can prevent people from being taken in by lies in the first place because they may recognize messages designed to manipulate the public.

But the game simply won’t reach everyone. Of the billions of people in the world, roughly 1 million have played Bad News, van der Linden said. Roozenbeek said it is important for researchers to understand the limited reach of games and continue to look for better ways to lend truth the upper hand.

“Not everyone is going to sit down and play this game,” Roozenbeek said. “The vast majority are not.”

Berinsky thinks this approach can join a toolbox of strategies to blunt the effects of bad information. But Berinsky also cautions against zealous calls for people to adopt greater skepticism before believing what they read, see or hear. An overly skeptical person can end up believing in nothing, which can be as problematic as someone who believes everything, he said.

Twitter tried a one-two punch of prebunking and debunking during the 2020 election, when then-President Donald Trump repeatedly spread misinformation about voting by mail and claimed, without significant evidence, that the election was stolen from him. To counter Trump and prevent people from losing faith in the election’s integrity, Twitter flagged misleading tweets and offered links to factual resources so people could learn more about mail-in voting. The social network also “got ahead of potentially misleading information by showing everyone on Twitter in the U.S. a series of pre-bunk prompts.” Placed atop users’ home timelines and in search, the prompts affirmed the security and legitimacy of voting by mail.

But Trump’s falsehoods outlasted his time in the White House. Months after Biden won the election, 31 percent of Americans (including 70 percent of Republicans) still did not believe he was legitimately elected, according to a Jan. 19 PBS NewsHour/NPR/Marist poll. Prebunking has only limited potential to thwart that kind of doubt, Berinsky said.

These measures must go deeper, said David Rand, associate professor of management science and brain and cognitive sciences at Massachusetts Institute of Technology. The same thing that makes false claims go viral — sensationalism — often serves as a clue that a message may not be true, he said.

“Put everything you read through a smell test,” Rand said. But speed, distraction and emotions — all key elements of information consumption on social media — can obscure a person’s ability to sniff out misinformation and leave them “more inclined to believe falsehoods when they encounter them.”

The time and interest that prebunking demands may pose hurdles, Rand said. He thinks more can be done with trusted messengers, such as wielding the influence of a prominent Republican, to promote vaccines and dispel misinformation before it reaches its intended audience.

As with the pandemic, time is an enemy in the fight against misinformation, van der Linden said: “The longer that misinformation sits in people’s brains, the more entangled it becomes.”