Misinformation Really Does Spread Like a Virus, Suggest Mathematical Models Drawn from Epidemiology

Looking for Solutions
Mathematical modelling typically involves either phenomenological research (where researchers describe observed patterns) or mechanistic work (which makes predictions based on known relationships). These models are especially useful because they allow us to explore how possible interventions may help reduce the spread of misinformation on social networks.

We can demonstrate this basic process with the simple model shown in the graph below, which lets us explore how a system might evolve under a variety of hypothetical assumptions that can then be tested against data.
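As a rough illustration of what such a model looks like in practice, here is a minimal sketch of a standard SIR-style (susceptible/infected/recovered) contagion model applied to misinformation. The function name and all parameter values are hypothetical choices for demonstration, not the values used in the authors' model.

```python
# Minimal SIR-style sketch of misinformation spread (illustrative only;
# parameter values are hypothetical, not taken from the article's model).
def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=100):
    """Euler-step a susceptible/infected/recovered contagion model.

    beta:  transmission rate (exposure to "infected" sharers)
    gamma: recovery rate (people who stop believing/sharing)
    """
    s, i, r = s0, i0, 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i
        recoveries = gamma * i
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        history.append((s, i, r))
    return history

history = simulate_sir()
peak_infected = max(i for _, i, _ in history)
```

Varying `beta` and `gamma` here corresponds to the "hypothetical assumptions" mentioned above: the simulated curves can then be compared against observed sharing data.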

Prominent social media figures with large followings can become “superspreaders” of election disinformation, blasting falsehoods to potentially hundreds of millions of people. This reflects the current situation where election officials report being outmatched in their attempts to fact-check misinformation.

In our model, we conservatively assume that people have just a 10% chance of infection after exposure, and that debunking misinformation has only a small effect, in line with existing studies. Under this 10% infection scenario, the population infected by election misinformation still grows rapidly (orange line, left panel).
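A brief back-of-the-envelope sketch of why a modest 10% per-exposure probability can still produce rapid growth: what matters is the product of that probability and how many people each "infected" user exposes. The contact figure below is a hypothetical illustration, not a measured value.

```python
# Sketch: a 10% per-exposure infection chance still drives fast growth
# when each infected user exposes many contacts (contact count is a
# hypothetical illustration, not a measured value).
p_infect = 0.10        # chance of "infection" after a single exposure
contacts_per_day = 25  # hypothetical exposures caused per infected user per day
r0_like = p_infect * contacts_per_day  # expected new cases per case: 2.5

infected = 1.0
trajectory = [infected]
for day in range(10):
    infected *= r0_like   # naive early-phase growth, ignoring saturation
    trajectory.append(infected)
# after 10 days, a single seed has grown to thousands of cases
```

The point of the sketch is that the per-exposure probability on its own understates the dynamics: superspreaders with large audiences multiply even a small probability into rapid exponential growth.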

Psychological ‘Vaccination’
The viral spread analogy for misinformation is fitting precisely because it allows scientists to simulate ways to counter its spread. These interventions include an approach called “psychological inoculation”, also known as prebunking.

This is where researchers preemptively introduce, and then refute, a falsehood so that people gain future immunity to misinformation. It’s similar to vaccination, where people are introduced to a (weakened) dose of the virus to prime their immune systems against future exposure.

For example, a recent study used AI chatbots to come up with prebunks against common election fraud myths. This involved warning people in advance that political actors might manipulate their opinion with sensational stories, such as the false claim that “massive overnight vote dumps are flipping the election”, along with key tips on how to spot such misleading rumors. These ‘inoculations’ can be integrated into population models of the spread of misinformation.

You can see in our graph that if prebunking is not employed, it takes much longer for people to build up immunity to misinformation (left panel, orange line). The right panel illustrates how, if prebunking is deployed at scale, it can contain the number of people who are disinformed (orange line).
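One way prebunking can be represented in a contagion model is by moving a fraction of the population directly into the "immune" class before the misinformation starts spreading, just as vaccination does in epidemiological models. The sketch below compares outbreak sizes with and without a prebunked fraction; all parameter values are hypothetical.

```python
# Sketch: prebunking modelled as pre-immunizing a fraction of the
# population in an SIR-style contagion model (hypothetical parameters).
def ever_infected(beta=0.3, gamma=0.1, prebunked=0.0, days=300):
    """Return the fraction of the population ever 'infected' by the rumor."""
    s = 1.0 - prebunked - 0.01  # susceptible pool, minus initial believers
    i = 0.01                    # initial believers/sharers
    total = 0.01                # cumulative ever-infected fraction
    for _ in range(days):
        new = beta * s * i      # new believers this step
        s -= new
        i += new - gamma * i    # believers minus those who stop sharing
        total += new
    return total

no_prebunk = ever_infected(prebunked=0.0)
with_prebunk = ever_infected(prebunked=0.5)
# prebunking half the population sharply reduces the final outbreak size
```

This mirrors the two panels described above: without prebunking the outbreak sweeps through most of the population, while deploying it at scale contains the number of people who end up disinformed.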

The point of these models is not to make the problem sound scary or suggest that people are gullible disease vectors. But there is clear evidence that some fake news stories do spread like a simple contagion, infecting users immediately.

Meanwhile, other stories behave more like a complex contagion, where people require repeated exposure to misleading sources of information before they become “infected”.
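The simple-versus-complex distinction can be captured with a threshold rule: in a simple contagion one exposure can be enough, while in a complex contagion a person only adopts the belief after repeated exposures cross a threshold. The threshold value below is a hypothetical illustration.

```python
# Sketch of the simple vs complex contagion distinction
# (threshold value is hypothetical, for illustration only).
def is_convinced(exposures, threshold=3):
    """Complex contagion: adoption requires repeated exposure."""
    return exposures >= threshold

simple_case = is_convinced(1, threshold=1)  # simple contagion: one exposure suffices
complex_once = is_convinced(1)              # one exposure is not enough
complex_many = is_convinced(4)              # repeated exposure tips the person over
```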

The fact that individual susceptibility to misinformation can vary does not detract from the usefulness of approaches drawn from epidemiology. For example, the models can be adjusted depending on how easy or difficult it is for misinformation to “infect” different sub-populations.

Although thinking of people in this way might be psychologically uncomfortable for some, most misinformation is diffused by small numbers of influential superspreaders, just as happens with viruses.

Taking an epidemiological approach to the study of fake news allows us to predict its spread and model the effectiveness of interventions such as prebunking.

Some recent work validated the viral approach using social media dynamics from the 2020 US presidential election. The study found that a combination of interventions can be effective in reducing the spread of misinformation.

Models are never perfect. But if we want to stop the spread of misinformation, we need to understand it in order to effectively counter its societal harms.

Sander van der Linden is Professor of Social Psychology in Society, University of Cambridge. David Robert Grimes is Assistant Professor of Biostatistics, Public Health & Primary Care, Trinity College Dublin. This article is published courtesy of The Conversation.