RADICALIZATION

How People Get Sucked into Misinformation Rabbit Holes – and How to Get Them Out

By Emily Booth and Marian-Andrei Rizoiu

Published 26 February 2024

As misinformation and radicalization rise, it’s tempting to look for something to blame: the internet, social media personalities, sensationalized political campaigns, religion, or conspiracy theories. And once we’ve settled on a cause, solutions usually follow: do more fact-checking, regulate advertising, ban YouTubers deemed to have “gone too far.”

However, if these strategies were the whole answer, we should already be seeing a decrease in people being drawn into fringe communities and beliefs, and less misinformation in the online environment. We’re not.

In new research published in the Journal of Sociology, we and our colleagues found that radicalization is a process of increasingly intense stages, and that only a small number of people progress to the point where they commit violent acts.

Our work shows the misinformation radicalization process is a pathway driven by human emotions rather than the information itself – and this understanding may be a first step in finding solutions.

A Feeling of Control
We analyzed dozens of public statements, from newspapers and online sources, in which formerly radicalized people described their experiences. We identified different levels of intensity in misinformation and its online communities, each associated with common recurring behaviors.

In the early stages, we found people either encountered misinformation about an anxiety-inducing topic through algorithms or friends, or went looking for an explanation for something that gave them a “bad feeling”.

Regardless of how they arrived, they often reported finding the same things: a new sense of certainty, a new community they could talk to, and a feeling that they had regained some control over their lives.

Once people reached the middle stages of our proposed radicalization pathway, we considered them to be invested in the new community, its goals, and its values.

Growing Intensity
It was during these more intense stages that people began to report more negative impacts on their own lives. This could include the loss of friends and family, health issues caused by too much time spent on screens and too little sleep, and feelings of stress and paranoia. To soothe these pains, they turned again to their fringe communities for support.

Most people in our dataset didn’t progress past these middle stages. However, their continued activity in these spaces kept the misinformation ecosystem alive.

When people did move further and reach the extreme final stages in our model, they were doing active harm.