How to Avoid Extremism on Social Media

Evans, Williams, and other RAND researchers had planned to study online manifestos posted by far-right extremists in the days and hours before acts of violence. But as they started their search, they realized there was no good way to identify sites that provide safe harbor for such content. They changed direction and started working on a ratings system for websites and social media platforms based on how receptive they are to extremist content.

The researchers looked at traffic volume, ownership information, and the presence or absence of advertising. They dug into content policies and awarded extra points to sites that actually enforce them. They added more points if a site had never been shut down by its service providers. They deducted points for swastikas or other extremist symbols.

In the end, the sites with the most points—think Facebook or Twitter—landed in a category the researchers called “mainstream.” That didn’t mean they were free of extremist content; far from it. But that content wasn’t their main reason for being. At the other extreme were “niche” sites like Stormfront or 8chan, for which it was.
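The scoring and binning approach described above can be sketched in code. To be clear, this is an illustrative reconstruction only: the criteria names, point values, and category thresholds below are assumptions for the sake of the example, not RAND's actual scorecard.

```python
def score_site(site: dict) -> int:
    """Award points for signals of a mainstream, moderated site.

    Criteria mirror those named in the article; weights are hypothetical.
    """
    points = 0
    if site.get("high_traffic"):
        points += 1  # large, general-purpose audience
    if site.get("transparent_ownership"):
        points += 1  # ownership information is public
    if site.get("carries_advertising"):
        points += 1  # advertisers imply brand-safety pressure
    if site.get("has_content_policy"):
        points += 1
        if site.get("enforces_policy"):
            points += 1  # extra credit for actually enforcing the policy
    if not site.get("ever_deplatformed"):
        points += 1  # never shut down by its service providers
    if site.get("hosts_extremist_symbols"):
        points -= 2  # e.g., swastikas or other extremist symbols cost points
    return points


def categorize(points: int) -> str:
    """Bin a score into the three categories the researchers used."""
    if points >= 5:
        return "mainstream"
    if points >= 2:
        return "fringe"
    return "niche"
```

A heavily moderated, advertiser-supported site would score near the top and land in "mainstream"; a site with hidden ownership, no enforced policy, and extremist symbols would fall into "niche"; the in-between cases are "fringe."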

But then there were the sites in the middle. The researchers called them "fringe." They hosted a mix of extremist and non-extremist content, often under the banner of protecting free speech and standing up to what they described as censorship on the mainstream platforms. Some, like Gab, are designed to look almost exactly like a mainstream site, down to the fonts they use.

“People sometimes fall into extremist material on these sites; they don’t understand what it is because it’s coded or hides its violent intent behind humor or memes,” said Williams, a senior policy researcher at RAND. “We wanted to give individuals and communities a better tool to help them appreciate when they could be interacting with extremist content.”

The researchers used their scoring system to identify dozens of sites that could host all manner of extremists: anti-government militia members, neo-Nazis, White supremacists. They also included incels—viciously misogynistic “involuntary celibates” who blame women for their inability to find a partner, and who sometimes get overlooked as ideological extremists. On some sites, the researchers found content that was so disturbing, they decided it was probably criminal.

Companies that host social media sites could use RAND’s scorecard as a checklist to strengthen their defenses against extremist content, if they wanted to. Advertisers and other service providers could also use it to decide which sites they want to do business with and which they want to avoid.

The scorecard also gives everyday users a way to anticipate what kind of content they might find on an unfamiliar website—especially on a "fringe" website, where that might not be obvious. In that way, it supports one of the key pillars of the nation's strategy to combat domestic terrorism: making people more careful, more skeptical consumers of the content they find online.

“This isn’t an impossible problem,” Evans said. “We know there are things sites can do to make it more difficult for these groups to find each other or to organize or to attract large audiences. But consumers also need to become more informed about what they are consuming online. Maybe this is a way for individuals to think about what they expect and what they can petition companies to do.”

Acacia Dietz knows how slippery the slope can be. She was following news of social justice protests several years ago when she stumbled on a site with a seemingly simple premise: Nobody should feel guilty about their heritage. It was her door into the American neo-Nazi movement.

She got out in 2019, having watched in horror as a gunman who espoused the same White supremacist beliefs stormed mosques in Christchurch, New Zealand, and murdered 51 people. She works now as the managing director of Beyond Barriers, a group that works to prevent people from joining extremist movements and helps them deradicalize when they do. As part of that work, she still monitors social media sites to see what extremists are talking about—and with whom. She sees teenagers as young as 15 in some of those chat rooms.

“It looks pretty innocent. It’s not until you actually get in there and start talking to people that you realize, wait a minute, this is not what it looks like,” she said. “It’s very easy for individuals who are just curious, just looking, to get sucked in. That’s a lot more common than what most people would want to admit.”

Doug Irving is a communications analyst at RAND. This article is published courtesy of RAND.