QAnon Conspiracies on Facebook Could Prompt Real-World Violence

Published 20 August 2020

As Facebook continues to grapple with hate speech and violent extremism across the platform, QAnon conspiracy theorists are using public and private Facebook pages and groups to spread disinformation, racism, and thinly veiled incitement to violence. This conspiracy is estimated to have a Facebook audience of millions of users.

QAnon, the wide-reaching conspiracy theory popular among a range of right-wing extremists and supporters of President Trump, follows an anonymous personality known as “Q.” Followers of Q believe a pedophiliac global elite—referred to as “The Deep State” or “The Cabal”—controls world governments, the banking system, the Catholic Church, the agricultural and pharmaceutical industries, the media, and the entertainment industry in order to keep people poor, ignorant, and enslaved. According to the theory, President Trump will bring about justice by ousting this global elite.

While ADL does not believe that all QAnon adherents are inherently extremist, the public proliferation of these conspiracy theories is dangerous. To date, QAnon followers have been linked to multiple instances of real-world criminality carried out in the name of the conspiracy, including murder, vandalism, arson, kidnapping, terrorism, and assault with a dangerous weapon (firearms). In a 2019 bulletin warning of the danger of conspiracy theories like QAnon, the FBI wrote, “it is logical to assume that more extremist-minded individuals will be exposed to potentially harmful conspiracy theories, accept ones that are favorable to their views, and possibly carry out criminal or violent actions.”

The FBI has assessed that “significant efforts by major social media companies and websites to remove, regulate, or counter potentially harmful conspiratorial content” could mitigate these theories’ harmful impact. Earlier this year, Facebook removed a small fraction of QAnon groups and users as part of a crackdown on “coordinated inauthentic behavior” ahead of the 2020 elections. However, those efforts appear to have focused on groups and individuals who deliberately misrepresented their identities, rather than on the potentially dangerous disinformation these users spread. In early August 2020, Facebook announced that it had removed another large public QAnon group, citing “bullying and harassment, hate speech, and false information that could lead to harm.”