• Capitol Riot Exposed QAnon’s Violent Potential

    Many followers of the QAnon conspiracy theory see themselves as digital warriors who, from the convenience of their keyboards, battle an imaginary cabal of Satan-worshipping pedophiles who rule the world. But the January 6 U.S. Capitol riot by supporters of former President Donald Trump exposed the potential for violence in a movement that reared its head on the fringes of the internet in 2018 and now boasts millions of adherents around the world.

  • An AI-Based Counter-Disinformation Framework

    There are different roles that AI can play in counter-disinformation efforts, but the current shortfalls of AI-based counter-disinformation tools must be addressed first. Such an effort faces technical, governance, and regulatory barriers, but there are ways these obstacles could be effectively addressed to allow AI-based solutions to play a bigger role in countering disinformation.

  • Many QAnon Followers Report Having Mental Health Diagnoses

    QAnon followers, who may number in the millions, are often viewed as a group associated with baseless and debunked conspiracy theories, terrorism, and radical action, such as the January 6 Capitol insurrection. But radical extremism and terror may not be the real concern this group poses. A social psychologist who studies terrorists and a security scholar, in researching their forthcoming book, Pastels and Pedophiles: Inside the Mind of QAnon, noticed that QAnon followers differ from the radicals they usually study in one key way: they are far more likely to have serious mental illnesses.

  • A Dozen Experts with Questions Congress Should Ask the Tech CEOs — On Disinformation and Extremism

    On Thursday, 25 March, two subcommittees of the House Energy & Commerce Committee will hold a joint hearing on “the misinformation and disinformation plaguing online platforms.” Yaël Eisenstat and Justin Hendrix write that Thursday’s hearing will be the first time the tech CEOs face Congress since the January 6 siege on the U.S. Capitol, where groups of individuals, led by Donald Trump to believe the lie that the election was stolen, sought to prevent the certification of the presidential election. “Should social media companies continue their pattern of negligence, governments must use every power – including new legislation, fines and criminal prosecutions – to stop the harms being created,” says one expert. “Lies cost lives.”

  • Fake News: People with Greater Emotional Intelligence Are Better at Spotting Misinformation

    The spread of misinformation – in the form of unsubstantiated rumor and intentionally deceitful propaganda – is nothing new. However, the global proliferation of social media, the 24-hour news cycle, and consumers’ ravenous desire for news – immediately and in bite-size chunks – mean that today, misinformation is more abundant and accessible than ever. But our new study shows fake news doesn’t affect everyone equally: people with greater emotional intelligence are better at spotting it.

  • A Remedy for the Spread of False News?

    Stopping the spread of political misinformation on social media may seem like an impossible task. But a new study co-authored by MIT scholars finds that most people who share false news stories online do so unintentionally, and that their sharing habits can be modified through reminders about accuracy.

  • Russia, Iran Meddled in November's Election; China Did Not: U.S. Intelligence

    A just-released assessment by U.S. intelligence officials finds Russia and Iran did seek to influence the outcome of the November 2020 presidential election. But the assessment also concludes that, despite repeated warnings by a number of top Trump officials, China ultimately decided to sit it out. In the run-up to the November election, President Donald Trump, DNI John Ratcliffe, NSC Adviser Robert O’Brien, and AG William Barr, among other Trump supporters, argued that Chinese interference posed as much of a threat to the election as Russian interference, with Barr arguing that China posed an even greater threat. The intelligence community’s unanimous conclusion that “China did not deploy interference efforts and considered but did not deploy influence efforts intended to change the outcome of the U.S. Presidential election” will likely lead to new questions about how the intelligence was presented to the public.

  • Are Telegram and Signal Havens for Right-Wing Extremists?

    Since the violent storming of Capitol Hill and the subsequent ban of former U.S. President Donald Trump from Facebook and Twitter, the removal of Parler from Amazon’s servers, and the de-platforming of incendiary right-wing content, the messaging services Telegram and Signal have seen a deluge of new users. Steven Feldstein and Sarah Gordon write that the two services rely on encryption to protect the privacy of user communication, which has made them popular with protesters seeking to conceal their identities from repressive governments in places like Belarus, Hong Kong, and Iran. “But the same encryption technology has also made them a favored communication tool for criminals and terrorist groups, including al Qaeda and the Islamic State.” Telegram has purged the Islamic State from the platform, and it could do the same with far-right violent extremists.

  • After the Insurrection, America’s Far-Right Groups Get More Extreme

    As the U.S. grapples with domestic extremism in the wake of the Jan. 6 insurrection at the U.S. Capitol, warnings about more violence are coming from domestic intelligence and law enforcement agencies. Two experts – the authors of a recent book on extremist violence in the United States – say that some members have left extremist groups in the wake of the Jan. 6 violence. But the members who remain, and the new members they are attracting, are increasing the radicalization of far-right groups.

  • Spotting Deepfakes by Looking at Light Reflection in the Eyes

    Computer scientists have developed a tool that automatically identifies deepfake photos by analyzing light reflections in the eyes. The tool proved 94 percent effective with portrait-like photos in experiments.
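
    The item above describes only the idea behind the tool, not its code, so the following is a minimal, hypothetical sketch of the underlying intuition: in a genuine portrait lit by a single scene, the bright specular highlights on the two corneas tend to match, while GAN-generated faces often show mismatched reflections. Everything in the sketch (the OpenCV-based masking, the brightness threshold, the 0.4 cutoff, and the assumption that the eye crops are already located) is an illustrative assumption, not the researchers’ published method.

```python
# Illustrative sketch only: compare the bright specular highlights in the two
# eye regions of a portrait photo. Mismatched reflections are treated as a
# hint that the face may be GAN-generated. Eye bounding boxes are assumed to
# be supplied by some external face-landmark step; thresholds are arbitrary.
import cv2
import numpy as np


def highlight_mask(gray_eye: np.ndarray, brightness_thresh: int = 230) -> np.ndarray:
    """Binary mask of the brightest pixels in an eye crop (the specular highlight)."""
    _, mask = cv2.threshold(gray_eye, brightness_thresh, 255, cv2.THRESH_BINARY)
    return mask


def highlight_similarity(left_eye: np.ndarray, right_eye: np.ndarray) -> float:
    """Intersection-over-union of the two highlight masks, resized to a common grid."""
    size = (32, 32)
    masks = []
    for eye in (left_eye, right_eye):
        gray = cv2.cvtColor(eye, cv2.COLOR_BGR2GRAY)
        masks.append(cv2.resize(highlight_mask(gray), size) > 0)
    intersection = np.logical_and(*masks).sum()
    union = np.logical_or(*masks).sum()
    return float(intersection) / union if union else 0.0


def looks_generated(left_eye: np.ndarray, right_eye: np.ndarray, cutoff: float = 0.4) -> bool:
    """Flag the portrait as suspicious when the two reflections barely overlap."""
    return highlight_similarity(left_eye, right_eye) < cutoff


# Usage (eye crops obtained elsewhere, e.g. from a landmark detector):
# img = cv2.imread("portrait.jpg")
# left, right = img[y1:y2, x1:x2], img[y3:y4, x3:x4]
# print("possible deepfake:", looks_generated(left, right))
```

    A real detector would locate the eyes automatically and calibrate the cutoff on labeled data; the 94 percent figure reported above refers to the researchers’ tool, not this sketch.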

  • The Infrastructure of Hate: Epik Hosts Extremist Groups

    Social media platforms have received the lion’s share of attention for enabling users to spread hate and disinformation and to plan and incite violence and terrorist acts. Flying under the radar are infrastructure providers like Epik, a domain registrar and web hosting company that works with nearly 750,000 websites and is ranked among the 50 largest web hosts. While some companies at the infrastructure level have acknowledged a measure of responsibility for addressing abuse of their services (for example, a framework for addressing abuse signed by leading companies such as GoDaddy, Tucows, and Amazon), Epik is not among them.

  • How Shared Partisanship Leads to Social Media Connections

    It is no secret that U.S. politics is polarized. An experiment conducted by MIT researchers now shows just how deeply political partisanship directly influences people’s behavior within online social networks. The Twitter experiment shows clear self-selection into social media “echo chambers” due to political preferences.

  • Facebook Restores News to Australian Users

    Facebook is restoring news content to its users in Australia after resolving a dispute with the government. Last week, Facebook had blocked Australians from sharing and reading news stories on its platform amid its standoff with the government in Canberra.

  • Inoculating against the Spread of Radical-Islamist and Islamophobic Disinformation

    Misinformation, disinformation, and propaganda are core components of radicalization and extremism and apply equally to Islamist radicalization and the generation of Islamophobia. One method of countering disinformation is to inoculate the information consumer.

  • 46,218 News Transcripts Show Ideologically Extreme Politicians Get More Airtime

    We research how changes in the media have shifted the incentives of elected officials and the considerations of voters, and what that means for American democracy. In recent work, we showed that extremely conservative and extremely liberal legislators receive far more airtime on cable and broadcast news than their moderate counterparts. Robust local news outlets once held legislators to account by covering whether they delivered for their districts. But as local news has declined, voters are turning to national media outlets for their political news. There, ideological outliers now set the tone of the debate, distorting perceptions of the important issues and warping Americans’ views of their political options.