• SOCIAL MEDIA | ‘Anti-Social Media’: The Changing Tech of Terror

    By Adil Rasheed

    In the wake of the white noise generated by mainstream social media channels and apps, a new trend of ‘anti-social media’ has emerged in recent years, which seeks to abandon mainstream platforms, reduce screen time, and seek private, intimate, or even ‘analogue’ communication to avoid algorithm-driven polarization, surveillance and loneliness. But some of these so-called anti-social media platforms have also become off-the-wall mediums for disseminating extremist propaganda.

• SOCIAL MEDIA | Your Social Media Feed Is Built to Agree with You. What If It Didn’t?

    By Luke Auburn

    The feedback loop is an essential component of the architecture of the social media echo chamber: a space where familiar ideas are amplified, dissenting voices fade, and beliefs can harden rather than evolve. A new study points to algorithm design as a potential way to reduce echo chambers—and polarization—online.

• TRUTH DECAY | Why Are Older Adults More Likely to Share Misinformation Online?

    By Sy Boles

Older adults tend to do well at identifying falsehoods in experiments, but they’re also likelier than younger adults to like and share misinformation online. Older adults have a greater tendency to seek out and believe material that conforms to their pre-existing views, an expert says.

• TRUTH DECAY | Empowering Users to Discern Fact from Fiction in the Age of AI

    By Emma Foehringer Merchant

    A new project will investigate interventions that enable individuals to effectively harness AI while building the literacy needed to avoid scams and other forms of abuse.

• EXTREMISM | Far-Right Extremists Have Been Organizing Online Since Before the Internet – and AI Is Their Next Frontier

    By Michelle Lynn Kahn

    How can society police the global spread of online far-right extremism while still protecting free speech? Far-right extremists have long pioneered innovative ways to exploit technological progress and free speech. Efforts to counter this radicalization are challenged to stay one step ahead of the far right’s technological advances.

• DEMOCRACY WATCH | Fake Survey Answers from AI Could Quietly Sway Election Predictions

    Public opinion polls and other surveys rely on data to understand human behavior. New research reveals that artificial intelligence can now corrupt public opinion surveys at scale—passing every quality check, mimicking real humans, and manipulating results without leaving a trace.

• EXTREMISM | Political Violence Offers Extremists “Trigger Events” for Recruiting Supporters

    Extremists are exploiting political violence by using online platforms to recruit new people to their causes and amplify the use of violence for political goals. High-profile incidents of political violence are useful trigger events for justifying extremist ideologies and calls for retaliation.

• CHINA WATCH | The American TikTok Deal Doesn’t Address the Platform’s Potential for Manipulation, Only Who Profits

    By Andrew Buzzell

    If we want to protect democratic information systems, we need to focus on reducing the vulnerabilities in our relationship with media platforms – platforms with surveillance power to know what we will like, the algorithmic power to curate our information diet and control of platform incentives, and rules and features that affect who gains influence. The biggest challenge is to make platforms less riggable, and thus less weaponizable, if only for the reason that motivated the TikTok ban: we don’t want our adversaries, foreign or domestic, to have power over us.

• POLARIZATION | Influencers, Multipliers, and the Structure of Polarization: How Political Narratives Circulate on Twitter/X

    A recent study provides a nuanced understanding of the mechanisms driving polarization and issue alignment on Twitter/X and reveals how political polarization is reinforced and structured by two distinct types of highly active users: influencers and multipliers.

• EXTREMISM | Hashtags and Humor Are Used to Spread Extreme Content on Social Media

    Conspiracy theories and incitement to harassment and violence abound on mainstream social media platforms like Facebook and Instagram. But the extreme content is often mixed with ironic play, memes and hashtags, which makes it difficult for authorities and media to know how to respond.

• DEEPFAKES | Australia’s Deepfake Dilemma and the Danish Solution

    By Andrew Horton and Elizabeth Lawler

    Countries need to move beyond simply pleading with internet platforms for better content moderation and instead implement new legal frameworks that empower citizens directly. For a model of how to achieve this, policymakers should look to the innovative legal thinking emerging from Denmark.

• EXTREMISM | What Does Netflix’s Drama “Adolescence” Tell Us About Incels and the Manosphere?

    By Lewys Brace

    While Netflix’s psychological crime drama ‘Adolescence’ is a work of fiction, its themes offer insight into the very real and troubling rise of the incel and manosphere culture online.

• TRUTH DECAY | AI System Identifies Fake Videos Beyond Face Swaps and Altered Speech

    By David Danelski

In an era where manipulated videos can spread disinformation, bully people, and incite harm, researchers at UC Riverside, in collaboration with Google, have developed a new model that spots fakes by interpreting faces and backgrounds.

• EXTREMISM | Grok’s Antisemitic Rant Shows How Generative AI Can Be Weaponized

    By James Foulds, Phil Feldman, and Shimei Pan

    The AI chatbot Grok went on an antisemitic rant on July 8, 2025, posting memes, tropes and conspiracy theories used to denigrate Jewish people on the X platform. It also invoked Hitler in a favorable context. The episode follows one on May 14, 2025, when the chatbot spread debunked conspiracy theories about “white genocide” in South Africa, echoing views publicly voiced by Elon Musk, the founder of its parent company, xAI.

• EXTREMISM | Terrorgram Block Is a Welcome Step Towards Countering Violent Extremism

    By Henry Campbell

    Terrorgram has been linked to lone-actor attacks in Slovakia, Turkey, Brazil and the United States. Its listing places it among the likes of Hamas, Islamic State, and violent white supremacist groups such as Sonnenkrieg Division and The Base.