• Russian Info Ops Putting U.S. Police in Their Crosshairs

    Russia appears to be intensifying its focus on law enforcement issues in the United States, using popular reactions to protests that have gripped the nation as part of a larger propaganda campaign to divide Americans ahead of the U.S. presidential election in November. For weeks Russia has used state-controlled RT and Sputnik, along with social media posts, to spread disinformation about the protests. Now, it seems, Russia, through the English-language RT in particular, is reaching out to U.S. police officers and union officials, in what some U.S. officials and lawmakers say is an effort to further inflame tensions.

  • Finding Links between Belief in Conspiracy Theories and Political Engagement

    A belief in the existence of conspiracies — particularly among followers of extremist movements — seems to go hand-in-hand with the assumption that political violence is an acceptable option. However, the role that a belief in conspiracies actually plays in political extremism and the willingness to use physical force has to date been disputed by psychologists.

  • Don’t Blame Social Media for Conspiracy Theories – They Would Still Flourish without It

    COVID-19 conspiracy theories have encouraged people to engage in some dangerous activities in the past few months. There is no simple explanation for why people believe conspiracy theories like these, and the best researchers can say is that the causes of such beliefs are complex and varied. And yet journalists, activists and politicians are increasingly blaming the internet, and social media in particular, for the spread of conspiracy theories. The problem with such accusations is that the evidence paints a more nuanced picture.

  • Helping Users Control Their Personal Data

    The trove of digital data we generate in our daily lives can potentially make us more efficient, increase sustainability and improve our health, among other benefits, but it also poses threats to privacy. To help individuals take greater control of their personal information, researchers have developed and tested a platform, Ancile, that allows users to set restrictions on what kind of data they’ll release, and to whom.

  • Twitter Removes 170,000 Accounts Used by China, Russia, and Turkey to Spread Disinformation

    Twitter said Thursday it had removed more than 170,000 accounts used by China, Russia, and Turkey to spread disinformation. The accounts were part of a network used to push propaganda, attack critics of the government, and spread misinformation. A majority of the accounts were linked to China.

  • EU: China, Russia Waging Broad Pandemic Disinformation Campaign to Deepen Crisis

    The European Union, in unusually blunt language, has accused Russia and China of running a broad, sustained, and “targeted” disinformation campaign inside the European Union, aiming to deepen and lengthen the coronavirus pandemic crisis and its negative medical, economic, and social effects. The EU has criticized Russia in the past for its sophisticated disinformation campaigns aimed at weakening the West and undermining liberal democracies, but the direct criticism of China is a break from the EU’s recent approach, which saw it tiptoeing around China’s many transgressions.

  • U.S. Accuses Foreign Actors of Inflaming Tensions over Floyd Killing

    U.S. adversaries are starting to weaponize protests that have gripped parts of the country “to sow divisiveness and discord,” according to top law enforcement officials who refused to share additional details. The U.S. Justice Department and the FBI allege that unnamed countries are actively manipulating information to make the situation in the United States worse.

  • Twitter Suspends Fake Antifa Account Created by White Nationalists to Incite Violence

    Twitter suspended a fake account, created by white nationalist group Identity Evropa, which pretended to be affiliated with Black Lives Matter and incited violence. The account called upon African American participants in the protests to use violence against law enforcement and places of business. “Tonight’s the night, Comrades,” one tweet had said, before encouraging users to “take what’s ours.”

  • Virality Project (US): Marketing Meets Misinformation

    Pseudoscience and government conspiracy theories swirl on social media, though most of them stay largely confined to niche communities. In the case of COVID-19, however, a combination of anger at what some see as overly restrictive government policies, conflicting information about treatments and disease spread, and anxiety about the future has many people searching for facts – and finding misinformation. This dynamic creates an opportunity for determined people and skilled marketers to fill the void – to create content and produce messages designed to be shared widely.

  • White Supremacist Groups Thriving on Facebook

    Dozens of white supremacist groups are operating freely on Facebook, allowing them to spread their message and recruit new members. The findings, more than two years after Facebook hosted an event page for the deadly “Unite the Right” rally in Charlottesville, Virginia, cast doubt on the company’s claims that it’s effectively monitoring and dealing with hate groups. What’s more, Facebook’s algorithms create an echo chamber that reinforces the views of white supremacists and helps them connect with each other.

  • Social Media Platforms Can Contribute to Dehumanizing Other People

    A recent analysis of discourse on Facebook highlights how social media and an individual’s sense of identity can be used to dehumanize entire groups of people. “Fundamentally, we wanted to examine how online platforms can normalize hatred and contribute to dehumanization,” says one researcher. “And we found that an established model of the role identity plays in intractable conflicts seems to explain a great deal of this behavior.”

  • U.S.-Funded Website Spreading COVID Misinformation in Armenia

    U.S. taxpayer money has funded a controversial health news website in Armenia that is spreading “incredibly dangerous” COVID-19 misinformation. Public health experts in the U.S. and Armenia denounced this content – which includes claims that vaccines currently being developed are actually “biological weapons.”

  • Facebook Knew Its Algorithms Promoted Extremist Groups, but Did Nothing: Report

    A Facebook Inc. team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart. “Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on platform.” The Wall Street Journal reports that the main reason behind Facebook’s decision to do nothing was the fear that any content moderation measures would disproportionately affect right-wing pages, politicians, and other parts of the user base that drove up engagement. The company also wanted to stave off accusations of bias against conservative posters.

  • The Dark Arts of Disinformation Through a Historical Lens

    History matters because sometimes it repeats itself. In his pioneering analysis of modern disinformation warfare from a historical perspective, Thomas Rid posits from the outset that “only by taking careful and accurate measure of the fantastic past of disinformation can we comprehend the present, and fix the future.”

  • The Kremlin’s Disinformation Playbook Goes to Beijing

    The coronavirus pandemic is exposing a growing competition between democratic and authoritarian governments. Jessica Brandt and Torrey Taussig write that as the U.S. and Europe struggle to contain the virus at home, Russia and China are seizing the moment to enhance their international influence through information operations.