• Twitter Removes 170,000 Accounts Used by China, Russia, and Turkey to Spread Disinformation

    Twitter said Thursday it had removed more than 170,000 accounts used by China, Russia, and Turkey to spread disinformation. The accounts were part of a network used to push propaganda, attack critics of the government, and spread misinformation. A majority of the accounts were linked to China.

  • EU: China, Russia Waging Broad Pandemic Disinformation Campaign to Deepen Crisis

    The European Union, in unusually blunt language, has accused Russia and China of running a broad, sustained, and “targeted” disinformation campaign inside the European Union, aiming to deepen and lengthen the coronavirus pandemic crisis and its negative medical, economic, and social effects. The EU has criticized Russia in the past for its sophisticated disinformation campaigns aimed at weakening the West and undermining liberal democracies, but the direct criticism of China is a break from the EU’s recent approach, which saw it tiptoeing around China’s many transgressions.

  • U.S. Accuses Foreign Actors of Inflaming Tensions over Floyd Killing

    U.S. adversaries are starting to weaponize protests that have gripped parts of the country “to sow divisiveness and discord,” according to top law enforcement officials who refused to share additional details. The U.S. Justice Department and the FBI allege that unnamed countries are actively manipulating information to make the situation in the United States worse.

  • Twitter Suspends Fake Antifa Account Created by White Nationalists to Incite Violence

    Twitter suspended a fake account, created by white nationalist group Identity Evropa, which pretended to be affiliated with Black Lives Matter and incited violence. The account called upon African American participants in the protests to use violence against law enforcement and places of business. “Tonight’s the night, Comrades,” one tweet had said, before encouraging users to “take what’s ours.”

  • Virality Project (US): Marketing Meets Misinformation

    Pseudoscience and government conspiracy theories swirl on social media, though most of them stay largely confined to niche communities. In the case of COVID-19, however, a combination of anger at what some see as overly restrictive government policies, conflicting information about treatments and disease spread, and anxiety about the future has many people searching for facts…and finding misinformation. This dynamic creates an opportunity for determined people and skilled marketers to fill the void: to create content and produce messages designed to be shared widely.

  • White Supremacist Groups Thriving on Facebook

    Dozens of white supremacist groups are operating freely on Facebook, allowing them to spread their message and recruit new members. The findings, more than two years after Facebook hosted an event page for the deadly “Unite the Right” rally in Charlottesville, Virginia, cast doubt on the company’s claims that it’s effectively monitoring and dealing with hate groups. What’s more, Facebook’s algorithms create an echo chamber that reinforces the views of white supremacists and helps them connect with each other.

  • Social Media Platforms Can Contribute to Dehumanizing Other People

    A recent analysis of discourse on Facebook highlights how social media and an individual’s sense of identity can be used to dehumanize entire groups of people. “Fundamentally, we wanted to examine how online platforms can normalize hatred and contribute to dehumanization,” says one researcher. “And we found that an established model of the role identity plays in intractable conflicts seems to explain a great deal of this behavior.”

  • U.S.-Funded Website Spreading COVID Misinformation in Armenia

    U.S. taxpayer money has funded a controversial health news website in Armenia that is spreading “incredibly dangerous” COVID-19 misinformation. Public health experts in the U.S. and Armenia denounced this content – which includes claims that vaccines currently being developed are actually “biological weapons.”

  • Facebook Knew Its Algorithms Promoted Extremist Groups, but Did Nothing: Report

    A Facebook Inc. team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart. “Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on platform.” The Wall Street Journal reports that the main reason behind Facebook’s decision to do nothing was the fear that any content moderation measures would disproportionately affect right-wing pages, politicians, and other parts of the user base that drove up engagement. The company also wanted to stave off accusations of bias against conservative posters.

  • The Dark Arts of Disinformation Through a Historical Lens

    History matters because sometimes it repeats itself. In his pioneering analysis of modern disinformation warfare from a historical perspective, Thomas Rid posits from the outset that “only by taking careful and accurate measure of the fantastic past of disinformation can we comprehend the present, and fix the future.”

  • The Kremlin’s Disinformation Playbook Goes to Beijing

    The coronavirus pandemic is exposing a growing competition between democratic and authoritarian governments. Jessica Brandt and Torrey Taussig write that as the U.S. and Europe struggle to contain the virus at home, Russia and China are seizing the moment to enhance their international influence through information operations.

  • Triad of Disinformation: How Russia, Iran, & China Ally in a Messaging War against America

    China has long deployed widespread censorship, propaganda, and information manipulation efforts within its borders, but its information operations directed at foreign audiences have generally focused on framing China in a positive way. In the last two months, however, Beijing, showing itself willing to emulate Russia’s approach to information campaigns, has conducted a much more ambitious effort not only to shape global perspectives about what is occurring inside China, but also to influence public opinion about events outside its borders.

  • Germany: Revised Domestic Surveillance Bill Submitted to Bundestag

    A draft law to reform Germany’s BfV domestic intelligence agency is to be resubmitted to parliament after a long debate. It will allow German domestic intelligence and law enforcement to conduct electronic surveillance of telephone calls and SMS text services, including encrypted “chats” via services such as WhatsApp and Telegram, but will not allow the use of cyber “Trojan” trawling tools.

  • Australian Investigators Debunk 5G-COVID-19 Conspiracy Theory

    One of the more bizarre conspiracy theories to emerge recently is the one claiming a connection between 5G technology and the virus. Believers argue either that 5G was responsible for the coronavirus, due to the construction of 5G networks in Wuhan, or that it was “poisoning cells,” which created the coronavirus. An Australian parliamentary investigation has now debunked this particular piece of misinformation.

  • How “Truth Decay” Is Harming America’s Coronavirus Recovery

    How is it possible that Americans are polarized along party lines even on something as seemingly apolitical as a virus? Alex Ward writes in Vox that one big reason is what Jennifer Kavanagh, a senior political scientist at the Rand Corporation, calls “truth decay.” Simply put, Americans no longer rely on facts and data as much as they should. That’s a problem at any time, but it’s especially troubling during a pandemic, when people need the best, most reliable information to stay safe.