• Don’t (Just) Blame Echo Chambers. Conspiracy Theorists Actively Seek Out Their Online Communities

    Why do people believe conspiracy theories? Is it because of who they are, what they’ve encountered, or a combination of both? The answer is important. Belief in conspiracy theories helps fuel climate change denial, anti-vaccination stances, racism, and distrust of the media and science. Our research shows that people who engage with conspiracy forums actively seek out sympathetic communities, rather than passively stumbling into problematic beliefs.

  • Online Disinformation and Emerging Tech: Are Democracies at Risk?

Online disinformation campaigns, supported by fundamental changes in the military and geopolitical strategies of major players such as Russia and China, harden tribal factions and undermine the security of infrastructure systems in targets such as the United States, as state and non-state actors mount increasingly sophisticated cyberattacks on democratic institutions, Brad Allenby writes. Whether the United States and other democracies are up to this challenge remains to be seen, he says.

  • U.S.-Funded Research, Scientists Help China’s Drive to Become World S&T Leader

The U.S. government has so far failed to stop China from stealing intellectual property from American universities. Moreover, the Trump administration lacks a comprehensive strategy for dealing with the threat. These are the conclusions of a new report issued on Monday by the Permanent Subcommittee on Investigations of the Senate Committee on Homeland Security and Governmental Affairs. The report says the problem is especially urgent because billions of dollars in taxpayer-funded research have “contributed to China’s global rise over the last twenty years” and to its goal of becoming a world leader in science and technology by 2050.

  • U.S. Investigating Universities over Russian, Chinese, Saudi Donations

U.S. officials have asked MIT to turn over documents regarding the university’s contacts with foreign governments and donations from foreign sources, including those coming from Russia, China, and Saudi Arabia. The University of Maryland received a similar demand from the Education Department. MIT has been under scrutiny for a while after accepting $300 million from Viktor Vekselberg, a Russian oligarch and a close ally of Vladimir Putin. Vekselberg is close to several of Donald Trump’s family members and members of the Trump Organization, and he was involved in the Moscow Trump Tower project. In 2018 MIT removed Vekselberg from its board — to which he was elected in 2013 — after the U.S. Treasury Department listed him and his business group among the Russian officials, “oligarchs,” and companies to be penalized for advancing Moscow’s “malign activities.”

  • Cryptocurrency and National Insecurity

A recent exercise at Harvard’s Kennedy School explored the dangers of large sums of money being secretly sent to hostile nations. The exercise brought together administration veterans, career diplomats, and academics to dramatize a very real prospect — the rise of an encrypted digital currency that would upend the U.S. dollar’s dominance and render economic sanctions, like those currently applied to North Korea, ineffective.

  • Russian Hackers Attacked Me and Other Military Spouses. Why Can’t We Sue?

In a systematic campaign aiming to sow panic and confusion, Russian government hackers, masquerading as ISIS fighters, have been hacking computers and smartphones of spouses of U.S. military personnel, stealing and distributing their personal and financial information, and spreading lies about them on the dark web. “Almost as astonishing as the discovery that Russia was behind the attacks was finding out that U.S. citizens have no legal recourse against foreign governments that target them online,” writes Lorri Volkman, whose husband serves in the military and who was attacked by Russian hackers four years ago.

  • New Report on Russia’s Online Operations: Pseudo-Think Tanks, Personas

    The Kremlin used many different techniques in its effective campaigns of interference in the politics of Western democracies, including the 2016 U.S. presidential election. One such technique is “narrative laundering” – the technique of moving a certain narrative from its state-run origins to the wider media ecosystem through the use of aligned publications, “useful idiots,” and, perhaps, witting participants. “Given that many of these tactics are analogs of those used in Cold-War influence operations, it seems certain that they will continue to be refined and updated for the internet era, and are likely to be used to greater effect,” a new report says.

  • Here’s How Russia Will Attack the 2020 Election. We’re Still Not Ready.

In 2016, the GRU, Russia’s military intelligence branch, launched a massive and successful disinformation campaign to change the way Americans were talking about the two candidates – Hillary Clinton and Donald Trump. Among the GRU’s most effective disinformation techniques was one known as “narrative laundering,” which aims to inject the Kremlin’s preferred stories – real, fake, or doctored — into mainstream American media. “It is quite possible that these exact techniques will be used again,” Renee DiResta, Michael McFaul, and Alex Stamos write. “And why shouldn’t they? We’ve done almost nothing to counter the threat.”

  • Fighting Deepfakes When Detection Fails

    Deepfakes intended to spread misinformation are already a threat to online discourse, and there is every reason to believe this problem will become more significant in the future. Automated deepfake detection is likely to become impossible in the relatively near future, as the approaches that generate fake digital content improve considerably.

  • Five Faces of Russia’s Soft Power: Far Left, Far Right, Orthodox Christian, Russophone, and Ethnoreligious Networks

    Does Russia exercise true “soft power”—the power of attraction—in any significant measure? Şener Aktürk writes that while some argue that the power Russia exerts is not really soft power, “I suggest Russia’s soft power may be at least as great as its hard power in international politics.” There are at least five different categories of foreign audiences that espouse a pro-Russian geopolitical identity – “In addition to pro-Russian far right parties and networks, which have attracted most of the attention of scholars and journalists, there are also far left, Orthodox Christian, Russophone, and various ethnoreligious and separatist groups that favor a pro-Russian geopolitical identity.”

  • Private Vendors Critical to Election Security Inadequately Supervised

    Private vendors build and maintain much of the election infrastructure in the United States with minimal oversight by the federal government. A new report presents the risks this poses to the security of our elections and offers a solution.

  • How Fake News Spreads Like a Real Virus

When it comes to real fake news, the kind of disinformation that Russia deployed during the 2016 elections, “going viral” isn’t just a metaphor. Using tools for modeling the spread of infectious disease, cyber-risk researchers at Stanford Engineering are analyzing the spread of fake news much as if it were a strain of Ebola.

• Saudi “Twitter Spies” Broke No Federal Privacy Laws — Because There Are None

    Privacy expert Mike Chapple of the University of Notre Dame says that the Saudi “Twitter Spies,” who were charged last week by the Justice Department for spying on behalf of Saudi Arabia, committed espionage — but broke no federal privacy laws because there are no such laws. Chapple says that Twitter failed to live up to industry-standard cybersecurity practices.

  • Can the United States Deter Election Meddling?

The 2020 election is still a year away, but law enforcement officials are already sounding the alarm about foreign interference in the election. Leaders of the U.S. intelligence and law enforcement communities warn that Moscow is preparing to launch an interference effort next year similar to its 2016 campaign. Joshua Rovner writes that cyber-meddling is a challenge, but that we should not despair.

  • Disinformation Agents Are Targeting Veterans in Run-Up to 2020 Election

    Disinformation campaigns are targeting U.S. veterans through social media, seeking to tap the group’s influential status in their communities and high voting turnout in order to influence elections and fuel discord. Katerina Patin writes that veterans present an ideal target for foreign actors. In addition to their social status and voting rate, veterans are also more likely to run for office and more likely to work in government than any other demographic.