• Here’s How Russia Will Attack the 2020 Election. We’re Still Not Ready.

    In 2016, the GRU, Russia’s military intelligence branch, launched a massive and successful disinformation campaign to change the way Americans were talking about the two candidates, Hillary Clinton and Donald Trump. Among the GRU’s most effective disinformation techniques was one known as “narrative laundering,” which aims to inject the Kremlin’s preferred stories – real, fake, or doctored – into mainstream American media. “It is quite possible that these exact techniques will be used again,” Renee DiResta, Michael McFaul, and Alex Stamos write. “And why shouldn’t they? We’ve done almost nothing to counter the threat.”

  • Fighting Deepfakes When Detection Fails

    Deepfakes intended to spread misinformation are already a threat to online discourse, and there is every reason to believe this problem will become more significant in the future. Automated deepfake detection is likely to become impossible in the relatively near future as the techniques for generating fake digital content continue to improve.

  • Germany: Far-Right Lawmaker Punished over Anti-Semitism

    German lawmakers on Wednesday, in a move which is unprecedented in modern German history, removed a far-right politician from his position as the chairman of the powerful Legal Affairs Committee of the Bundestag. The move came after the politician, Stephan Brandner, had repeatedly made anti-Semitic comments. All the parties in the Bundestag, except his own AfD party, voted to strip him of the committee chairmanship.

  • Private Vendors Critical to Election Security Inadequately Supervised

    Private vendors build and maintain much of the election infrastructure in the United States with minimal oversight by the federal government. A new report presents the risks this poses to the security of our elections and offers a solution.

  • National Labs Host DOE CyberForce Competition

    Five teams of college students will square off at the U.S. Department of Energy’s (DOE) Lawrence Berkeley National Laboratory (Berkeley Lab) on 16 November as part of DOE’s fifth CyberForce Competition. The event, held simultaneously at ten of the DOE’s National Laboratories across the United States, will challenge 105 college teams to defend a simulated energy infrastructure from cyberattacks. The CyberForce Competition is designed to inspire and develop the next generation of energy sector cybersecurity professionals by giving them a chance to hone their skills during interactive and realistic scenarios.

  • Firehosing: The Systemic Strategy that Anti-Vaxxers Are Using to Spread Misinformation

    “Firehosing” relies on pushing out as many lies as possible as frequently as possible. Firehosing is effective because its goal isn’t to persuade. It’s to rob facts of their power. “The strategy is effective for those trying to hold on to political power, and it’s the same for those who gain power from engaging in science denial,” Lucky Tran writes.

  • Vulnerabilities Affecting Billions of Computer Chips Discovered

    Security researchers have discovered serious vulnerabilities in computer chips made by Intel Corp. and STMicroelectronics. The flaws affect billions of laptop, server, tablet, and desktop users around the world and could be exploited to steal or alter data on the affected devices.

  • How Fake News Spreads Like a Real Virus

    When it comes to real fake news, the kind of disinformation that Russia deployed during the 2016 elections, “going viral” isn’t just a metaphor. Using tools for modeling the spread of infectious disease, cyber-risk researchers at Stanford Engineering are analyzing the spread of fake news much as if it were a strain of Ebola.
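
    The epidemiological framing above can be sketched with a minimal SIR-style simulation. This is a generic illustration of the approach, not the Stanford researchers’ actual model; the population size, sharing rate, and drop-off rate below are invented for demonstration.

```python
# A minimal discrete-time SIR-style sketch of information spread:
# "susceptible" users haven't seen the story, "infected" users are
# actively sharing it, and "recovered" users have stopped sharing.
# All parameters are illustrative, not fitted to real data.
def simulate_spread(population=10_000, beta=0.3, gamma=0.1, steps=100):
    s, i, r = population - 1, 1, 0  # one initial sharer
    history = []
    for _ in range(steps):
        new_infections = beta * s * i / population  # contact-driven sharing
        new_recoveries = gamma * i                  # sharers losing interest
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# The peak number of simultaneous sharers, analogous to the peak of
# an epidemic curve:
peak_sharers = max(i for _, i, _ in simulate_spread())
```

    As with a disease, the story spreads while each sharer reaches more than one new person on average (beta above gamma) and burns out once the pool of susceptible users is depleted.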

  • Don’t Rush Quantum-Proof Encryption, Warns NSA Research Director

    In 1994, Peter Shor, a mathematician, discovered a way to crack the codes that banks, e-commerce platforms, and intelligence agencies use to secure their digital information. “Shor’s algorithm” drastically shortened the time it takes to find the prime factors that underlie public-key cryptography, making codes that would typically take thousands of years to break solvable in a matter of months. Jack Corrigan writes that there was a catch: Shor’s algorithm could run only on quantum computers, which did not exist twenty-five years ago. They are much closer today, and that has many security experts worried.
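
    The factoring problem at the heart of this story can be illustrated with a toy example. This is not Shor’s algorithm (which requires a quantum computer), just a classical sketch of why factoring matters: an RSA-style public key is built from a modulus n = p × q, and recovering p and q breaks the key. The tiny modulus below is a textbook example, not a real key.

```python
from math import isqrt

# Toy classical illustration (not Shor's algorithm): RSA-style security
# rests on the difficulty of factoring n = p * q. With tiny numbers,
# brute-force trial division recovers the factors instantly; with a
# 2048-bit modulus, the same classical search would take astronomically
# long. Shor's algorithm collapses that gap, but only on quantum hardware.
def factor(n):
    """Trial division: returns (p, q) with p * q == n, feasible only for small n."""
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None  # n is prime (or 1), no nontrivial factors

p, q = factor(3233)  # 3233 = 53 * 61, a classic textbook RSA modulus
```

    The security assumption is purely that this search is infeasible at real key sizes on classical machines, which is exactly the assumption Shor’s algorithm undermines.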

  • Saudi “Twitter Spies” Broke No Federal Privacy Laws – Because There Are None

    Privacy expert Mike Chapple of the University of Notre Dame says that the Saudi “Twitter Spies,” who were charged last week by the Justice Department with spying on behalf of Saudi Arabia, committed espionage — but broke no federal privacy laws because there are no such laws. Chapple says that Twitter failed to live up to industry-standard cybersecurity practices.

  • Can the United States Deter Election Meddling?

    The 2020 election is still a year away, but law enforcement officials are already sounding the alarm about foreign interference. Leaders of the U.S. intelligence and law enforcement communities warn that Moscow is preparing to launch a similar interference effort next year. Joshua Rovner writes that cyber-meddling is a challenge, but that we should not despair.

  • The Senate Examines Threats to the Homeland

    On Tuesday, Nov. 5, the Senate Homeland Security and Governmental Affairs Committee held a hearing on the evolving threats facing the United States. In their written and opening remarks, the witnesses outlined a dizzyingly broad array of threats—from domestic and international terrorism to transnational organized crime, cyber and economic espionage, election interference, data insecurity, and potential chemical and biological attacks on the homeland. As the hearing wore on, senators’ questions and witness testimony narrowed in scope, focusing primarily on three aspects of America’s security challenges: how to optimize information sharing to combat domestic terrorism; how to counter Chinese cyber and counterintelligence operations; and how to address the growing problems posed by new technologies, namely, ransomware, cryptocurrency and unmanned aerial systems (UASs).

  • The Trolls Are Everywhere. Now What Are We Supposed to Do?

    Forget the decline of gatekeepers. Imagine a world bereft of gates and uncrossable lines, with no discernible rules. Andrew Marantz’s just-published book, Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation, offers a detailed and disturbing study of how the social media platforms, rolled out over the last decade by a group of nerdy but naïve Silicon Valley entrepreneurs, have been hijacked by “edge lords” — another name for a collection of nihilists, right-wing nationalists, conspiracy purveyors, white supremacists, and more, whose goal is to downgrade the discourse in a way that would soon corrode the entire system. “The ranking algorithms on social media laid out clear incentives: provoke as many activating emotions as possible; lie, spin, dog-whistle; drop red pill after red pill; step up to the line repeatedly, in creative new ways,” Marantz writes. Public discourse is being replaced by a dance of discord, enragement, and noxiousness.

  • Disinformation Agents Are Targeting Veterans in Run-Up to 2020 Election

    Disinformation campaigns are targeting U.S. veterans through social media, seeking to tap the group’s influential status in their communities and high voting turnout in order to influence elections and fuel discord. Katerina Patin writes that veterans present an ideal target for foreign actors. In addition to their social status and voting rate, veterans are also more likely to run for office and more likely to work in government than any other demographic.

  • Online Disinformation and Political Discourse: Applying a Human Rights Framework

    The framers of the Universal Declaration of Human Rights (UDHR) saw human rights as a fundamental safeguard for all individuals against the power of authority. Although some digital platforms now have an impact on more people’s lives than does any one state authority, the international community has been slow to measure these platforms’ activities against human rights law and to hold them to account. Kate Jones writes that “Although international human rights law does not impose binding obligations on digital platforms, it offers a normative structure of appropriate standards by which digital platforms should be held to account. Because of the impact that social media can have, a failure to hold digital platforms to human rights standards is a failure to provide individuals with the safeguards against the power of authority that human rights law was created to provide.”