• New Report on Russia’s Online Operations: Pseudo-Think Tanks, Personas

    The Kremlin has used many different techniques in its effective campaigns of interference in the politics of Western democracies, including the 2016 U.S. presidential election. One such technique is “narrative laundering” – moving a narrative from its state-run origins into the wider media ecosystem through aligned publications, “useful idiots,” and, perhaps, witting participants. “Given that many of these tactics are analogs of those used in Cold-War influence operations, it seems certain that they will continue to be refined and updated for the internet era, and are likely to be used to greater effect,” a new report says.

  • Why Cyber Operations Do Not Always Favor the Offense

    Among policymakers and analysts, the assumption that cyberspace favors the offense is widespread. Those who share this assumption have been urging the U.S. government to prioritize offensive cyber operations. Rebecca Slayton writes that the belief in offense dominance is understandable – but mistaken: A focus on offense “increases international tensions and states’ readiness to launch a counter-offensive after a cyberattack, and it often heightens cyber vulnerabilities,” she writes.

  • Lessons from the Cyberattack on India’s Largest Nuclear Power Plant

    In early September, a cyberattack occurred at the Kudankulam nuclear power plant in India. After initial denials, Indian government officials acknowledged the intrusion on 30 October. “As the digitalization of nuclear reactor instrumentation and control systems increases, so does the potential for malicious and accidental cyber incidents alike to cause harm,” Alexander Campbell and Vickram Singh write.

  • Here’s How Russia Will Attack the 2020 Election. We’re Still Not Ready.

    In 2016, the GRU, Russia’s military intelligence branch, launched a massive and successful disinformation campaign to change the way Americans were talking about the two candidates – Hillary Clinton and Donald Trump. Among the GRU’s most effective disinformation techniques was one known as “narrative laundering,” which aims to inject the Kremlin’s preferred stories – real, fake, or doctored – into mainstream American media. “It is quite possible that these exact techniques will be used again,” Renee DiResta, Michael McFaul, and Alex Stamos write. “And why shouldn’t they? We’ve done almost nothing to counter the threat.”

  • Fighting Deepfakes When Detection Fails

    Deepfakes intended to spread misinformation are already a threat to online discourse, and there is every reason to believe this problem will become more significant in the future. Automated deepfake detection is likely to become impossible in the relatively near future, as the approaches that generate fake digital content improve considerably.

  • Germany: Far-Right Lawmaker Punished over Anti-Semitism

    German lawmakers on Wednesday, in a move unprecedented in modern German history, removed a far-right politician from his position as chairman of the powerful Legal Affairs Committee of the Bundestag. The move came after the politician, Stephan Brandner, had repeatedly made anti-Semitic comments. All parties in the Bundestag except his own AfD voted to strip him of the committee’s chairmanship.

  • Private Vendors Critical to Election Security Inadequately Supervised

    Private vendors build and maintain much of the election infrastructure in the United States with minimal oversight by the federal government. A new report presents the risks this poses to the security of our elections and offers a solution.

  • National Labs Host DOE CyberForce Competition

    Five teams of college students will square off at the U.S. Department of Energy’s (DOE) Lawrence Berkeley National Laboratory (Berkeley Lab) on 16 November as part of DOE’s fifth CyberForce Competition. The event, held simultaneously at ten of the DOE’s National Laboratories across the United States, will challenge 105 college teams to defend a simulated energy infrastructure from cyberattacks. The CyberForce Competition is designed to inspire and develop the next generation of energy sector cybersecurity professionals by giving them a chance to hone their skills during interactive and realistic scenarios.

  • Firehosing: The Systemic Strategy that Anti-Vaxxers Are Using to Spread Misinformation

    “Firehosing” relies on pushing out as many lies as possible as frequently as possible. Firehosing is effective because its goal isn’t to persuade. It’s to rob facts of their power. “The strategy is effective for those trying to hold on to political power, and it’s the same for those who gain power from engaging in science denial,” Lucky Tran writes.

  • Vulnerabilities Affecting Billions of Computer Chips Discovered

    Security researchers have discovered serious vulnerabilities in computer chips made by Intel Corp. and STMicroelectronics. The flaws, which affect billions of laptop, server, tablet, and desktop users around the world, could be used to steal or alter data on those devices.

  • How Fake News Spreads Like a Real Virus

    When it comes to real fake news, the kind of disinformation that Russia deployed during the 2016 elections, “going viral” isn’t just a metaphor. Using the tools of infectious-disease modeling, cyber-risk researchers at Stanford Engineering are analyzing the spread of fake news much as if it were a strain of Ebola.
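
    The report does not spell out the researchers’ equations, but the analogy maps naturally onto the classic SIR compartmental model from epidemiology. The sketch below is a minimal, illustrative Python version – the model structure and every parameter value here are assumptions for illustration, not the Stanford team’s code: users are “susceptible” until they encounter a false story, “infected” while they share it, and “recovered” once they stop.

        # Minimal SIR-style sketch of a false story spreading through a user base.
        # Illustrative only: parameters and structure are assumed, not taken from
        # the Stanford study. "Infected" here means actively sharing the story.

        def simulate_sir(population=100_000, initially_sharing=10,
                         contact_rate=0.35, recovery_rate=0.10,
                         days=60, dt=1.0):
            """Integrate the classic SIR equations with a simple Euler step."""
            s = population - initially_sharing   # susceptible: have not seen the story
            i = float(initially_sharing)         # infected: currently sharing it
            r = 0.0                              # recovered: no longer sharing
            history = []
            for _ in range(int(days / dt)):
                new_infections = contact_rate * s * i / population * dt
                new_recoveries = recovery_rate * i * dt
                s -= new_infections
                i += new_infections - new_recoveries
                r += new_recoveries
                history.append((s, i, r))
            return history

        if __name__ == "__main__":
            trajectory = simulate_sir()
            peak_day, (_, peak_sharing, _) = max(enumerate(trajectory),
                                                 key=lambda t: t[1][1])
            print(f"Sharing peaks around day {peak_day} with roughly "
                  f"{peak_sharing:,.0f} users actively spreading the story.")

    In this framing, interventions such as fact-checking or platform throttling play the role of vaccination or quarantine: they lower the effective contact rate, and if they push the story’s reproduction number below one, it never “goes viral” at all.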

  • Don’t Rush Quantum-Proof Encryption, Warns NSA Research Director

    In 1994, Peter Shor, a mathematician, discovered a way to crack the codes that banks, e-commerce platforms, and intelligence agencies use to secure their digital information. “Shor’s algorithm” drastically shortened the time it takes to find the prime factors that underlie public-key cryptography, making codes that would typically take thousands of years to break solvable in a matter of months. Jack Corrigan writes that there was a catch: Shor’s algorithm could run only on quantum computers, and they did not exist twenty-five years ago. They are much closer today, and this has many security experts worried.
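
    As a rough illustration of what is at stake (a toy sketch with deliberately tiny numbers, nothing like real key sizes): RSA-style public-key cryptography publishes a modulus that is the product of two secret primes, and its security rests on the difficulty of recovering those primes. Classical factoring methods scale badly as the modulus grows, which is why multi-thousand-bit keys are considered safe today; Shor’s algorithm performs the same step in polynomial time on a sufficiently large quantum computer.

        # Toy illustration only: real RSA moduli are 2048+ bits and far beyond
        # trial division. The point is that the secret falls out once the public
        # modulus n is factored, which is exactly the step Shor's algorithm
        # makes efficient on a quantum computer.

        def trial_division_factor(n):
            """Recover a nontrivial factor of n by brute force (exponential in the bit length of n)."""
            d = 2
            while d * d <= n:
                if n % d == 0:
                    return d, n // d
                d += 1
            raise ValueError("n is prime; no nontrivial factor exists")

        if __name__ == "__main__":
            p, q = 1009, 2003    # illustrative tiny primes standing in for a key pair
            n = p * q            # the public modulus: safe to publish only while factoring is hard
            print("public modulus:", n)
            print("recovered factors:", trial_division_factor(n))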

  • Saudi “Twitter Spies” Broke No Federal Privacy Laws – Because There Are None

    Privacy expert Mike Chapple of the University of Notre Dame says that the Saudi “Twitter Spies,” who were charged last week by the Justice Department with spying on behalf of Saudi Arabia, committed espionage – but broke no federal privacy laws, because there are no such laws. Chapple says that Twitter failed to live up to industry-standard cybersecurity practices.

  • Can the United States Deter Election Meddling?

    The 2020 election is still a year away, but law enforcement officials are already sounding the alarm about foreign interference in the election. Leaders of the U.S. intelligence and law enforcement communities warn that Moscow is preparing to repeat its 2016 interference effort next year. Joshua Rovner writes that cyber-meddling is a challenge, but that we should not despair.

  • The Senate Examines Threats to the Homeland

    On Tuesday, Nov. 5, the Senate Homeland Security and Governmental Affairs Committee held a hearing on the evolving threats facing the United States. In their written and opening remarks, the witnesses outlined a dizzyingly broad array of threats—from domestic and international terrorism to transnational organized crime, cyber and economic espionage, election interference, data insecurity, and potential chemical and biological attacks on the homeland. As the hearing wore on, senators’ questions and witness testimony narrowed in scope, focusing primarily on three aspects of America’s security challenges: how to optimize information sharing to combat domestic terrorism; how to counter Chinese cyber and counterintelligence operations; and how to address the growing problems posed by new technologies, namely, ransomware, cryptocurrency and unmanned aerial systems (UASs).