  • Putin’s doctrine blends “bare-faced lying,” “social media disinformation,” and “criminal thuggery”: MI5 Director

    In a speech on Wednesday, MI5 Director General Andrew Parker discussed the security challenges facing the West, chief among them the threat from Russia. Parker described it as a “hybrid threat”: Russia practices a doctrine “blending media manipulation, social media disinformation and distortion with new and old forms of espionage, high levels of cyberattacks, military force, and criminal thuggery.” Parker added: “Our democracies, our societies and our bonds of partnership are strong. But we must not be complacent about the longer-term potential impact of this [Russian] activity on the international rules-based order that supports our security and prosperity.”

  • Kaspersky to move data center from Russia to Switzerland

    Kaspersky Lab, the Moscow-based anti-virus maker, will open a Swiss data center after allegations that Russian hackers exploited the company’s software to spy on customers. The company said the new location would help it “rebuild trust.”

  • The Facebook ad dump shows the true sophistication of Russia’s influence operation

    The massive trove of Facebook ads House Intelligence Committee Democrats released last Tuesday offers a breathtaking view of the true sophistication of the Russian government’s digital operations during the 2016 presidential election. Many stories have already been written about the U.S. intelligence community’s investigation of the hacking operation Russian intelligence services carried out to influence the election in favor of then-candidate Donald Trump. Derek Hawkins writes that the more than 3,000 “incredibly specific and inflammatory” Russian ads released last week allow us for the first time to “have a swath of empirical and visual evidence of Russia’s disinformation campaign.”

  • War on fake news could be won with the help of behavioral science

    Facebook CEO Mark Zuckerberg recently acknowledged his company’s responsibility in helping create the enormous amount of fake news that plagued the 2016 election – after earlier denials. Yet he offered no concrete details on what Facebook could do about it. Fortunately, there’s a way to fight fake news that already exists and has behavioral science on its side: the Pro-Truth Pledge project. I was part of a team of behavioral scientists that came up with the idea of a pledge as a way to limit the spread of misinformation online. Two studies that tried to evaluate its effectiveness suggest it actually works.

  • Russia conducted "unprecedented, coordinated" attacks on U.S. voting systems in 2016: Senate Intelligence Committee

    Hackers affiliated with the Russian government conducted an “unprecedented, coordinated” campaign against the U.S. voting system, including successfully penetrating a few voter-registration databases in 2016, the Senate Intelligence Committee has concluded. The cyberattacks targeted at least eighteen states, and possibly three more. “Russian actors scanned databases for vulnerabilities, attempted intrusions, and in a small number of cases successfully penetrated a voter registration database,” the committee said in an interim report released Tuesday.

  • Russian bots did “influence the General Election by promoting Jeremy Corbyn”: Study

    An examination by Swansea University and the Sunday Times found that Russian government bots distributed thousands of fake posts on social media in the run-up to Britain’s election last June, aiming to help Labour Party leader Jeremy Corbyn win. Corbyn did not win, but Labour’s unexpectedly strong results defied predictions and weakened Prime Minister Theresa May in the process. The methodology of the Russian government’s pro-Corbyn social media campaign was similar to that of the Kremlin’s broad disinformation campaign to help Donald Trump win the 2016 U.S. presidential election.

  • The “European Approach” to fighting disinformation: Lessons for the United States

    The European Commission published a communication on 26 April to the European Council and Parliament outlining the “European Approach” to combatting disinformation. The report provides an important opportunity for reflection across the transatlantic space, as the United States seeks to inoculate its democracy from ongoing hostile foreign interference activities. Takeaways from the “European Approach” to fighting disinformation can help U.S. policymakers develop more targeted policy measures, and identify potential shortcomings in the U.S. response.

  • Enemies of the state: Russia tracked Russian émigrés in the U.S.

    Last month the United States expelled 60 Russian diplomats in solidarity with the United Kingdom, after Russian intelligence operatives poisoned former Russian spy Sergei Skripal and his daughter in Salisbury, England, in March. Among those expelled were intelligence operatives who had been tracking Russian defectors and their families in the United States, probably setting the stage for killing some of them as “enemies of the state.”

  • European Commission to call out Russia for “information warfare”

    The European Commission is set to single out Russia directly for what it calls Moscow’s “information warfare” as part of EU efforts to fight back against online disinformation campaigns considered a threat to European security. The draft of a communique seen by RFE/RL states that “mass online disinformation campaigns are being widely used by a range of domestic and foreign actors to sow distrust and create societal tensions, with serious potential consequences for our security.”

  • Federal IT, communications technology supply chain vulnerable to Chinese sabotage, espionage

    A new report examines vulnerabilities in the U.S. government information and communications technology (ICT) supply chains posed by China. The report issues a warning about the extent to which China has penetrated the technology supply chain, and calls on the U.S. government and industry to develop a comprehensive strategy for securing their technology and products from foreign sabotage and espionage.

  • Deterring foreign interference in U.S. elections

    A new study analyzes five million political ads on hot-button issues that ran on Facebook in the run-up to the 2016 election. Voters in swing states like Wisconsin and Pennsylvania were disproportionately targeted with ads featuring divisive issues like guns, immigration, and race relations. The divisive ads were purchased by 228 groups; 121 of these groups had no publicly trackable information.

  • Ten legislative proposals to defend America against foreign influence operations

    More than a year after Russia’s broad hacking and disinformation campaign of interference in the 2016 presidential election, and with midterm elections looming on the horizon, Congress and the Trump administration have not taken any clear action to increase U.S. defenses against the foreign interference threat. There are important steps we can, and must, take to defend our institutions against adversaries who seek to undermine them. Many of Russia’s tactics have exploited vulnerabilities in our societies and technologies, and loopholes in our laws. Some of the necessary steps will involve long-term work; others will require clear action by the Executive Branch to unite Americans against the threat we face, and measures both to deter such interference and to raise its costs.

  • Tracking illicit Russian financial flows

    Trillions of dollars in capital flows into the United States annually, and trillions of dollars in payments are cleared through New York daily. No one knows exactly whom the funds belong to, where they are held, or how they are deployed. No one knows because the U.S. government does not track the money — but it could if it wanted to. What is known is that Russia, other countries of the Commonwealth of Independent States, and China are the primary drivers of non-transparent capital flows worldwide.

  • New strategies for countering Russian social media influence in Eastern Europe

    Russia is waging a social media campaign in the Baltics, Ukraine, and nearby states to sow dissent against neighboring governments, as well as NATO and the European Union. “Nowhere is this threat more tangible than in Ukraine, which has been an active propaganda battleground since the 2014 Ukrainian revolution,” said the lead author of a new RAND report. “Other countries in the region look at Russia’s actions and annexation of Crimea and recognize the need to pay careful attention to Russia’s propaganda campaign.”

  • It’s not just Facebook: Countering Russia’s social media offensive

    Russian influence operations exploit the vulnerabilities of social media platforms to disseminate false narratives and amplify divisive content in order to undermine democracies, divide societies, and weaken Western alliances. In conducting these operations, the Kremlin employs a variety of tools across the social media space, including fake accounts/personas, political advertisements, bot networks, and traditional propaganda outlets. Additionally, Russian influence operations utilize a range of social media platforms, each with a different role, to distract public discussion, foment social unrest, and muddle the truth.