  • Undeterred cyber adversaries require a more aggressive American response

    America is under attack. In this case, rather than bombs and bullets, undeterred adversaries are using the cyber domain. Every day, they launch thousands of cyberattacks against American individuals, companies, and government agencies—persistently and incrementally chipping away at our security. Bradley Bowman and Annie Fixler write in RealClear Defense that this relentless barrage may seem like an inevitable reality of 21st-century life. However, given the stakes for American national security, simply shrugging and accepting the cyber status quo would be a dangerous mistake. The U.S. has established deterrence in other warfighting domains. Washington can—and must—do the same in the cyber domain.

  • A modest proposal for preventing election interference in 2020

    The years since the 2016 election have been a national trauma that the U.S. shouldn’t be eager to revisit. Yet almost no policy changes have been made as a result of what the country has learned from the Mueller investigation and related events. In this post, I’d like to start assembling a menu of possible reforms that address the lessons learned from what Lawfare sometimes calls L’Affaire Russe. Stewart Baker writes in Lawfare that this is a fraught exercise because the narratives about L’Affaire Russe have diverged so far between Trump supporters and Trump detractors that almost any proposal for change will implicitly contradict the narrative of one camp or the other. “So, to save time, here are my most salient biases in the matter: I’m generally comfortable with most of President Trump’s policy instincts; I’ve spent a lifetime working with intelligence and law enforcement professionals who do battle every day with very real enemies of the United States, Russia among them; and I believe in them and in making government work, which makes me uncomfortable with President Trump’s character and lack of policy-making fine-motor skills,” Baker writes. “With that mixed perspective, I am hopeful there may be room for at least some agreement on things we ought to do differently in future.”

  • Information operations in the digital age

    From the Cambridge Analytica scandal to the spread on social media of anti-Rohingya content in Myanmar and the interference with elections the world over, the past decade has seen democracies around the world become the target of a new kind of information operation.

  • ARCHANGEL: Securing national archives with AI and blockchain

    Researchers are using state-of-the-art blockchain and artificial intelligence technologies to secure the digital government records of national archives across the globe – including those of the U.K., Australia, and the United States.

  • U.S. measles cases top record, putting measles elimination status at risk

    The U.S. Centers for Disease Control and Prevention (CDC) said Thursday that 971 cases of measles have been reported this year, topping the 1994 modern-record level, and it warned that the United States could lose its measles elimination status. Amid the growing measles crisis, the conspiracy-fueled anti-vaccination misinformation campaign continues unabated on social media. Meanwhile, DHS is mulling a travel ban on measles-infected individuals.

  • Outsmarting deep fakes: AI-driven imaging system protects authenticity

    To thwart sophisticated methods of altering photos and video, researchers have demonstrated an experimental technique to authenticate images throughout the entire pipeline, from acquisition to delivery, using artificial intelligence (AI).

  • Facebook, Twitter shut down thousands of Iranian fake accounts

    Social media giant Facebook has announced that it removed dozens of accounts, pages, and groups linked to Iran for “coordinated inauthentic behavior.” The company also disabled several accounts on its sister platform, Instagram.

  • How Russia found a disinformation haven in America

    The Mueller Report definitively established that the Russians, both through the Main Intelligence Directorate (GRU) and the Internet Research Agency (IRA), undertook information operations campaigns. This has been reasonably clear for a long time. Rawi Abdelal and Galit Goldstein write in the National Interest that framing the Russian disinformation campaign issue through the debate of whether Donald Trump could have won the presidency without Russian help, or whether the Trump campaign actively conspired with Russia, misses the point. “The goal of the information operations campaigns was not simply to elect Donald Trump president. Nor was it only to polarize American politics further. The point was, rather, to continue undermining America’s ability to agree on the true and not-true,” they write. And Russia’s strategy hinged on the fact that it is nearly impossible for people stuck in alternate realities with competing, incompatible truth claims to undertake civil discourse.

  • More than security: Passwords serve a personal purpose

    A study has shown that people build their passwords from personal information for a variety of reasons, including invoking important memories or setting future goals. The study found that around half of the respondents infused their passwords with autobiographical memories.

  • Russia’s would-be Windows replacement gets a security upgrade

    For the first time, Russia has granted its highest security rating to a domestically developed operating system, deeming Astra Linux suitable for communications of “special importance” across the military and the rest of the government. The designation clears the way for Russian intelligence and military workers who had been using Microsoft products on office computers to use Astra Linux instead. Patrick Tucker writes in Defense One that although Russian officials used Windows for secure communications, they heavily modified the software and subjected Windows-equipped PCs to lengthy and rigorous security checks before putting the computers into use. The testing and analysis were meant to satisfy concerns that vulnerabilities in Microsoft operating systems could be exploited for hacking by countries like the United States. Such evaluations could take three years, according to the newspaper.

  • Minds, the “anti-Facebook,” has no idea what to do about all the neo-Nazis

    Minds is home to neo-Nazis, and it wants its users to help decide what content stays on the site. Ben Makuch and Jordan Pearson write in Motherboard that Minds is a US-based social network that bills itself as being focused on transparency (its code is open source), free speech, and cryptocurrency rewards for users. Much of the recent media coverage around Minds, which launched in 2015, has focused on how it challenges social media giants and on its adoption of cryptocurrency, while also noting that the site’s light-touch approach to content moderation has led to a proliferation of far-right viewpoints being shared openly on its platform.

  • Facebook’s dystopian definition of “fake”

    Every time another “fake video” makes the rounds, its menace gets rehashed without those discussing it establishing what “fakeness” means in the first place. The latest one came last week, a doctored video of Nancy Pelosi. President Donald Trump tweeted a reference to the video; his personal attorney Rudy Giuliani shared it, too, although Giuliani later deleted his post. Ian Bogost writes in The Atlantic that these sorts of events are insidious because it’s hard to form a response that isn’t a bad one. Talking about the video just gives its concocted message more oxygen. Ignoring it risks surrendering truth to the ignorant whims of tech companies. The problem is, a business like Facebook doesn’t believe in fakes. For it, a video is real so long as it’s content. And everything is content.

  • Unknowingly loading malicious content from “trusted” sites

    New research from CSIRO’s Data61, the data and digital specialist arm of Australia’s national science agency, questions the “trustability” of websites and, in a world first, quantifies the extent to which the trust model of today’s World Wide Web is fundamentally broken.

  • Doctored video of Nancy Pelosi shows social media giants ill-prepared for 2020

    Hours after House Speaker Nancy Pelosi addressed a conference Wednesday, a distorted video of the California Democrat’s conversation began spreading across the internet. The manipulated clip, slowed to make Pelosi sound as if she were slurring her words, racked up millions of views on Facebook the following day. It was posted to YouTube, and on Thursday night was given a boost on Twitter when Rudy Giuliani, President Trump’s personal lawyer and former mayor of New York, shared a link with his 318,000 followers. Sam Dean and Suhauna Hussain write in the Los Angeles Times that by Friday, the three social media giants were forced to respond to this viral instance of political fakery. How they dealt with the issue, three years after being blindsided by a wave of fake news and disinformation in the 2016 election cycle, may serve as a harbinger of what’s to come in 2020.

  • The many faces of foreign interference in European elections

    Citizens of the European Union’s 28 member states go to the polls this week to choose their representatives to the European Parliament. Following Russian interference in several high-profile elections over the past three years, European governments are on high alert for signs of such meddling on social media or in electoral IT systems. Recent events in Austria and Italy show that foreign authoritarian actors are finding other under-examined, but equally insidious ways to infiltrate campaigns and harm democracy in Europe.