• Flaws in metrics for user login systems

    How good is the research on the success or failure of the system that verifies your identity when you log into a computer, smartphone or other device? Chances are it’s not good, and that’s a major security and privacy issue that should be fixed.

  • Why do some people believe the Earth is flat?

    Why, despite overwhelming scientific evidence to the contrary, is the flat-Earth movement gaining traction in the twenty-first century? One expert says that, in part, it is due to a general shift toward populism and a distrust in the views of experts and the mainstream media. There is an “increasing distrust in what we once considered to be the gatekeepers of knowledge – like academics, scientific agencies, or the government,” she says. In this kind of environment, “it becomes really easy for once-fringe views to gain traction.”

  • The Russia investigation will continue

    Special Counsel Robert Mueller’s Russia probe is over, but the FBI is almost certain to continue its counterintelligence investigation into Russian espionage efforts related to the 2016 election. The FBI will continue to search for Americans working on behalf of the Kremlin. John Sipher writes in The Atlantic that the inability to establish that the Trump campaign conspired in a “tacit or express” agreement with the Russian government is not surprising. Most espionage investigations come up empty unless and until they get a lucky break. That does not mean there was no espionage activity in relation to the 2016 election. Every previous Russian political-warfare campaign was built on human spies. Russian “active measures”—propaganda, information warfare, cyberattacks, disinformation, use of forgeries, spreading conspiracies and rumors, funding extremist groups and deception operations—rely on human actors to support and inform their success. Counterintelligence professionals must doubt that Russia could have pulled off its election-interference effort without the support of spies burrowed into U.S. society or institutions.

  • Russia’s long game: Paralyze Europe’s ability to act in its own self-interest

    With the European parliamentary elections approaching in less than a month’s time, Russia’s persistent disinformation campaign to subvert and undermine Europe’s democratic institutions is a source of growing worry. Russia is playing a long game in Europe: its objective is not merely to influence the outcome of any particular election, but rather to broadly subvert the efficacy of Europe’s democratic institutions, fuel widespread social fragmentation and mistrust, and ultimately paralyze Europe’s ability to act in its own self-interest and to defend our values.

  • Christchurch-style terrorism reaches American shores

    In the wake of the attack at the Chabad synagogue of Poway, California, it is important to examine the digital communications surrounding the shooting and what they suggest about future terrorist activity.

  • Blinded and confused: Security vulnerabilities in “smart home” IoT devices

    Researchers have identified design flaws in “smart home” Internet-of-Things (IoT) devices that allow third parties to prevent the devices from sharing information. The flaws can be used, for example, to prevent a security system from signaling that a break-in has occurred or from uploading video of intruders.

  • Influence operations in the digital age

    Influence operations in the digital age are not merely propaganda with new tools. They represent an evolved form of manipulation that presents actors with endless possibilities — both benign and malignant. While the origins of this new form are semi-accidental, it has nonetheless opened up opportunities for the manipulation and exploitation of human beings that were previously inaccessible. Zac Rogers, Emily Bienvenue, and Maryanne Kelton write in War on the Rocks that digital influence operations, now conducted across the whole of society, indicate that we are only at the beginning of a new era of population-centric competition. With regard to propaganda, the fundamental distinction between the old and the new lies in the difference between participatory and passive forms of information consumption.

  • China: Determined to dominate cyberspace and AI

    China is chasing dominance in emerging artificial intelligence (AI) technologies in both the private and military sectors, as a central part of its effort to be the leading global cyber power, Chris C. Demchak writes in the Bulletin of the Atomic Scientists. The rise of AI – which, like machine learning, quantum computing, and other emerging technologies, is a subset of the broader cyber domain – does not herald a new arms race equivalent to that of the Cold War. Rather, the concern should be the profound disruption to the existing Westernized global order. In the 1990s, Western nations, led by the United States, created what Demchak calls a “Westernized national creation”: cyberspace. Cyberspace, however, has created a multitude of ubiquitous, embedded vulnerabilities whose easy exploitation directly accelerated the rise of an otherwise impoverished authoritarian and aggressive China. Today, no single democracy has, on its own, the scale and resources to match the foreknowledge and strategic coherence of a newly confident and assertive China. There is thus a need to create a Cyber Operational Resilience Alliance (CORA) to provide the scale and collective strategic coherence required to ensure the future wellbeing and security of democracy in an overwhelmingly authoritarian, post-Western, cybered world.

  • Studying Russian disinformation campaigns

    An interdisciplinary research team from communications, anthropology, and political science will study Russian disinformation campaigns in three former Soviet republics as part of a $1.6 million Minerva research grant awarded through the U.S. Department of Defense.

  • Tech fixes cannot protect us from disinformation campaigns

    More than technological fixes are needed to stop countries from spreading disinformation on social media platforms like Facebook and Twitter, according to two experts. They argue that policymakers and diplomats need to focus more on the psychology behind why citizens are so vulnerable to disinformation campaigns.

  • Enabling more comprehensive tests on high-risk software

    We entrust our lives to software every time we step aboard a high-tech aircraft or modern car. A long-term research effort has developed new tools to make this type of safety-critical software even safer.

  • New Zealand, France leading an effort to ban terrorists from social media

    New Zealand and France will host a meeting with technology companies and world leaders to develop a strategy to block terrorists from social media. The meeting comes in the wake of the March shootings at two mosques in Christchurch.

  • White supremacists use social media to aid, abet terror

    Before carrying out mass shooting attacks in Pittsburgh and New Zealand, white supremacist terrorists Robert Bowers and Brenton Tarrant frequented fringe social networking sites which, according to a new study, serve as echo chambers for the most virulent forms of anti-Semitism and racism, and active recruiting grounds for potential terrorists.

  • Can WiFi networks be completely secure?

    There are many ways in which hackers and crackers can break into a Wi-Fi network. It is trivial if the network uses out-of-date security protocols or weak passwords. But even if the system is set up with the latest security measures, strong passwords, and firewall and malware protection, there are still ways in which a malicious third party might gain access to such a network.
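
    The “weak password” point is ultimately arithmetic: the keyspace an attacker must search grows exponentially with passphrase length and alphabet size. The short Python sketch below gives a rough worst-case estimate; the guess rate it assumes (ten billion guesses per second) is an illustrative figure, not a value taken from the article.

        # Rough worst-case estimate of how long an offline guessing attack
        # needs to exhaust a Wi-Fi passphrase's keyspace. The guess rate is
        # an assumed figure for illustration only; real attack hardware varies.
        ASSUMED_GUESSES_PER_SECOND = 1e10

        def brute_force_seconds(alphabet_size: int, length: int) -> float:
            """Worst-case seconds to try every possible passphrase."""
            return alphabet_size ** length / ASSUMED_GUESSES_PER_SECOND

        SECONDS_PER_YEAR = 365 * 24 * 3600

        # An 8-character all-lowercase passphrase vs. a 16-character passphrase
        # drawn from roughly 95 printable ASCII characters.
        print(f"8 lowercase chars : {brute_force_seconds(26, 8):,.0f} seconds")
        print(f"16 printable chars: "
              f"{brute_force_seconds(95, 16) / SECONDS_PER_YEAR:.2e} years")

    At that assumed rate the first case falls in well under a minute, while the second runs to the order of 10^14 years – which is why long, random passphrases matter even when the underlying protocol is sound.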

  • Actively used private keys on the Ethereum blockchain facilitate cryptocurrency theft

    Researchers at Independent Security Evaluators (ISE) have discovered 732 actively used private keys on the Ethereum blockchain, and found that poorly implemented private key generation is facilitating the theft of cryptocurrency.
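
    To make concrete what “poorly implemented private key generation” can mean, the sketch below contrasts a key drawn from a guessable pseudo-random seed with one drawn from a cryptographically secure source. It is a generic illustration of the weakness, with an arbitrarily chosen seed, not a reconstruction of the specific flaws ISE documented.

        import random
        import secrets

        # An Ethereum private key is, in essence, a 256-bit integer (strictly,
        # one below the secp256k1 group order); its security rests entirely on
        # that integer being unpredictable.

        # Weak: a non-cryptographic PRNG seeded with a guessable value. An
        # attacker who suspects this pattern can regenerate every candidate key
        # by iterating over the (comparatively tiny) space of likely seeds.
        weak_key = random.Random(12345).getrandbits(256)

        # Sound: a cryptographically secure random source covering the full
        # 256-bit space.
        strong_key = secrets.randbits(256)

        print(f"weak key  : {weak_key:064x}")
        print(f"strong key: {strong_key:064x}")

    Keys of the first kind are what turn an address into an open wallet: anyone able to enumerate the generation procedure can recompute the key, derive the corresponding address, and sweep whatever funds arrive there.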