• Using AI, machine learning to understand extent of online hate

    The Anti-Defamation League’s (ADL) Center for Technology and Society (CTS) announced preliminary results from an innovative project that uses artificial intelligence, machine learning, and social science to study what is and is not hate speech online. The project’s goal is to help the tech industry better understand the growing amount of hate online. CTS has collaborated with the University of California, Berkeley’s D-Lab since April 2017 to develop the Online Hate Index. ADL and the D-Lab have created an algorithm that has begun to learn the difference between hate speech and non-hate speech. The project has completed its first phase, and its early findings are described in a report released today. In a promising early result, ADL and the D-Lab found that the learning model reliably identified hate speech between 78 and 85 percent of the time.
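    The kind of supervised text classifier the project describes can be sketched in a few lines. Below is a minimal bag-of-words Naive Bayes sketch in plain Python; the toy training examples and labels are entirely hypothetical, and the real Online Hate Index uses a much larger human-labeled corpus and a more sophisticated model.

```python
import math
from collections import Counter

# Hypothetical toy data: (comment, label) where 1 = hate speech, 0 = not.
# The real ADL/D-Lab corpus consists of human-labeled online comments.
TRAIN = [
    ("we should welcome everyone to the discussion", 0),
    ("great analysis thanks for sharing", 0),
    ("those people are subhuman and should disappear", 1),
    ("go back where you came from you vermin", 1),
]

def train_nb(examples):
    """Fit a bag-of-words multinomial Naive Bayes model."""
    word_counts = {0: Counter(), 1: Counter()}
    class_counts = Counter()
    vocab = set()
    for text, label in examples:
        tokens = text.lower().split()
        word_counts[label].update(tokens)
        class_counts[label] += 1
        vocab.update(tokens)
    return word_counts, class_counts, vocab

def predict(model, text):
    """Return the most likely label under Laplace-smoothed log-probabilities."""
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    scores = {}
    for label in class_counts:
        score = math.log(class_counts[label] / total)  # log prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for tok in text.lower().split():
            # Counter returns 0 for unseen tokens; +1 is Laplace smoothing.
            score += math.log((word_counts[label][tok] + 1) / denom)
        scores[label] = score
    return max(scores, key=scores.get)

model = train_nb(TRAIN)
print(predict(model, "thanks for the great discussion"))  # → 0
print(predict(model, "those vermin should go back"))      # → 1
```

    In practice, a model’s reliability (the 78–85 percent figure cited above) would be measured on held-out labeled comments the model never saw during training.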

  • Trump supporters, extreme right “share widest range of junk news”: Study

    A network of Donald Trump supporters shares the widest range of “junk news” on Twitter, while on Facebook the same is true of a network of extreme far-right conservatives, according to analysis by Oxford University. On Twitter, the researchers find, the Trump-supporter network not only shares the widest range of junk news but also circulates more of it than all other political audience groups combined. On Facebook, extreme hard-right pages – distinct from Republican pages – both share the widest range and circulate the largest volume of junk news compared with all the other audiences; specifically, a group of “hard conservatives” accounts for the majority of junk news traffic in the sample. Junk news sources are defined as those that deliberately publish misleading, deceptive, or incorrect information purporting to be real news about politics, economics, or culture. This type of content can include various forms of extremist, sensationalist, and conspiratorial material, as well as masked commentary and fake news.

  • “Jackpotting” drains millions from U.S. ATMs

    ATMs across the country are being targeted by a wave of criminals in search of an illegal high-tech payday. The Secret Service calls this phenomenon “jackpotting,” and is warning that attacks on U.S. banks are imminent. It is a modern-day version of a bank robbery, but no weapons are used — only malware, a small device or two, and a special key that can be purchased on the Internet. When cyberattackers take control of the machine, cash spews out of the ATM like a Las Vegas jackpot. An ASU professor is helping combat such cyberattacks through intelligence-gathering.

  • Russian Tumblr trolls posed as black activists to stoke racial resentment ahead of 2016 U.S. election

    Internet trolls working for the Russian government posed as black activists on Tumblr to share political messages before the 2016 U.S. presidential election, BuzzFeed reports. As was the case with the fake accounts created by Russian government operatives on other social media platforms such as Facebook, Twitter, and Instagram, the fake Tumblr accounts aimed to help Donald Trump win the 2016 election by spreading messages that stoked racial and ethnic resentment and intensified political polarization. A digital forensic analysis tied the fake Tumblr accounts to the St. Petersburg-based Internet Research Agency (IRA), a hacking and disinformation organization employed by the Kremlin to disseminate fake news and commentary on social media. The IRA’s activity is part of a broad Kremlin campaign to weaken Western democracies and undermine organizations such as NATO and the EU.

  • Faraday rooms, air gaps can be compromised, and leak highly sensitive data

    Faraday rooms or “cages” designed to prevent electromagnetic signals from escaping can nevertheless be compromised and leak highly sensitive data, according to new studies. Air-gapped computers holding an organization’s most sensitive data are sometimes further secluded in such hermetically sealed rooms or enclosures, so that stray electromagnetic signals cannot be picked up remotely by eavesdropping adversaries. Researchers from Ben-Gurion University have now shown for the first time that even this combination – a Faraday room plus an air-gapped computer disconnected from the internet – will not deter sophisticated cyberattackers.

  • Digital dark age fears stoked by Davos elite doing little to address cybersecurity

    Business leaders who recently convened in Davos for the annual World Economic Forum fretted over the various catastrophes that could hit the globe hard, and – given the recent spate of cyberattacks – cybersecurity was high on the agenda. The end result was the launch of a Global Center for Cybersecurity (GCC) with a clear mission to “prevent a digital dark age.” The GCC undoubtedly offers a reasonable proposition to nation states, urging them to collaborate on overcoming cyber threats in a coordinated way. But for such a noble goal to work, it requires a deeper resolve to deliver and a level of national commitment unprecedented in previous efforts. Given the increased global uncertainty, there is as yet little reason for such faith.

  • Record-breaking efficiency for secure quantum memory storage

    Researchers have broken through a key barrier in quantum memory performance. Their work has enabled the first secure storage and retrieval of quantum bits. The researchers have more than doubled the efficiency of optical qubit storage—from 30 percent to close to 70 percent—making secure storage and retrieval possible. Quantum memory is essential for future quantum networks, and the ability to synchronize quantum bits has applications in long-distance quantum communication protocols and computing algorithms. Crucially, once storage efficiency exceeds 50 percent, the retrieved qubit is necessarily the best copy in existence – the threshold that makes secure storage protocols possible.

  • Misinformation campaigns, social media, and science

    In some key domains of public life there appear to be coordinated efforts to undermine the reputation of science and innovation. Scientists now protest in the streets just to get governments to base policy on scientific evidence. Long-held scientific consensus on issues like the causes and consequences of climate change or the importance of vaccines for public health is increasingly contested. A new initiative will examine the interplay between systematic misinformation campaigns, news coverage, and increasingly important social media platforms for public understanding of science and technological innovation.

  • Some real “bombshell news” in the Mueller investigation

    Former Trump team legal spokesperson Mark Corallo, in the summer of 2017, had concerns that White House communications director Hope Hicks might be considering obstructing justice after a comment she made in a conference call about emails between Donald Trump Jr. and Russians with ties to the Kremlin. “Mark Corallo is a pro’s pro who went to work for the Trump legal team completely on board and who wanted to help the president … well, make America great again. When he left after two months with some reports that he was troubled by what he was seeing … that was a deeply ominous sign,” Jim Geraghty writes in National Review. “If Corallo ends up offering some sort of critical testimony, this is not because he’s a Judas or because he’s part of the establishment or some sort of ‘Deep State’ sellout. It’s because he saw stuff that genuinely struck him as either illegal or unethical or both and he’s not the kind of person who’s willing to lie under oath about it.”

  • Putin's postmodern war with the West; disinformation vaccination; firewalling democracy, and more

    · Putin’s postmodern war with the West

    · Firewalling democracy: Federal inaction on a national security priority

    · Twitter has notified at least 1.4 million users that they saw Russian propaganda during the election

    · The disinformation vaccination

    · Fear and loathing in Russia’s Catalonia: Moscow’s fight against federalism

    · What was Russia’s spy chief doing in Washington last week? Probably playing the Trump administration … again.

    · Keeping DOJ and FBI safe from a partisan president and Congress

    · Why the Russia probe demolished one lobbying firm but spared another

    · Electronic warfare trumps cyber for deterring Russia

  • Wanted: A firewall to protect U.S. elections

    As the FBI and Congress work to unravel Russia’s hacking of the 2016 presidential election and learn whether anyone in Donald Trump’s campaign supported the effort, one thing has become clear: U.S. elections are far more vulnerable to manipulation than was thought. A U.S. Department of Homeland Security warning and offer last year to help state election officials protect voter registration rolls, voting machines, and software from tampering was coolly received, perhaps out of skepticism or innate distrust of federal interference in a domain historically controlled by the states. Now, as federal and state officials are partnering to examine voting and election security, a new initiative at Harvard Kennedy School (HKS) is working to shore up another at-risk component of the U.S. election system: political campaigns.

  • Critical infrastructure firms face crackdown over poor cybersecurity

    An EU-wide cybersecurity law – the Network and Information Systems (NIS) Directive – is due to come into force in May to ensure that organizations providing critical national infrastructure services have robust systems in place to withstand cyberattacks. The legislation will insist on a set of cybersecurity standards that adequately address events such as last year’s WannaCry ransomware attack, which crippled some ill-prepared NHS services across England. But after a consultation process in the U.K. ended last autumn, the government had been silent until now on its implementation plans for the forthcoming law. A set of 14 guiding principles was drawn up, with the NCSC providing detailed advice, including helpful links to existing cybersecurity standards. However, the cyber assessment framework, originally promised for release in January this year, will not be published by the NCSC until late April – a matter of days before the NIS Directive comes into force. Nonetheless, the directive represents a good drive to improve cybersecurity standards in essential services, and it is supported by sensible advice from the NCSC, with more to come. It would be a shame if these positive aspects ended up obscured by hype and panic over fines.

  • Cyber incidents doubled in 2017

    The Online Trust Alliance (OTA) has just released its Cyber Incident & Breach Trends Report. OTA’s annual analysis found that cyber incidents targeting businesses nearly doubled from 82,000 in 2016 to 159,700 in 2017. Since the majority of cyber incidents are never reported, OTA believes the actual number in 2017 could easily exceed 350,000. The report analyzes data breaches, ransomware targeting businesses, business email compromise (BEC), distributed denial of service attacks (DDoS), and takeover of critical infrastructure and physical systems over the course of a year.
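    As a quick back-of-the-envelope check on the figures quoted above (the only numbers taken from the report are the two annual totals), the year-over-year growth works out as follows:

```python
# Sanity check of OTA's year-over-year incident figures.
incidents_2016 = 82_000
incidents_2017 = 159_700

growth = incidents_2017 / incidents_2016
print(f"{growth:.2f}x")  # → 1.95x, i.e. "nearly doubled"
```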

  • Hybrid warfare: Russia is “arch exponent” of the disappearing “distinct states of ‘peace’ and ‘war’”: U.K. military chief

    The West’s adversaries “have become masters at exploiting the seams between peace and war. What constitutes a weapon in this grey area no longer has to go ‘bang’. Energy, cash - as bribes - corrupt business practices, cyber-attacks, assassination, fake news, propaganda and indeed military intimidation are all examples of the weapons used to gain advantage in this era of ‘constant competition,’ and the rules-based international architecture that has assured our stability and prosperity since 1945 is, I suggest therefore, threatened,” Sir Nicholas Carter, the British Army’s Chief of the General Staff, said last week. “The deduction we should draw from this is that there is no longer two clear and distinct states of ‘peace’ and ‘war’; we now have several forms. Indeed the character of war and peace is different for each of the contexts in which these ‘weapon systems’ are applied,” he added. “The arch exponent of this [new approach to war] is Russia…. I believe it represents the most complex and capable state-based threat to our country since the end of the Cold War. And my fellow Chiefs of Staff from the United States, France, and Germany shared this view.”

  • Artificial intelligence is the weapon of the next Cold War

    As during the Cold War that followed the Second World War, nations are developing and building weapons based on advanced technology. During the Cold War, the weapon of choice was nuclear missiles; today it is software, whether used for attacking computer systems or targets in the real world. Russian rhetoric about the importance of artificial intelligence is picking up – and with good reason: as AI software develops, it will be able to make decisions based on more data, and to make them more quickly, than any human can. As someone who researches the use of AI for applications as diverse as drones, self-driving vehicles, and cybersecurity, I worry that the world may be entering – or may already be in – another cold war, fueled by AI. In a recent meeting at the Strategic Missile Academy near Moscow, Russian President Vladimir Putin suggested that AI may be the way Russia can rebalance the power shift created by the U.S. outspending Russia nearly 10-to-1 on defense each year. Russia’s state-sponsored RT media reported AI was “key to Russia beating [the] U.S. in defense.” With Russia embracing AI, nations that do not – or that restrict AI development – risk becoming unable to compete, economically or militarily, with countries wielding developed AI. Advanced AI can create advantages for a nation’s businesses, not just its military, and those without it may be severely disadvantaged. Perhaps most importantly, though, having sophisticated AIs in many countries could provide a deterrent against attacks, as nuclear weapons did during the Cold War.