  • April Fools hoax stories may offer clues to help identify “fake news”

    Studying April Fools hoax news stories could offer clues to spotting ‘fake news’ articles, new research reveals. Researchers interested in deception have compared the language used within written April Fools hoaxes and fake news stories.
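
    The study’s specifics aside, the general idea of using textual features to flag deceptive writing can be sketched with a toy classifier. The snippet below is purely illustrative, not the researchers’ method or data, and the two-document corpus is a placeholder.

    ```python
    # Purely illustrative sketch: a toy bag-of-words classifier separating
    # deceptive (hoax-style) text from genuine text. The tiny "corpus" below
    # is a placeholder, not data from the study.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = [
        "april fools announcement of flying cars available tomorrow only",  # hoax-style
        "city council approves budget for road repairs next fiscal year",   # genuine-style
    ]
    labels = [1, 0]  # 1 = deceptive, 0 = genuine

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(texts, labels)
    print(model.predict(["free flying cars announced for one day only"]))
    ```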

  • In disasters, Twitter users with large networks get out-tweeted

    New study shows that when it comes to sharing emergency information during natural disasters, timing is everything. The study on Twitter use during hurricanes, floods and tornadoes offers potentially life-saving data about how information is disseminated in emergency situations, and by whom. Unlikely heroes often emerge in disasters, and the same is true on social media.

  • British oversight body: Security flaws in Huawei 5G networks

    A British oversight board has slammed the Chinese telecom giant Huawei for software security flaws. The report, however, stopped short of blaming Chinese intelligence agencies for the engineering defects. The United States is concerned that Huawei is a front for the Chinese intelligence services, and that rolling out Huawei’s 5G system in Europe would open the door for Chinese spying or sabotage.

  • Why the next terror manifesto could be even harder to track

    Just before his shooting spree at two Christchurch, New Zealand mosques, the alleged mass murderer posted a hate-filled manifesto on several file-sharing sites. Soon, the widespread adoption of artificial intelligence on platforms and decentralized tools like IPFS will mean that the online hate landscape will change. Combating online extremism in the future may be less about “meme wars” and user-banning, or “de-platforming,” and could instead look like the attack-and-defend, cat-and-mouse technical one-upsmanship that has defined the cybersecurity industry since the 1980s. No matter what technical challenges come up, one fact never changes: The world will always need more good, smart people working to counter hate than there are promoting it.

  • Drones pose significant cyber, privacy challenges

    Growing drone use in populated areas poses significant risks; without additional safeguards, drones could be attacked by malicious entities and exploited for use in cyberattacks, terrorism, crime, and invasion of privacy.

  • AG: Mueller did not find that Trump’s campaign “conspired with the Russian government” in its 2016 election interference effort

    On Saturday afternoon, Attorney General William Barr sent Congress his “principal conclusions” of the Mueller report. Barr quotes the Mueller report to say that “[T]he investigation did not establish that members of the Trump Campaign conspired or coordinated with the Russian government in its election interference activities.” The Mueller report does not take a position on whether or not Trump engaged in obstruction of justice. Barr writes: “The Special Counsel… did not draw a conclusion — one way or the other — as to whether the examined conduct constituted obstruction.” The AG quotes the report to say that “while this report does not conclude that the President committed a crime, it also does not exonerate him.”

  • Social media create a spectacle society that makes it easier for terrorists to achieve notoriety

    The shocking mass shooting in Christchurch last Friday is notable for using livestreaming video technology to broadcast horrific first-person footage of the shooting on social media. The use of social media technology and livestreaming marks the attack as different from many other terrorist incidents. It is a form of violent “performance crime”: the video streaming is a central component of the violence itself, not somehow incidental to the crime, or a disgusting trophy for the perpetrator to re-watch later. In an era of social media, which is driven in large part by spectacle, we all have a role to play in ensuring that terrorists aren’t rewarded for their crimes with our clicks.

  • Mega European project on cybersecurity and data protection

    A new European Commission cyber project aims to set international standards in cybersecurity and boost the effectiveness of Europe’s security capacities.

  • Russian trolls, bots spread false vaccine information on Twitter

    A study found that Russian trolls and bots have been spreading false information about vaccination, in support of the anti-vaccination movement. The false information was generated by propaganda and disinformation specialists at the Kremlin-affiliated, St. Petersburg-based Internet Research Agency (IRA). The Kremlin employed the IRA to conduct a broad social media disinformation campaign to sow discord and deepen divisions in the United States, and to help Donald Trump win the 2016 presidential election.

  • Cyber toolkit for criminal investigations

    Cybercrimes reached a six-year high in 2017, when more than 300,000 people in the United States fell victim to such crimes. Losses topped $1.2 billion. Cybercriminals can run, but they cannot hide from their digital fingerprints.

  • Dark web marketplace for SSL and TLS certificates

    A thriving marketplace for SSL and TLS certificates—small data files used to facilitate confidential communication between organizations’ servers and their clients’ computers—exists on a hidden part of the internet.
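
    For context on what these certificate files contain, the minimal Python sketch below connects to a server and prints the parsed fields of its certificate (subject, issuer, expiry). The host name is only an illustrative assumption, not one referenced by the report.

    ```python
    import socket
    import ssl

    # Minimal sketch: retrieve and inspect a server's TLS certificate.
    # "example.com" is an illustrative host chosen for this example.
    host = "example.com"
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()  # certificate fields parsed into a dict
            print("subject:", cert["subject"])
            print("issuer: ", cert["issuer"])
            print("expires:", cert["notAfter"])
    ```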

  • Studying how hate and extremism spread on social media

    The ADL and the Network Contagion Research Institute will partner to produce a series of reports that take an in-depth look into how extremism and hate spread on social media – and provide recommendations on how to combat both.

  • Four ways social media platforms could stop the spread of hateful content in the aftermath of terror attacks

    Monitoring hateful content is always difficult, and even the most advanced systems accidentally miss some. But during terrorist attacks the big platforms face particularly significant challenges. As research has shown, terrorist attacks precipitate huge spikes in online hate that overrun platforms’ reporting systems. Many of the people who upload and share this content also know how to deceive the platforms and get around their existing checks. So what can platforms do to take down extremist and hateful content immediately after terrorist attacks? I propose four special measures needed to specifically target the short-term influx of hate.

  • Blocking digital gold diggers

    It is a phenomenon known to almost all of us: you browse the web and suddenly your computer slows down and runs loudly. This could be due to so-called crypto mining: the hijacking of a computer’s processing power to generate cryptocurrencies without the user’s knowledge. New software, called “CoinEater,” blocks crypto mining.
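
    The report does not describe how “CoinEater” works internally; as a hedged illustration of one common approach to blocking in-browser miners, the sketch below flags pages whose scripts load from domains on a mining blocklist. The domain entries and function name are illustrative assumptions.

    ```python
    import re
    from urllib.parse import urlparse

    # Illustrative blocklist approach (not the CoinEater implementation):
    # flag script tags whose source host appears on a list of known mining domains.
    MINING_DOMAINS = {"coinhive.com", "crypto-loot.com"}  # example entries

    def find_mining_scripts(html: str) -> list:
        """Return script URLs on a page whose host is on the blocklist."""
        srcs = re.findall(r'<script[^>]*\ssrc="([^"]+)"', html, flags=re.I)
        return [s for s in srcs if urlparse(s).hostname in MINING_DOMAINS]

    page = '<script src="https://coinhive.com/lib/miner.js"></script>'
    print(find_mining_scripts(page))  # ['https://coinhive.com/lib/miner.js']
    ```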

  • Another Steele Dossier detail appears true

    On the final page of his 35-page dossier, former British intelligence officer Christopher Steele refers to a company, whose name is redacted, that allegedly was used to hack the Democratic Party. Today, the New York Times identifies the company and its owner, Aleksej Gubarev, and says that according to a newly revealed report, the allegations against the Russian technology entrepreneur’s operations check out.