• Drones pose significant cyber, privacy challenges

    Growing drone use in populated areas poses significant risks: without additional safeguards, drones could be exploited by malicious entities for cyberattacks, terrorism, crime, and invasion of privacy.

  • AG: Mueller did not find that Trump’s campaign “conspired with the Russian government” in its 2016 election interference effort

    On Saturday afternoon, Attorney General William Barr sent Congress his “principal conclusions” of the Mueller report. Barr quotes the Mueller report to say that “[T]he investigation did not establish that members of the Trump Campaign conspired or coordinated with the Russian government in its election interference activities.” The Mueller report does not take a position on whether or not Trump engaged in obstruction of justice. Barr writes: “The Special Counsel… did not draw a conclusion — one way or the other — as to whether the examined conduct constituted obstruction.” The AG quotes the report to say that “while this report does not conclude that the President committed a crime, it also does not exonerate him.”

  • Social media create a spectacle society that makes it easier for terrorists to achieve notoriety

    The shocking mass-shooting in Christchurch last Friday is notable for using livestreaming video technology to broadcast horrific first-person footage of the shooting on social media. The use of social media technology and livestreaming marks the attack as different from many other terrorist incidents. It is a form of violent “performance crime”: the video streaming is a central component of the violence itself, not somehow incidental to the crime, nor merely a disgusting trophy for the perpetrator to re-watch later. In an era of social media, which is driven in large part by spectacle, we all have a role to play in ensuring that terrorists aren’t rewarded for their crimes with our clicks.

  • Mega European project on cybersecurity and data protection

    A new European Commission cyber project aims to set international standards in cybersecurity and boost the effectiveness of Europe’s security capacities.

  • Russian trolls, bots spread false vaccine information on Twitter

    A study found that Russian trolls and bots have been spreading false information about vaccination, in support of the anti-vaccination movement. The false information was generated by propaganda and disinformation specialists at the Kremlin-affiliated, St. Petersburg-based Internet Research Agency (IRA). The Kremlin employed the IRA to conduct a broad social media disinformation campaign to sow discord and deepen divisions in the United States, and to help Donald Trump win the 2016 presidential election.

  • Cyber toolkit for criminal investigations

    Cybercrimes reached a six-year high in 2017, when more than 300,000 people in the United States fell victim to such crimes. Losses topped $1.2 billion. Cybercriminals can run, but they cannot hide from their digital fingerprints.

  • Dark web marketplace for SSL and TLS certificates

    A thriving marketplace for SSL and TLS certificates—small data files used to facilitate confidential communication between organizations’ servers and their clients’ computers—exists on a hidden part of the internet.

  • Studying how hate and extremism spread on social media

    The ADL and the Network Contagion Research Institute will partner to produce a series of reports that take an in-depth look into how extremism and hate spread on social media – and provide recommendations on how to combat both.

  • Four ways social media platforms could stop the spread of hateful content in aftermath of terror attacks

    Monitoring hateful content is always difficult, and even the most advanced systems accidentally miss some. But during terrorist attacks the big platforms face particularly significant challenges. As research has shown, terrorist attacks precipitate huge spikes in online hate, overrunning platforms’ reporting systems. Many of the people who upload and share this content also know how to deceive the platforms and get round their existing checks. So what can platforms do to take down extremist and hateful content immediately after terrorist attacks? I propose four special measures which are needed to specifically target the short-term influx of hate.

  • Blocking digital gold diggers

    It is a phenomenon known to almost all of us: you browse the web and suddenly your computer slows down and runs loudly. This could be due to so-called crypto mining — the covert use of a visitor’s computing power to generate cryptocurrencies without the user’s knowledge. New software, called “CoinEater,” blocks crypto mining.
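    The source does not describe how “CoinEater” works internally. A common approach in cryptojacking blockers is to filter requests against a curated blocklist of known mining-pool hosts and mining-script filenames before the browser loads them; the sketch below is a hypothetical illustration of that technique (the host and script names are examples, not CoinEater’s actual lists).

    ```python
    # Hypothetical sketch of blocklist-based cryptojacking defense.
    # Not CoinEater's actual implementation; hosts/scripts are illustrative.
    from urllib.parse import urlparse

    # Real blockers ship curated, regularly updated lists.
    BLOCKED_HOSTS = {"coinhive.com", "coin-hive.com", "cryptoloot.pro"}
    BLOCKED_SCRIPTS = {"coinhive.min.js", "cryptonight.wasm"}

    def should_block(url: str) -> bool:
        """Return True if the URL matches a known mining host or script name."""
        parsed = urlparse(url)
        host = parsed.hostname or ""
        # Block the host itself or any subdomain of a blocked host.
        if any(host == h or host.endswith("." + h) for h in BLOCKED_HOSTS):
            return True
        # Block known mining script filenames regardless of host.
        return parsed.path.rsplit("/", 1)[-1] in BLOCKED_SCRIPTS

    print(should_block("https://coinhive.com/lib/coinhive.min.js"))  # True
    print(should_block("https://example.org/app.js"))                # False
    ```

    Blocklists are cheap and effective against known miners, which is why they are the first line of defense; more sophisticated blockers also watch for sustained high CPU use from a single tab.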

  • Another Steele Dossier detail appears true

    On the final page of his 35-page dossier, former British intelligence officer Christopher Steele refers to a company, whose name is redacted, that allegedly was used to hack the Democratic party. Today, the New York Times identifies the company and its owner, Aleksej Gubarev, and says that according to a newly revealed report, the allegations against the Russian technology entrepreneur’s operations check out.

  • Russia attempted 2018 interference, gearing up to infiltrate election systems in 2020

    Defense Department and Homeland Security officials warn that Russia did try to interfere in the 2018 election, and that the United States is not prepared for what foreign adversaries likely will launch in 2020. One official told lawmakers on the House Appropriations Homeland Security Subcommittee that what keeps him up at night is thinking about the new ways adversaries will attempt to infiltrate U.S. election systems in 2020.

  • Trapdoor found in SwissPost e-voting system

    Researchers have examined the source code published as part of the SwissPost e-voting system, provided by Scytl, and discovered a cryptographic trapdoor. If exploited, researchers say this could allow insiders who ran or implemented the election system to modify votes undetected.
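    The flaw the researchers reported involved trapdoored commitment parameters in the system’s cryptographic shuffle proof. A toy illustration of the general idea, using Pedersen commitments with deliberately tiny, insecure numbers (the parameters below are made up for the demo and do not come from the SwissPost code):

    ```python
    # Toy illustration of a commitment trapdoor (tiny, insecure parameters).
    # A Pedersen commitment c = g^m * h^r mod p is only binding if nobody
    # knows t with h = g^t. An insider who chose h = g^t can later "open"
    # the same commitment to a different message -- the essence of such a
    # trapdoor. Honest setups must generate h verifiably at random.
    p, q = 23, 11          # p = 2q + 1; demo-sized primes only
    g = 4                  # generator of the order-q subgroup mod p
    t = 7                  # the trapdoor: insider picks h as a known power of g
    h = pow(g, t, p)

    def commit(m, r):
        return (pow(g, m, p) * pow(h, r, p)) % p

    m, r = 3, 5
    c = commit(m, r)

    # Equivocate: open c to a different message m2 by solving
    # m + t*r = m2 + t*r2 (mod q)  =>  r2 = r + (m - m2) * t^-1 (mod q)
    m2 = 9
    r2 = (r + (m - m2) * pow(t, -1, q)) % q

    assert commit(m2, r2) == c   # same commitment, two different openings
    ```

    In a voting context, a commitment that can be opened two ways lets an insider swap votes during the shuffle while still producing a proof that verifies, which is why the researchers warned the modification would go undetected.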

  • Fraudulent news, disinformation become “new normal” political tactics

    A new report warns of the risk of fraudulent news and online disinformation becoming a normalized part of U.S. political discourse. The report sounds an alarm that fraudulent news and online disinformation, which distort public discourse, erode faith in journalism, and skew voting decisions, are becoming part of the toolbox of hotly contested modern campaigns.

  • Information literacy must be improved to stop spread of “fake news”

    It is not difficult to verify whether a new piece of information is accurate; however, most people do not take that step before sharing it on social media, regardless of age, social class or gender, a new study has found.