  • Sea level rise requires new forms of decision making

    U.S. cities facing sea level rise need to look beyond traditional strategies for managing issues such as critical erosion and coastal squeeze, according to new research. Civil society initiatives must now play a crucial role in adapting society to climate change, and decision makers must seriously consider the tradeoffs among three options: sea walls, beach nourishment, and relocating coastal infrastructure.

  • Using artificial intelligence to predict criminal aircraft

    The ability to forecast criminal activity has been explored to various lengths in science fiction, but does it hold true in reality? It could for U.S. Customs and Border Protection (CBP). DHS S&T is developing a Predictive Threat Model (PTM) to help CBP’s Air and Marine Operations Center (AMOC) more quickly and efficiently identify and stop nefarious aircraft.

  • New framework for guiding controversial research still has worrisome gaps

    In December the Department of Health and Human Services (HHS) lifted the funding moratorium on Gain of Function (GoF) research, imposed in the wake of the controversial H5N1 projects of 2011. The “Framework for guiding funding decisions about proposed research involving enhanced potential pandemic pathogens” is similar to the January 2017 “P3CO Framework,” and it came with the bonus of restoring funding for such research – but there are still considerable concerns about how GoF research is evaluated and whether these frameworks have really addressed the gaps.

  • Bioengineers today emphasize the crucial ingredient Dr. Frankenstein forgot – responsibility

    Mary Shelley was 20 when she published “Frankenstein” in 1818. Two hundred years on, the book remains thrilling, challenging and relevant — especially for scientists like me whose research involves tinkering with the stuff of life. Talk of “engineering biology” makes a lot of people squeamish, and technology can turn monstrous, but I read Mary Shelley’s “Frankenstein” not as an injunction against bioengineering as such. Rather, the story reveals what can happen when we – scientists and nonscientists alike – run away from the responsibilities that science and technology demand. Victor Frankenstein was certainly careless and perhaps a coward, unable to own up to the responsibility of what he was doing. We now know that science is best conducted with humility, forethought and in the light of day.

  • Extreme weather tests U.K. gas security to the limit

    The National Grid, which manages the U.K.’s energy network, warned that it might not have enough gas to meet demand on March 1, due to plummeting temperatures and issues with supply. It has since withdrawn the warning, saying the market response has boosted supplies. But Britain’s lack of flexible energy supply is a serious issue. This isn’t the first time such a warning has been issued and it probably won’t be the last.

  • If you want to know how to stop school shootings, ask the Secret Service

    While President Donald Trump has not shied away from offering suggestions on how to prevent school shootings – including one controversial idea to arm teachers – what often gets overlooked in the conversation is research on the subject that has already been done. This research includes one major study of school shootings conducted in part by the very agency charged with protecting the president of the United States himself: the U.S. Secret Service. Has this research been ignored or just forgotten?

  • Flood risk for Americans is greatly underestimated

    A new study has found that forty-one million Americans are at risk from flooding rivers, which is more than three times the current estimate—based on regulatory flood maps—of thirteen million people. The study is based on a new high-resolution model that maps flood risk across the entire continental United States, whereas the existing regulatory flood maps produced by the Federal Emergency Management Agency (FEMA) cover about 60 percent of the continental United States. Avoiding future losses is particularly important as average flood losses in the United States have increased steadily to nearly $10 billion annually.

  • Protecting soldiers from blast-induced brain injury

    Researchers have developed a new military vehicle shock-absorbing device that may protect warfighters against traumatic brain injury (TBI) caused by exposure to blasts from land mines. During Operations Iraqi Freedom and Enduring Freedom, more than 250,000 warfighters suffered such injuries. Prior to this study, most research on blast-induced TBI had focused on the effects of rapid changes in barometric pressure, also known as overpressure, on unmounted warfighters.

  • U.S. firefighters and police turn to an Israeli app to save lives

    When Hurricane Irma hit the Florida Keys in September 2017, the new First Response app from Israeli-American company Edgybees helped first responders identify distress calls in flooded areas. When wildfires hit Northern California a month later, the app steered firefighters away from danger. This lifesaving augmented-reality app — designed only months before as an AR racing game for drone enthusiasts — is now used by more than a dozen fire and police departments in the United States, as well as the United Hatzalah emergency response network in Israel.

  • Anti-Semitic incidents surged nearly 60% in 2017: ADL report

    The Anti-Defamation League (ADL) said in a new report today that the number of anti-Semitic incidents was nearly 60 percent higher in 2017 than 2016, the largest single-year increase on record and the second highest number reported since ADL started tracking incident data in the 1970s. The sharp rise was in part due to a significant increase in incidents in schools and on college campuses, which nearly doubled for the second year in a row.

  • Why Trump’s idea to arm teachers may miss the mark

    President Donald Trump’s proposal to arm teachers has sparked substantial public debate. As researchers of consumer culture and lead authors of a recent study of how Americans use and view firearms for self-defense, we argue that while carrying a gun may reduce the risk of being powerless during an attack, it also introduces substantial and overlooked risks to the carrier and others. Despite the widespread news coverage of mass shootings at schools, the reality is that school shootings remain rare. Of the 160 active shooter incidents the FBI identified between 2000 and 2013, 27 – about 17 percent – occurred at elementary, middle, or high schools. Given that rarity, the challenges of effectively using a gun to neutralize a shooter without taking additional lives, and the added day-to-day risks, we argue that Trump’s proposal would not make schools safer overall for teachers or students.

  • Researchers join AI-enabled robots in “collaborative autonomy”

    A team of firefighters clears a building in a blazing inferno, searching rooms for people trapped inside or hotspots that must be extinguished. Except this isn’t your typical crew. Most apparent is the fact that the firefighters are not all human. They are working side by side with artificially intelligent (AI) robots that search the most dangerous rooms and make life-or-death decisions. This scenario is potentially closer than you might think, but while AI-equipped robots might be technologically capable of rendering aid, sensing danger or providing protection for their flesh-and-blood counterparts, they can be valuable to humans only if their operators are not burdened with the task of guiding them.

  • Global AI experts warn of malicious use of AI in the coming decade

    Twenty-six experts on the security implications of emerging technologies have jointly authored an important new report sounding the alarm about the potential malicious use of artificial intelligence (AI) by rogue states, criminals, and terrorists. Forecasting rapid growth in cyber-crime and the misuse of drones during the next decade – as well as an unprecedented rise in the use of “bots” to manipulate everything from elections to the news agenda and social media – the report calls for governments and corporations worldwide to address the clear and present danger inherent in the myriad applications of AI.

  • Deep Fakes: A looming crisis for national security, democracy and privacy?

    Events of the last few years, such as Russia’s broad disinformation campaign to undermine Western democracies, including the American democratic system, have offered a compelling demonstration of truth decay: how false claims — even preposterous ones — can be disseminated with unprecedented effectiveness today thanks to a combination of social media’s ubiquity and virality, cognitive biases, filter bubbles, and group polarization. Robert Chesney and Danielle Citron write in Lawfare that the resulting harms are significant for individuals, businesses, and democracy – but that the problem may soon take a significant turn for the worse thanks to deep fakes. They urge us to get used to hearing that phrase. “It refers to digital manipulation of sound, images, or video to impersonate someone or make it appear that a person did something—and to do so in a manner that is increasingly realistic, to the point that the unaided observer cannot detect the fake. Think of it as a destructive variation of the Turing test: imitation designed to mislead and deceive rather than to emulate and iterate.”

  • Analytical methods to help develop antidotes for cyanide, mustard gas

    Several Food and Drug Administration-approved antidotes are available for cyanide poisoning, but they have severe limitations. To develop effective antidotes for chemical agents, such as cyanide and mustard gas, scientists need analytical methods that track not only the level of exposure but also how the drug counteracts the effects of the chemical.