• Preparing for an Explosive Attack

    Explosives are a popular choice among terrorists for causing disruption, casualties and destruction. Although chemical, biological, radiological and nuclear (CBRN) weapons may cause much more damage, explosives can still be the first choice because they are relatively easy to make, transport and use. DHS S&T says it wants to make sure that state and local leaders have choices, too, by arming them with technology to plan for worst-case scenarios and mitigate the fallout of terrorist attacks.

  • No Laughing Matter: Laughter Signature as New Biometrics

    The popular view of biometric security often invokes fingerprint readers, iris or retinal scans, and voice-activated systems. Researchers have now demonstrated how the way a person laughs might be used in biometrics. Initial tests of the approach show that a prototype laughter recognition algorithm can be 90 percent accurate.
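
    The summary does not say which features the researchers extract from a laugh, so the following is only a generic sketch of what an audio-based biometric pipeline could look like: each clip is reduced to an average MFCC vector, enrolled users get a template, and a new laugh is matched to the nearest template. The use of librosa and the enroll/identify helpers are illustrative assumptions, not the published algorithm.

    ```python
    # Illustrative sketch only: the article does not describe the researchers'
    # actual features or model. This shows a generic audio-biometric pipeline
    # (MFCC features + nearest-template matching) applied to laughter clips.
    import numpy as np
    import librosa  # assumed available; any MFCC extractor would do

    def laugh_embedding(wav_path, sr=16000, n_mfcc=20):
        """Summarize a laughter clip as its mean MFCC vector (a crude 'signature')."""
        audio, _ = librosa.load(wav_path, sr=sr)
        mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
        return mfcc.mean(axis=1)

    def enroll(clips_by_person):
        """Average each person's clip embeddings into a single template."""
        return {person: np.mean([laugh_embedding(c) for c in clips], axis=0)
                for person, clips in clips_by_person.items()}

    def identify(templates, clip):
        """Return the enrolled identity whose template is closest to the new clip."""
        emb = laugh_embedding(clip)
        return min(templates, key=lambda p: np.linalg.norm(templates[p] - emb))
    ```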

  • How French Technology Can Control Wearing of a Mandatory Mask

    The French government announced that, as of Monday, wearing a face mask in enclosed public places will become mandatory. How can authorities check whether thousands of people are following the government’s instructions? Several French start-ups have developed solutions which are now being tested. Valentin Hamon-Beugin writes in Le Figaro [in French] that some companies have developed tools which rely on CCTV footage. Software installed in the cameras uses artificial intelligence to detect masked faces. “It’s not about facial recognition. We simply recognize the human form behind the mask, but we don’t have access to the identity of the people filmed,” explains Virginie Ducable, project manager at RedLab, a Normandy-based start-up. No images are stored on servers; only statistical data is sent to the client. “These statistics can serve them in a concrete way. For example, if they find that too few people are wearing a mask at any given time, they will be able to automatically launch voice announcements urging people to follow health guidelines,” she adds. Olivier Gualdoni, CEO of Drone Volt, whose subsidiary Aérialtronics is working on a similar project, says: “Our solution aims to prevent, not to punish. We are the complete opposite of the repressive stereotypes associated with artificial intelligence.”
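
    As a rough illustration of the statistics-only design Ducable describes, the sketch below counts masked and unmasked faces per frame, discards the pixels, and triggers an announcement when compliance drops below a threshold. The OpenCV face detector and the pluggable is_masked classifier are stand-ins, not RedLab's actual software.

    ```python
    # Sketch of a statistics-only pipeline: frames are analyzed in memory,
    # no images are stored, and only aggregate counts leave the device.
    import cv2  # OpenCV, assumed available

    # Bundled Haar cascade is only a stand-in; a real deployment would use a
    # detector trained to find masked faces reliably.
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def compliance_for_frame(frame, is_masked):
        """Return (masked, total) face counts for one frame; the pixels are discarded."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        masked = sum(1 for (x, y, w, h) in faces if is_masked(frame[y:y+h, x:x+w]))
        return masked, len(faces)

    def monitor(frames, is_masked, threshold=0.8):
        """Aggregate counts only; flag when compliance is too low."""
        masked = total = 0
        for frame in frames:
            m, t = compliance_for_frame(frame, is_masked)
            masked, total = masked + m, total + t
        rate = masked / total if total else 1.0
        if rate < threshold:
            print(f"Mask compliance {rate:.0%}: play voice announcement")
        return rate
    ```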

  • With Coronavirus Antibodies Fading Fast, Vaccine Hopes Fade, Too

    Disturbing new revelations that permanent immunity to the coronavirus may not be possible have jeopardized vaccine development and reinforced a decision by scientists at UCSF and affiliated laboratories to focus exclusively on treatments. Peter Fimrite writes in the San Francisco Chronicle that several recent studies conducted around the world indicate that the human body does not retain the antibodies that build up during infections, meaning there may be no lasting immunity to COVID-19 after people recover. Strong antibodies are also crucial in the development of vaccines. So molecular biologists fear the only way left to control the disease may be to treat the symptoms after people are infected to prevent the most debilitating effects, including inflammation, blood clots and death.


  • “Threshold Cryptography” Bolsters Protection of Sensitive Data

    A new publication by NIST cryptography experts proposes the direction the technical agency will take to develop a more secure approach to encryption. This approach, called threshold cryptography, could overcome some of the limitations of conventional methods for protecting sensitive transactions and data.
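
    As a minimal illustration of the threshold idea (a secret is split so that any k of n parties can reconstruct it, while fewer than k learn nothing), the sketch below implements Shamir secret sharing over a prime field. This construction is an illustrative choice, not necessarily the scheme NIST's publication proposes.

    ```python
    # Minimal illustration of the threshold idea: split a secret so that any
    # k of n shares reconstruct it, but k-1 shares reveal nothing. Uses Shamir
    # secret sharing over a prime field; illustrative, not NIST's spec.
    import secrets

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

    def split(secret: int, n: int, k: int):
        """Create n shares of `secret`, any k of which reconstruct it."""
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
        def f(x):  # evaluate the random degree-(k-1) polynomial at x
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange-interpolate the polynomial at x=0 to recover the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    shares = split(secret=123456789, n=5, k=3)
    assert reconstruct(shares[:3]) == 123456789  # any 3 of the 5 shares suffice
    ```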

  • Showcasing Cybersecurity Technologies

    Twelve innovative cybersecurity technologies available for commercial licensing from four U.S. Department of Energy national laboratories will be showcased to the public during a series of free webinars starting this month.

  • Flood Bot: New Flood Warning Sensors

    Ellicott City, Maryland, suffered devastating floods in 2016 and 2018. The disasters left residents and officials wondering how technology could help predict future severe weather, and save lives and property. Scientists offer an answer: The Flood Bot network.

  • New Nontoxic Ammunition

    Every time a gun fires, lead is released into the air. A scientific advancement could provide a comparable replacement for lead-based explosive materials found in ammunition, protecting soldiers and the environment from potential toxic effects.

  • Using Frequency Analysis to Recognize Fake Images

    They look deceptively real, but they are made by computers: so-called deep-fake images are generated by machine learning algorithms, and humans are pretty much unable to distinguish them from real photos. A new method makes it possible to expose fake images created by computer algorithms rather than by humans.
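
    As a hedged sketch of what frequency analysis can look like here, the code below reduces a grayscale image to its azimuthally averaged power spectrum; images produced by up-sampling networks tend to show characteristic deviations at higher frequencies, which a simple classifier trained on such profiles could pick up. The feature and binning choices are illustrative, not the published pipeline.

    ```python
    # Hedged sketch of the frequency-analysis idea: reduce each image to its
    # azimuthally averaged power spectrum, a feature that a classifier trained
    # on labeled real and generated images could separate. Illustrative only.
    import numpy as np

    def radial_spectrum(gray_image: np.ndarray, n_bins: int = 64) -> np.ndarray:
        """Azimuthally averaged power spectrum: energy as a function of frequency."""
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray_image))) ** 2
        h, w = spectrum.shape
        yy, xx = np.ogrid[:h, :w]
        radius = np.hypot(yy - h / 2, xx - w / 2)
        bins = np.minimum((radius / radius.max() * n_bins).astype(int), n_bins - 1)
        energy = np.bincount(bins.ravel(), weights=spectrum.ravel(), minlength=n_bins)
        counts = np.bincount(bins.ravel(), minlength=n_bins)
        return energy / np.maximum(counts, 1)

    # The profiles of known real and generated images could then be fed to any
    # off-the-shelf classifier to learn the tell-tale spectral deviations.
    ```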

  • Securing the Smart Home

    So…you’ve built your smart home, it’s got smart heating and lighting, all the latest smart communications and entertainment systems, and of course, smart power generation to make it smart and green. But, how do you keep it secure and stop forced digital or physical entry? Well, you need smart security too, of course.

  • Quiet and Green: Why Hydrogen Planes Could Be the Future of Aviation

    Today, aviation is responsible for 3.6 percent of EU greenhouse gas emissions. Modern planes use kerosene as fuel, releasing harmful carbon dioxide into the atmosphere. But what if there was another way? One possible solution is to use a new type of fuel in planes that doesn’t produce harmful emissions – hydrogen. Long touted as a sustainable fuel, hydrogen is now gaining serious traction as a possibility for aviation, and already tests are under way to prove its effectiveness.

  • Accurately Pinpointing Malicious Drone Operators

    Researchers have determined how to pinpoint the location of a drone operator who may be operating maliciously or harmfully near airports or protected airspace by analyzing the flight path of the drone.

  • Privacy Risks of Home Security Cameras

    Researchers have used data from a major home Internet Protocol (IP) security camera provider to evaluate potential privacy risks for users. The researchers found that the traffic generated by the cameras could be monitored by attackers and used to predict when a house is occupied or not.
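
    A minimal sketch of the risk described: an observer who can only count the camera's encrypted upstream bytes per minute can still spot motion-triggered upload bursts and build an hourly activity profile of the home. The sample format, burst threshold, and helper names below are illustrative assumptions.

    ```python
    # Sketch of the passive traffic-analysis risk: without decrypting anything,
    # counting a camera's upstream bytes per minute reveals motion-triggered
    # upload bursts, from which an occupancy profile can be built.
    from collections import Counter
    from datetime import datetime

    def activity_minutes(samples, burst_factor=3.0):
        """samples: list of (timestamp, upstream_bytes) per minute.
        Returns timestamps whose traffic is well above the idle baseline."""
        volumes = [b for _, b in samples]
        baseline = sorted(volumes)[len(volumes) // 2]  # median ~ idle keep-alive rate
        return [t for t, b in samples if b > burst_factor * max(baseline, 1)]

    def hourly_profile(samples):
        """Count bursts per hour of day; consistently quiet hours suggest an empty house."""
        hours = Counter(t.hour for t in activity_minutes(samples))
        return {h: hours.get(h, 0) for h in range(24)}

    # Example: one upload burst at 08:00 stands out against the idle baseline.
    data = [(datetime(2020, 7, 20, 8, m), 5_000) for m in range(60)]
    data[0] = (datetime(2020, 7, 20, 8, 0), 90_000)
    print(hourly_profile(data)[8])  # -> 1
    ```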

  • Improving Ethical Models for Autonomous Vehicles

    There’s a fairly large flaw in the way that programmers are currently addressing ethical concerns related to artificial intelligence (AI) and autonomous vehicles (AVs). Namely, existing approaches don’t account for the fact that people might try to use the AVs to do something bad.

  • Reverse Engineering of 3D-Printed Parts by Machine Learning Reveals Security Vulnerabilities

    Over the past thirty years, the use of glass- and carbon-fiber reinforced composites in aerospace and other high-performance applications has soared along with the broad industrial adoption of composite materials. Machine learning can make reverse engineering of complex composite material parts easy.