  • Flood Bot: New Flood Warning Sensors

    Ellicott City, Maryland, suffered devastating floods in 2016 and 2018. The disasters left residents and officials wondering how technology could help predict future severe weather and save lives and property. Scientists offer an answer: the Flood Bot network.

  • New Nontoxic Ammunition

    Every time a gun fires, lead is released into the air. A scientific advancement could provide a comparable replacement for the lead-based explosive materials found in ammunition, protecting soldiers and the environment from potential toxic effects.

  • Using Frequency Analysis to Recognize Fake Images

    They look deceptively real, but they are made by computers: so-called deepfake images are generated by machine-learning algorithms, and humans are largely unable to distinguish them from real photos. A new method based on frequency analysis makes it possible to expose these computer-generated fakes.
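
    The researchers' key observation is that machine-generated images leave characteristic artifacts in the frequency domain that natural photos lack. A minimal sketch of that style of check, using a 2D discrete cosine transform, follows; the energy-ratio feature and threshold are illustrative assumptions, not the published method:

      # Sketch: flag unusual high-frequency energy in an image's spectrum,
      # the kind of artifact machine-generated images often exhibit.
      # The 0.05 threshold is an illustrative assumption, not a published value.
      import numpy as np
      from scipy.fft import dctn
      from PIL import Image

      def high_freq_energy_ratio(path: str) -> float:
          """Fraction of spectral energy in the highest-frequency quadrant."""
          img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
          spectrum = np.abs(dctn(img, norm="ortho"))
          h, w = spectrum.shape
          high = spectrum[h // 2:, w // 2:].sum()  # bottom-right = highest frequencies
          return high / spectrum.sum()

      ratio = high_freq_energy_ratio("suspect.jpg")  # hypothetical input file
      print(f"high-frequency energy ratio: {ratio:.4f}")
      if ratio > 0.05:
          print("spectral artifacts detected -- possible machine-generated image")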

  • Securing the Smart Home

    So…you’ve built your smart home: it’s got smart heating and lighting, all the latest smart communications and entertainment systems, and of course, smart power generation to make it smart and green. But how do you keep it secure and stop forced digital or physical entry? Well, you need smart security too, of course.

  • Quiet and Green: Why Hydrogen Planes Could Be the Future of Aviation

    Today, aviation is responsible for 3.6 percent of EU greenhouse gas emissions. Modern planes use kerosene as fuel, releasing harmful carbon dioxide into the atmosphere. But what if there was another way? One possible solution is to use a new type of fuel in planes that doesn’t produce harmful emissions – hydrogen. Long touted as a sustainable fuel, hydrogen is now gaining serious traction as a possibility for aviation, and already tests are under way to prove its effectiveness.

  • Accurately Pinpointing Malicious Drone Operators

    Researchers have determined how to pinpoint the location of a drone operator who may be operating maliciously or harmfully near airports or protected airspace by analyzing the flight path of the drone.
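
    The summary does not spell out the machinery, but the general shape of such an approach is supervised learning: summarize each flight path as a feature vector and fit a regressor on paths with known operator positions. A toy sketch under that assumption; the features, model, and synthetic data are illustrative, not the researchers' pipeline:

      # Toy sketch: learn a mapping from flight-path features to operator
      # position. All data here is synthetic and purely illustrative.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def path_features(path: np.ndarray) -> np.ndarray:
          """Summarize a (T, 2) array of drone x/y positions as a fixed vector."""
          deltas = np.diff(path, axis=0)
          speeds = np.linalg.norm(deltas, axis=1)
          headings = np.arctan2(deltas[:, 1], deltas[:, 0])
          turn_rates = np.diff(headings)  # ignores angle wraparound; fine for a toy
          return np.array([
              *path.mean(axis=0), *path.std(axis=0),
              speeds.mean(), speeds.std(), np.abs(turn_rates).mean(),
          ])

      # Hypothetical training set: random walks paired with operator positions.
      rng = np.random.default_rng(0)
      paths = [rng.normal(size=(50, 2)).cumsum(axis=0) for _ in range(200)]
      operators = rng.uniform(-10, 10, size=(200, 2))

      X = np.stack([path_features(p) for p in paths])
      model = RandomForestRegressor(n_estimators=100).fit(X, operators)
      print("estimated operator position:", model.predict(X[:1])[0])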

  • Privacy Risks of Home Security Cameras

    Researchers have used data from a major home Internet Protocol (IP) security camera provider to evaluate potential privacy risks for users. They found that the traffic generated by the cameras could be monitored by attackers and used to predict whether a house is occupied.
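
    The underlying risk is classic traffic analysis: even when the video stream is encrypted, a motion-activated camera uploads more data when something moves in view, so an on-path observer can infer activity from packet sizes and timing alone. A minimal sketch of that inference; the per-minute threshold and capture data are illustrative assumptions:

      # Sketch: bucket observed upload traffic per minute and flag minutes
      # whose volume suggests motion-triggered recording. The 200 kB threshold
      # is an illustrative assumption.
      from datetime import datetime, timedelta

      def bytes_per_minute(packets):
          """packets: iterable of (timestamp, size_bytes) seen on the wire."""
          buckets = {}
          for ts, size in packets:
              minute = ts.replace(second=0, microsecond=0)
              buckets[minute] = buckets.get(minute, 0) + size
          return buckets

      def likely_activity(buckets, threshold=200_000):
          """Minutes whose upload volume suggests the camera saw motion."""
          return sorted(m for m, b in buckets.items() if b > threshold)

      # Hypothetical capture: idle keepalives, then a burst around 08:10.
      start = datetime(2020, 7, 1, 8, 0)
      packets = [(start + timedelta(seconds=10 * i), 1_400) for i in range(60)] \
              + [(start + timedelta(minutes=10, seconds=i), 50_000) for i in range(30)]
      print(likely_activity(bytes_per_minute(packets)))  # -> [... 08:10]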

  • Improving Ethical Models for Autonomous Vehicles

    There’s a fairly large flaw in the way that programmers are currently addressing ethical concerns related to artificial intelligence (AI) and autonomous vehicles (AVs). Namely, existing approaches don’t account for the fact that people might try to use the AVs to do something bad.

  • Reverse Engineering of 3D-Printed Parts by Machine Learning Reveals Security Vulnerabilities

    Over the past thirty years, the use of glass- and carbon-fiber-reinforced composites in aerospace and other high-performance applications has soared alongside broad industrial adoption of composite materials. Machine learning can make reverse engineering of these complex composite parts easy.

  • AI Could Help Solve the Privacy Problems It Has Created

    The stunning successes of artificial intelligence would not have happened without the availability of massive amounts of data, whether from smart speakers in the home or personalized book recommendations. These large databases amass a wide variety of information, some of it sensitive and personally identifiable. All that data in one place makes such databases tempting targets, ratcheting up the risk of privacy breaches. We believe the relationship between AI and data privacy is more nuanced than it first appears. The spread of AI raises a number of privacy concerns, most of which people may not even be aware of. But in a twist, AI can also help mitigate many of these privacy problems.
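
    One concrete example of a technique in this space is differential privacy, which adds calibrated noise to aggregate statistics so a dataset stays useful while limiting what any single record reveals. A minimal sketch using the Laplace mechanism; the query, bounds, and epsilon are illustrative choices, not from the article:

      # Sketch: differentially private mean via the Laplace mechanism.
      # Bounds and epsilon below are illustrative, not from the article.
      import numpy as np

      def dp_mean(values, lower, upper, epsilon):
          """Mean of bounded values plus noise calibrated to its sensitivity."""
          clipped = np.clip(values, lower, upper)
          sensitivity = (upper - lower) / len(clipped)  # max effect of one record
          return clipped.mean() + np.random.laplace(scale=sensitivity / epsilon)

      ages = np.array([23, 35, 41, 29, 52, 47, 31, 38])
      print("private mean age:", dp_mean(ages, lower=0, upper=100, epsilon=1.0))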

  • How Much Control Would People Be Willing to Grant to a Personal Privacy Assistant?

    CyLab’s Jessica Colnago believes that in the future, the simple act of walking down the street is going to be a little weird. “You know how every time you enter a website, and it says: ‘We use cookies. Do you consent?’ Imagine that same thing walking down the street, but for a light pole, or a surveillance camera, or an energy sensor on a house,” Colnago says.

  • Sound Beacons Support Safer Tunnel Evacuation

    Research conducted as part of the EvacSound project demonstrates that auditory guidance using sound beacons is an effective aid during the evacuation of smoke-filled road tunnels. This is good news, since vehicle drivers and passengers cannot normally expect to be rescued by the emergency services during such incidents.

  • Searching the Universe for Signs of Technological Civilizations

    Scientists are collaborating on a project to search the universe for signs of life via technosignatures. Researchers believe that although life appears in many forms, the scientific principles remain the same, and that the technosignatures identifiable on Earth will also be identifiable in some fashion outside of the solar system.

  • COVID-19 Sparks Technology Innovation

    Researchers say the swift development of wearable sensors tailored to a pandemic reinforces how a major crisis can accelerate innovation, Kane Farabaugh writes in VOA News. “I think it’s really opened people’s eyes to what’s possible, in terms of modern technology in that context,” said John Rogers of Northwestern University Technological Institute.

  • The Dangers of Tech-Driven Solutions to COVID-19

    Although few sensible people have anything good to say about the federal government’s response, reactions to the pandemic-management tools designed by tech firms have been more mixed; many have concluded that such tools can minimize the privacy and human rights risks posed by tight coordination between governments and tech firms. Julie E. Cohen, Woodrow Hartzog, and Laura Moy write for Brookings that contact tracing done wrong threatens privacy and invites mission creep into adjacent fields, including policing. Government actors might (and do) distort and corrupt public-health messaging to serve their own interests. Automated policing and content control raise the prospect of a slide into authoritarianism.