-
Researchers Release Open-Source Space Debris Model
With the escalating congestion in low Earth orbit, driven by a surge in satellite deployments, the risk of collisions and space debris proliferation is a pressing concern. Conducting thorough space environment studies is critical for developing effective strategies for fostering responsible and sustainable use of space resources. The MIT Orbital Capacity Assessment Tool lets users model the long-term future space environment.
-
-
Identifying Types of Cyberattacks That Manipulate Behavior of AI Systems
AI systems can malfunction when exposed to untrustworthy data, a vulnerability known as "adversarial machine learning," and attackers are exploiting this issue. New guidance documents the types of these attacks, along with mitigation approaches. No foolproof method exists as yet for protecting AI from misdirection, and AI developers and users should be wary of anyone who claims otherwise.
-
-
Electric vs. Gasoline Vehicles: Is EV Ownership Competitive in Your Area?
Is it actually cheaper to own an electric vehicle instead of a gas vehicle? It depends. Researchers say that where you live matters. Cumulative recurring costs for a midsize SUV across platforms—traditional gasoline, hybrid and electric—are higher in some cities when taking key factors into account: financing, annual fees, insurance, maintenance, repairs and fuel costs.
-
-
Revolutionizing Resource Renewal: Scaling Up Sustainable Recycling for Critical Materials
Permanent magnets, which retain magnetic properties even in the absence of an inducing field or current, are used extensively in clean energy and defense applications. Rare earths are challenging to access because they are scattered across Earth’s crust, yet they are key components in many modern technologies. Recycled rare earths can be used to make new permanent magnets, accelerate chemical reactions and improve the properties of metals when included as alloy components.
-
-
Speedier Security Screening in the Palm of the Hand
Though pat downs are currently an essential element of keeping travelers safe at the airport, they slow the screening process for people waiting in line and can be an uncomfortable experience for the passenger being screened. Reducing the need for pat downs may soon be easier.
-
-
Plagues, Cyborgs, and Supersoldiers: The Human Domain of War
How have advancements in biotechnology affected warfighting, and how could they do so in the future? Can the human body itself be a warfighting domain? Can the body itself be an offensive or defensive weapon?
-
-
Seven Moments in December that Changed Nuclear Energy History
December is a big month in the history of nuclear energy. From the first self-sustaining chain reaction to a pivotal breakthrough in nuclear fusion, some of the biggest events that laid the foundation for the nuclear energy sector all happened in the final month of the year.
-
-
Leveraging Artificial Intelligence in Explosives, Narcotic Detection
DHS S&T is applying emerging artificial intelligence and machine learning technologies, and searching for ways to use them to identify dangerous compounds, like those found in explosives and narcotics.
-
-
Enhancing Coastal Cities' Flood Resilience Through Smart City Technologies
In the face of climate change, a suite of advanced technologies can be integrated into urban design to reduce the flood risk posed by rising sea levels, more intense rainfall events, and more powerful storm surges.
-
-
Does AI Enable and Enhance Biorisks?
The diversity of the biorisk landscape highlights the need to clearly identify which scenarios and actors are of concern. It is important to consider AI-enhanced risk within the current biorisk landscape, in which both experts and non-experts can cause biological harm without the need for AI tools, underscoring the need for layered safeguards throughout the biorisk chain.
-
-
Generative AI and Weapons of Mass Destruction: Will AI Lead to Proliferation?
Large Language Models (LLMs) caught popular attention in 2023 through their ability to generate text based on prompts entered by the user. Ian J. Stewart writes that “some have raised concerns about the ability of LLMs to contribute to nuclear, chemical and biological weapons proliferation (CBRN). Put simply, could a person learn enough through an interaction with an LLM to produce a weapon? And if so, would this differ from what the individual could learn by scouring the internet?”
-
-
New Nuclear Deflection Simulations Advance Planetary Defense Against Asteroid Threats
As part of an effort to test different technologies to protect Earth from asteroids, a kinetic impactor was deliberately crashed into an asteroid to alter its trajectory. However, with limitations in the mass that can be lifted to space, scientists continue to explore nuclear deflection as a viable alternative to kinetic impact missions. Nuclear devices have the highest energy density per unit of mass of any human technology, making them an invaluable tool in mitigating asteroid threats.
-
-
Artificial Intelligence Systems Excel at Imitation, but Not Innovation
Artificial intelligence (AI) systems are often depicted as sentient agents poised to overshadow the human mind. But AI lacks the crucial human ability of innovation. While children and adults alike can solve problems by finding novel uses for everyday objects, AI systems often lack the ability to view tools in a new way.
-
-
“Energy Droughts” in Wind and Solar Can Last Nearly a Week, Research Shows
Understanding the risk of compound energy droughts—times when the sun doesn’t shine and the wind doesn’t blow—will help grid planners understand where energy storage is needed most.
-
-
Taking Illinois’ Center for Digital Agriculture into the Future
The Center for Digital Agriculture (CDA) at the University of Illinois Urbana-Champaign has a new executive director, John Reid, who plans to support CDA’s growth across all dimensions of use-inspired research, translation of research into practice, and education and workforce development.
-
More headlines
The long view
Autonomous Vehicle Technology Vulnerable to Road Object Spoofing and Vanishing Attacks
Researchers have demonstrated the potentially hazardous vulnerabilities associated with the technology called LiDAR, or Light Detection and Ranging, which many autonomous vehicles use to navigate streets, roads and highways. The researchers have shown how to use lasers to fool LiDAR into "seeing" objects that are not present and into missing those that are, deficiencies that can cause unwarranted and unsafe braking or collisions.
Tantalizing Method to Study Cyberdeterrence
Tantalus is unlike most war games because it is experimental rather than experiential: the immersive game combines scientific rigor with quantitative assessment methods drawn from the experimental sciences, and this experimental approach to war gaming provides insightful data on real-world cyberattacks.
Prototype Self-Service Screening System Unveiled
TSA and DHS S&T unveiled a prototype checkpoint technology, the self-service screening system, at Harry Reid International Airport (LAS) in Las Vegas, NV. The aim is to provide a near self-sufficient passenger screening process while enabling passengers to directly receive on-person alarm information and to self-resolve those alarms.
Falling Space Debris: How High Is the Risk I'll Get Hit?
An International Space Station battery fell back to Earth and, luckily, splashed down harmlessly in the Atlantic. Should we have worried? Space debris reenters our atmosphere every week.
Testing Cutting-Edge Counter-Drone Technology
Drones have many positive applications, but bad actors can use them for nefarious purposes. Two recent field demonstrations brought government, academia, and industry together to evaluate innovative counter-unmanned aircraft systems.
Strengthening the Grid’s ‘Backbone’ with Hydropower
Argonne-led studies investigate how hydropower could help add more clean energy to the grid, how it generates value as grids add more renewable energy, and how liner technology can improve hydropower efficiency.
The Tech Apocalypse Panic is Driven by AI Boosters, Military Tacticians, and Movies
From popular films like WarGames or The Terminator to a U.S. State Department-commissioned report on the security risk of weaponized AI, there has been a tremendous amount of hand-wringing and nervousness about how so-called artificial intelligence might end up destroying the world. There is one easy way to avoid a lot of this and prevent a self-inflicted doomsday: don't give computers the capability to launch devastating weapons.