• Critical Vulnerabilities Found within Major LLMs

    Large Language Models (LLMs) such as ChatGPT and Bard have taken the world by storm this year, with companies investing millions to develop these AI tools, and some leading AI chatbots being valued in the billions. Computer scientists have demonstrated that chunks of these LLMs can be copied in less than a week for as little as $50, and the information gained can be used to launch targeted attacks.

  • Responsible AI Initiative Seeks to Solve Societal Problems

    By Amy Choate-Nielsen

    With a $100 million investment, a new research initiative aims to use advanced artificial intelligence (AI) responsibly to tackle societal issues.

  • Fueling the Future of Fusion Energy

    Long considered the ultimate source of clean energy, nuclear fusion promises abundant electrical power without greenhouse gas emissions or long-lasting radioactive waste. The process has fueled the core of the sun for more than four billion years, with billions more to go. More scientists are joining the global pursuit of harnessing that reaction.

  • Modular Dam Design Could Accelerate the Adoption of Renewable Energy

    Researchers have developed a new modular steel buttress dam system designed to resolve energy storage issues hindering the integration of renewable resources into the energy mix. The m-Presa system cuts dam construction costs by one-third and reduces construction schedules by half.

  • Not the Usual Suspects: New Interactive Lineup Boosts Eyewitness Accuracy

    Allowing eyewitnesses to dynamically explore digital faces using a new interactive procedure can significantly improve identification accuracy compared to the video lineup and photo array procedures used by police worldwide.

  • Securing the Food Pipeline from Cyberattacks

    By Jesenia Hernandez

    Sensors detecting how much food herds of cattle are eating. Machines taking thousands of photos of fruit per second to spot defects and sort them by quality. Robots packing fruit and vegetables into bags and boxes for purchase at grocery stores. Across this increasingly automated landscape, researchers are working to protect the food and agriculture sector from cyberattacks.

  • Scorpius Images to Test Nuclear Stockpile Simulations

    One thousand feet below the ground, three national defense labs and a remote test site are building Scorpius — a machine as long as a football field — to create images of plutonium as it is compressed with high explosives, creating conditions that exist just prior to a nuclear explosion. The Sandia injector is key to validating plutonium pit performance.

  • Chi-Nu Experiment Concludes with Data to Support Nuclear Security, Energy Reactors

    The Chi-Nu project, a years-long experiment measuring the energy spectrum of neutrons emitted from neutron-induced fission, recently concluded the most detailed and extensive uncertainty analysis of the three major actinide elements — uranium-238, uranium-235 and plutonium-239.

  • AI-Driven Earthquake Forecasting Shows Promise in Trials

    A new attempt to predict earthquakes with the aid of artificial intelligence has raised hopes that the technology could one day be used to limit earthquakes’ impact on lives and economies. During a seven-month trial in China, researchers used an AI algorithm to correctly predict 70% of earthquakes a week before they happened.

  • Computer Scientists Awarded $3M to Bolster Cybersecurity

    By Louis DiPietro

    A $3 million grant from DARPA, the research and development arm of the U.S. Department of Defense, aims to leverage reinforcement learning to make computer networks stronger, more dynamic, and more secure.

  • Using Petroleum Reservoirs to Store Carbon

    Oil and gas produced from reservoirs are traditionally thought of as sources of carbon dioxide and other greenhouse gases. In recent years, scientists in government and industry have been looking more at oil and gas reservoirs as places to store the very carbon that was previously taken out of the reservoirs. Injecting carbon dioxide into oil reservoirs also increases oil production in areas that have already produced a lot of oil. 

  • NSF Backs Processor Design, Chip Security Research

    Rice University computer scientists have won two grants from the National Science Foundation to explore new information-processing technologies and applications that seamlessly combine co-designed hardware and software, allowing for more effective and efficient data stream analysis using pattern matching.

  • AI Risks to the Financial Sector

    In a world where AI algorithms can already analyze real-time financial information and make high-stakes trading decisions with little or no human oversight, our financial regulations are failing to keep up. A professor of computer science and engineering identifies new concerns that recent AI advances pose for financial markets.

  • AI Disinformation Is a Threat to Elections − Learning to Spot Russian, Chinese and Iranian Meddling in Other Countries Can Help the U.S. Prepare for 2024

    By Bruce Schneier

    Elections around the world are facing an evolving threat from foreign actors, one that involves artificial intelligence. Countries trying to influence each other’s elections entered a new era in 2016, when the Russians launched a series of social media disinformation campaigns targeting the U.S. presidential election. But there is a new element: generative AI and large language models. These can quickly and easily produce endless reams of text on any topic, in any tone, from any perspective, making them tools uniquely suited to Internet-era propaganda. The sooner we know what to expect, the better we can deal with what comes.

  • U.S.-China “Tech War”: AI Sparks First Battle in Middle East

    By Cathrin Schaer

    The U.S. has restricted exports of some computer chips to the Middle East, to stop AI-enabling chips from getting to China. But there’s no information on which countries are affected, or how chips would get to China. What is becoming clear is that AI could well become a new source of friction between democratic and autocratic states.