Nuclear Engineer Uses Machine Learning on Weapons Testing Images to Understand Fallout
Cody Lloyd became a nuclear engineer because of his interest in the Manhattan Project, the United States’ mission to advance nuclear science to end World War II. As a research associate in nuclear forensics at the Department of Energy’s Oak Ridge National Laboratory, Lloyd now teaches computers to interpret data from imagery of nuclear weapons tests from the 1950s and early 1960s, bringing his childhood fascination into his career.
After WWII, the U.S. wanted to better understand what happened after a nuclear weapon was detonated. Researchers conducted tests in the southwestern U.S. and the Pacific Ocean and recorded those experiments on film. Scientists used the original reel-to-reel films to manually measure data from the blasts. The films were kept over the years at Los Alamos National Laboratory until a recent project, under the direction of Greg Spriggs at Lawrence Livermore National Laboratory in collaboration with LANL, turned the films into high-resolution digital images.
As a nuclear forensic scientist, Lloyd combines modern computational techniques with the historical records of nuclear tests to gain insights into the physics of these types of events, which are otherwise hard to study experimentally. He uses machine learning algorithms to automatically extract data from the blast imagery. Once trained, the algorithms can take a few frames of a video as input and generate the information he needs.
“The computer can detect motion from frame to frame in the film to show the physics of how the cloud grows and moves through the air moments after detonation,” Lloyd said. “Scientists can use this information to update current cloud rise and atmospheric transport and dispersion models for how a contaminant may behave if it were released near a community.”
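The article doesn't name the specific technique, but frame-to-frame motion of the kind Lloyd describes is commonly estimated with dense optical flow. Below is a minimal Python sketch using OpenCV's Farneback method; the file name and the meters-per-pixel calibration are hypothetical stand-ins, not values from the ORNL project.

```python
# Sketch: estimate frame-to-frame motion in digitized test footage with
# dense optical flow. File name and pixel scale are hypothetical.
import cv2
import numpy as np

METERS_PER_PIXEL = 2.0  # assumed calibration from camera geometry

cap = cv2.VideoCapture("digitized_test_film.mp4")  # hypothetical file
ok, prev = cap.read()
if not ok:
    raise IOError("could not read first frame")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Farneback dense optical flow: a per-pixel (dx, dy) displacement field
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    speed = np.linalg.norm(flow, axis=2) * METERS_PER_PIXEL
    print(f"mean apparent cloud motion: {speed.mean():.1f} m/frame")
    prev_gray = gray

cap.release()
```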
Researchers across the DOE labs train machine learning models with measurements taken manually from the films. Comparing the models' output with those decades-old measurements gives researchers more insight into how the algorithms interpret the fluid dynamics of blast plumes. Instead of having a researcher go through each film frame by frame, the algorithms can scan a set of images and rapidly find the information of interest. For example, as the cloud rises, the algorithms can quickly measure its distance and height as the shape morphs over time.
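As one illustration of that kind of automated measurement, the sketch below thresholds a frame, takes the largest bright region as the cloud, and returns its width and height. The threshold and pixel scale are assumptions for illustration, not details from the project.

```python
# Sketch: measure the cloud's extent in one grayscale (uint8) frame.
# Threshold and meters-per-pixel scale are illustrative assumptions.
import cv2

def cloud_extent(gray_frame, meters_per_pixel=2.0, thresh=200):
    """Return (width_m, height_m) of the largest bright blob, or None."""
    _, mask = cv2.threshold(gray_frame, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    cloud = max(contours, key=cv2.contourArea)  # largest bright region
    x, y, w, h = cv2.boundingRect(cloud)
    return w * meters_per_pixel, h * meters_per_pixel
```

Run over every frame, a function like this yields a time series of the cloud's size as it rises.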
Teaching machine learning algorithms requires training data, which shows the computer what type of information it needs to identify to answer a specific question. Typically, researchers expose the computer to a curated set of images before showing it an independent data set to test the accuracy of its predictions. The historical atmospheric test films are the only source of such data, so they must serve for both training and testing. Lloyd selected training frames that gave the computer enough information but were kept separate from the images the computer needed to answer the research question.
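A split like the one Lloyd describes has to keep the test frames genuinely independent of the training frames. One common way to do that, sketched below with scikit-learn, is to hold out whole films rather than random frames; the films, frame counts, and features here are synthetic stand-ins.

```python
# Sketch: split a limited film archive so that test footage is fully
# separate from training footage. All data here is synthetic.
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(0)
frame_features = rng.random((600, 16))   # stand-in per-frame features
film_ids = np.repeat(np.arange(12), 50)  # which film each frame came from

# Hold out whole films, not random frames, so test frames never come
# from footage the model has already seen.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
train_idx, test_idx = next(splitter.split(frame_features, groups=film_ids))
print(len(train_idx), "training frames,", len(test_idx), "test frames")
```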
One challenge of this project is the quality of the films. Though the digitized files are high resolution, footage shot seven decades ago lacks the fidelity of modern imagery. Unlike the digital imagery we see today, imagery from the atmospheric testing era was captured on physical film and is subject to additional physical imperfections and effects. Most of the films are black and white with significant contrast between the background and the blast. Photographers often put filters on the cameras, so the background appears much darker than in reality. Sometimes a fireball appears as a bright ring with a dark center, like a solarization effect, even though the real fireball is bright throughout. Some frames are overexposed by bright light seeping through cracks in the camera.
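An artifact like the solarized fireball can be corrected before measurement. One simple approach, sketched below, is to threshold the bright ring and fill its spurious dark center with a morphological hole fill; the threshold value is an assumption for illustration.

```python
# Sketch: repair a solarized fireball (bright ring, dark center) by
# filling holes in the thresholded mask. Threshold is illustrative.
from scipy import ndimage

def fireball_mask(gray_frame, thresh=200):
    """Binary fireball mask with the solarized dark center filled in."""
    ring = gray_frame > thresh               # only the bright ring survives
    return ndimage.binary_fill_holes(ring)   # reclaim the dark interior
```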
Lloyd added other historical data to the algorithms to get more accurate results, such as the locations of the cameras relative to the blast. He also drew on meteorological records to learn more about the atmospheric conditions affecting how each cloud grew and moved. With this additional context, the algorithms produce better results.
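The article doesn't describe how this historical context enters the models, but one straightforward approach is to concatenate it with the image-derived measurements into a single feature vector, as in the sketch below; every feature name here is a hypothetical illustration.

```python
# Sketch: combine image-derived measurements with historical metadata.
# All feature names are hypothetical illustrations.
import numpy as np

def build_features(cloud_width_m, cloud_height_m,
                   camera_range_m, camera_elevation_deg,
                   wind_speed_mps, air_temp_c):
    """One model input mixing film measurements and archival records."""
    return np.array([cloud_width_m, cloud_height_m,
                     camera_range_m, camera_elevation_deg,
                     wind_speed_mps, air_temp_c], dtype=float)
```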
“One thing that’s been surprising,” said Lloyd, “is how successfully off-the-shelf MATLAB and Python machine learning models have performed.” As he looks to future possibilities for analyzing this footage, he is interested in building his own algorithms to enhance feature extraction.
The future is something Lloyd is excited about. He said the possibilities for the films are endless, since open-air testing of nuclear weapons ended decades ago and is unlikely to happen again. “Even though this data is old, it’s still highly valuable to understand fallout of material that’s released into the air — where it goes and what it looks like when it falls to the ground.”