Future of nuclear energy unclear

Many in the field realize that for nuclear energy production to have a future, the entire industry needs an overhaul, from how regulatory structures and energy markets are constructed to how nuclear reactors are designed, financed and built. The need for industry-wide modernization is clear even in highly partisan Washington, D.C., where lawmakers from both sides of the aisle largely agree that the nuclear sector, one of the most heavily regulated industries in the world, needs to be more accommodating to new ventures.

Training a new nuclear workforce will need an overhaul as well. That is why, with a sense of urgency and favorable political tailwinds, Slaybaugh launched a nuclear innovation bootcamp. Held in August, the two-week program hosted 25 university students from around the world and encouraged them to envision what “new nuclear” would look like. To develop the curriculum, Slaybaugh collaborated with Third Way, a D.C.-based centrist think tank working on nuclear energy issues, and with the Nuclear Innovation Alliance, an industry consortium.

“One of the reasons it makes sense to have this bootcamp at Berkeley,” says Todd Allen, a nuclear energy expert and senior visiting fellow at Third Way, “is because there is a culture of innovation. One of the Department of Energy’s first incubators, Cyclotron Road, is located at the Berkeley Lab. The Bay Area has all of the pieces that could support something like this.”

The atomic age
The golden age of nuclear began immediately following the Second World War, when the federal government started pouring research and development money into commercial nuclear reactor designs.

In 1951, in a concrete building nestled in the sagebrush scrub plains of eastern Idaho, scientists working at the National Reactor Testing Station (now part of the Idaho National Laboratory) flipped the switch on the first reactor designed to convert heat derived from splitting uranium atoms into electricity. During its first flickers of life, the reactor lit up four 200-watt lightbulbs, kicking off a decade of pioneering research and engineering — followed by four decades of controversy and catastrophic technological failures.

By the late 1950s, the first large-scale commercial nuclear reactors came online across the country. In 1960, the Atomic Energy Commission estimated that the nation would be powered by thousands of nuclear reactors by the year 2000.

“Back in the day, the philosophy was that commercial deployment had to be done as quickly as possible,” says Per Peterson, nuclear engineering professor and the college’s executive associate dean. “We became competent in building and operating water-cooled reactors for submarines. And then we got locked into that one kind of technology.”

Despite early developments using other reactor designs and fuel configurations, the industry settled on that single design — water-cooled reactors, also known as light-water reactors — as a universal standard. The time and money involved in the nuclear regulatory permitting process made deviating from the accepted design prohibitively expensive.

Light-water reactors produce electricity by creating steam to spin a turbine. The solid fuel, usually uranium arranged in rods that need replacing roughly every four years, is cooled by pressurized water. An accident at a light-water reactor can release radioactive materials as fine particles; carried by high-pressure steam, those particles can escape the reactor building, as they did in the high-profile accidents at Chernobyl and Fukushima.

“The consequence space for severe accidents is pretty substantial with this type of reactor,” Peterson says. “Therefore, it took a lot of effort to develop extremely reliable active systems to provide cooling, low leakage and high-pressure containment structures, which make these reactors more expensive. So they were built bigger and bigger to achieve economies of scale.”

“In the end, that didn’t seem to work too well,” he says.

In 1979, a reactor at Three Mile Island in Pennsylvania suffered a partial meltdown caused by a valve failure and operator error, resulting in the evacuation of 140,000 people. Following the accident, anti-nuclear sentiment became a cornerstone of the country’s budding environmental movement, raising questions about the safety of nuclear facilities and about what to do with the growing pile of spent nuclear fuel rods.

Over the next 30 years, the vision from nuclear’s early days — of thousands of reactors pumping out emissions-free energy — was tempered by economics and politics.

A design problem
Despite the grim outlook for growth, Slaybaugh became curious about a career in nuclear engineering as an undergraduate at Penn State in the early 2000s. Initially interested in physics, she happened to land a work-study assignment at the university’s research reactor.

In graduate school at the University of Wisconsin, she began studying the Boltzmann Transport Equation — “a single equation that describes where all of the neutrons are in a nuclear system,” Slaybaugh explains. “Anything in a nuclear system starts with where all of the neutrons are, so it lets you figure out everything else.”
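
In its most common steady-state form, that single equation can be written as an eigenvalue problem; the notation below is the textbook convention, not anything specific to Slaybaugh’s codes:

$$
\hat{\Omega}\cdot\nabla\psi(\vec{r},E,\hat{\Omega}) + \Sigma_t(\vec{r},E)\,\psi(\vec{r},E,\hat{\Omega})
= \int_0^{\infty}\!\!\int_{4\pi}\Sigma_s(\vec{r},E'\!\rightarrow\!E,\hat{\Omega}'\!\cdot\!\hat{\Omega})\,\psi(\vec{r},E',\hat{\Omega}')\,d\hat{\Omega}'\,dE'
+ \frac{\chi(E)}{4\pi k}\int_0^{\infty}\nu\Sigma_f(\vec{r},E')\,\phi(\vec{r},E')\,dE'
$$

Here $\psi$ is the angular neutron flux (“where all of the neutrons are”), $\phi$ is its integral over direction, the $\Sigma$ terms are the total, scattering and fission cross sections, $\chi$ is the fission energy spectrum, $\nu$ is the average number of neutrons released per fission, and the eigenvalue $k$ indicates whether the chain reaction is dying out, self-sustaining or growing.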

Working with the equation is challenging, so Slaybaugh developed expertise in creating algorithms and software that solve it faster and more efficiently, tools that can ultimately be applied to designing and modeling new nuclear technologies.
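
To give a flavor of what that kind of software does, here is a minimal sketch, not Slaybaugh’s code, of the classic “power iteration” used to find the multiplication factor k. It swaps the full transport equation for a crude one-group diffusion approximation of a slab reactor, and every number in it is invented for illustration.

```python
# Toy power iteration for a one-group, 1-D diffusion model of a bare slab reactor.
# Illustrative only: the geometry, cross sections and boundary treatment are made up.
import numpy as np

def slab_k_eigenvalue(width_cm=100.0, n_cells=200,
                      D=1.0, sigma_a=0.07, nu_sigma_f=0.075,
                      tol=1e-8, max_iters=500):
    h = width_cm / n_cells

    # Loss operator: -D * d^2(phi)/dx^2 + sigma_a * phi, finite-differenced
    # with zero-flux boundary conditions on both sides of the slab.
    main = np.full(n_cells, 2.0 * D / h**2 + sigma_a)
    off = np.full(n_cells - 1, -D / h**2)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

    phi = np.ones(n_cells)   # initial guess for the flux shape
    k = 1.0                  # initial guess for the multiplication factor
    for _ in range(max_iters):
        source = nu_sigma_f * phi / k           # fission source from the current flux
        phi_new = np.linalg.solve(A, source)    # invert losses to get the new flux
        k_new = k * np.sum(nu_sigma_f * phi_new) / np.sum(nu_sigma_f * phi)
        if abs(k_new - k) < tol:
            return k_new, phi_new / phi_new.max()
        k, phi = k_new, phi_new
    return k, phi / phi.max()

if __name__ == "__main__":
    k, flux = slab_k_eigenvalue()
    print(f"k-effective ~ {k:.4f}")   # slightly above 1 for these made-up numbers
```

The loop converges when the guessed flux and the flux sustained by its own fission source agree; real transport solvers wrap far more sophisticated numerics around the same basic idea and carry it out over billions of unknowns in space, energy and direction.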

“Truly predictive modeling will end up making it a lot more feasible, affordable and practical to ask questions about what’s going to happen in new reactor design scenarios,” Slaybaugh says. “I also have this serious concern about best practices and quality: You want to make sure that the codes you are using in nuclear systems work.”

“Fundamentally,” Slaybaugh says, “I make the tools that other people use to do analysis. So I get really excited about making better hammers so that other people can make better houses.” Slaybaugh, recently appointed by the Secretary of Energy to the Nuclear Energy Advisory Committee, also works with the Gateway for Accelerated Innovation in Nuclear (GAIN), a group organized by the Department of Energy to provide guidance on technical, regulatory and financial issues facing this emerging “advanced nuclear” industry.

Advanced nuclear is the umbrella term for novel research on smaller reactor designs that incorporate alternative nuclear fuels and cooling systems. Some advanced designs reuse existing nuclear waste as fuel, or run on fuel that does not require enrichment, which reduces the security concerns associated with nuclear energy.

“The big thing is that the government is making national lab resources available to private companies in a way that it wasn’t before,” Slaybaugh says. “If you are a nuclear startup, you can only go so far before you need to do testing, and you are not going to build a nuclear test facility, because that is hard and expensive. But now you could partner with a national lab to use their experimental resources. I’ve been talking about how to set up a pathway from universities for this kind of research.”

Over the past year, Third Way, a supporter of Slaybaugh’s nuclear innovation bootcamp, published a number of reports and white papers defining the advanced nuclear industry. They identified 48 projects and startup companies across the U.S. and Canada working on advanced nuclear energy technologies, representing more than $1.3 billion in investment.

One of those projects is led by Per Peterson’s research group at Berkeley. After completing his Ph.D. research in mechanical engineering at Berkeley, Peterson began designing passive safety systems for light-water reactors, with an eye toward replacing and greatly simplifying the active safety systems the industry had originally adopted.

“Back in 2002,” he says, “the U.S. launched an international effort on advanced nuclear technologies called Generation IV. This got us thinking about what we wanted to see in advanced nuclear technologies, beyond just passive safety.”

Those experiences led Peterson to conceptualize entirely new designs. “Now the majority of my research relates to advanced reactors cooled by molten fluoride salts, which have undergone major advances since molten salts were first studied for reactor applications starting in the late 1950s,” he says.  

Molten-salt reactors are cooled by fluoride salts that liquefy and remain stable at high temperatures. They do not need to be pressurized like light-water reactors do, reducing the probability of large-scale accidents.

“Molten salts are fantastic heat-transfer fluids; they have enormous volumetric heat capacity, which means they are remarkably compact. This puts you in a position to design reactor vessels to have limited service life, to be replaced multiple times during the life of a plant,” Peterson says. “As soon as you focus on limited service life, you are in a very different space in terms of innovation and upgrading old components.”

Peterson, named to the Department of Energy’s Blue Ribbon Commission on America’s Nuclear Future in 2010, also contributes to the national discussion about new nuclear regulatory standards. “Here we are just 10 years after NASA launched its Commercial Orbital Transportation Services program to fund startup companies like SpaceX, and massive change has occurred with the idea that private-sector startup companies can be significantly more nimble and still work in areas requiring high levels of technical sophistication,” he says.

Drawing inspiration from successes in other heavily regulated industries, Peterson says, is what keeps him optimistic. “There is the potential for rapid innovation to occur, and we can make major changes in nuclear technology. This is what we need to be working on this coming decade.”