Oct 10, 2019

Ep 204: Tom Evans - Distinguished Research & Development Staff, ORNL

Distinguished Research & Development Staff, Oak Ridge National Laboratory
http://traffic.libsyn.com/whynotnuclear/Tom_Evans_Podcast.mp3

Show notes

Tom’s journey towards modeling and simulation (0:38)
0:38-9:37 (Tom describes the journey that led him to modeling and simulation work at Oak Ridge National Laboratory.)

Q. Are you from Tennessee originally?
(0:54) A. Tom Evans grew up in Pennsylvania. He went to school at Haverford College near Philadelphia before attending Georgia Tech for graduate school. From there, Tom moved to New Mexico to work at Los Alamos National Laboratory, where he stayed for 10 years. He is now Distinguished Research & Development Staff at Oak Ridge National Laboratory.

As an undergraduate, Tom was a physics and astronomy double major. He then planned to study mechanical engineering in graduate school, but became involved in the nuclear engineering department instead. His PhD focused on radiation detection and simulated detector responses. At Los Alamos, Tom focused on computational radiation dynamics, which involves the processes that power stars. Around the time Tom joined the lab, simulations were taking the place of physical tests, which drove the development of supercomputers. At Oak Ridge, Tom works more on the nuclear engineering side of things, leading a team that develops modeling and simulation codes used to help design nuclear reactors.

Single to coupled physics simulations (9:38)
9:38-14:49 (Tom describes how increases in computational power have improved simulations. This improvement, however, brings new challenges of complexity.)

Q. Do you ever look back and wonder how we were able to make decisions based on the old way of modeling?
(10:04) A. Tom believes that the modeling methods used now were conceived long ago, but the computing resources to implement them did not exist at the time. Despite this, researchers were still able to devise ways to approximate physical effects with low-resolution models. Now that we have far more computing power, researchers can perform more direct simulation of the underlying physical models.

This does, however, come with challenges. Physics traditionally had theorists and experimentalists. Computation has since grown out of the theory side and developed into its own branch of physics. When computational physics first began, people focused on single-physics simulations that looked at one phenomenon at a time. Now, computational physics models multiple interacting phenomena at once in what are known as coupled-physics simulations. That coupling lets researchers tackle bigger problems, but it also makes those problems much harder to solve.
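To make the idea of coupled physics concrete, here is a minimal, purely illustrative Python sketch of the fixed-point (Picard) iteration pattern that many coupled-physics codes use: two toy single-physics solvers exchange fields until they agree. The solver functions, feedback coefficients, and numbers are invented for illustration and are not taken from Tom's codes.

```python
# Illustrative fixed-point (Picard) coupling of two toy "physics" solvers.
# Not real reactor physics: the functions and constants are placeholders.

def solve_neutronics(fuel_temperature):
    """Toy stand-in: power level drops as fuel temperature rises (feedback)."""
    return 1.0 / (1.0 + 0.001 * fuel_temperature)

def solve_thermal_hydraulics(power):
    """Toy stand-in: fuel temperature rises with power."""
    return 300.0 + 400.0 * power

def coupled_solve(tolerance=1e-6, max_iterations=50):
    temperature = 300.0  # initial guess for the shared field
    for iteration in range(max_iterations):
        power = solve_neutronics(temperature)
        new_temperature = solve_thermal_hydraulics(power)
        if abs(new_temperature - temperature) < tolerance:
            return power, new_temperature, iteration
        temperature = new_temperature
    raise RuntimeError("coupled iteration did not converge")

if __name__ == "__main__":
    power, temperature, iterations = coupled_solve()
    print(f"converged in {iterations} iterations: "
          f"power={power:.4f}, fuel T={temperature:.1f} K")
```

The point of the sketch is the structure, not the numbers: each single-physics solve is cheap on its own, but the back-and-forth iteration (and the much larger real-world versions of it) is what makes coupled simulations expensive.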

The Exascale Computing Project (14:50)
14:50-20:08 (Tom describes the Exascale Computing Project and the increased computational power of the new Frontier supercomputer.)

Q. What are some of the big projects that you are working on right now?
(15:05) A. Tom’s team is currently focused on the Department of Energy’s (DoE) Exascale Computing Project. This is a seven-year, $2 billion project that includes 17 national laboratories and many universities working to develop the next generation of supercomputers. In 2023, Oak Ridge will launch the Frontier supercomputer, which will be capable of roughly 1.4 exaflops. An exaflop is 10^18 floating-point operations per second; the more operations a computer can perform each second, the more computational power it has. Currently, Oak Ridge has the Summit supercomputer, the most powerful computer in the world, which runs at about 200 petaflops. Frontier will be about 7 times more powerful than Summit. To prepare for the launch of the new machine, 24 application projects, spanning molecular dynamics, carbon capture, and astrophysics, are being developed.
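As a quick sanity check on those figures, here is a tiny Python snippet (using only the numbers quoted in the episode) showing where the "7 times more powerful" comparison comes from:

```python
# Back-of-the-envelope comparison of the machines mentioned above.
# Figures are the ones quoted in the episode; purely illustrative.
PETAFLOP = 1e15   # floating-point operations per second
EXAFLOP = 1e18

summit_flops = 200 * PETAFLOP     # Summit: ~200 petaflops
frontier_flops = 1.4 * EXAFLOP    # Frontier: ~1.4 exaflops (as projected)

print(f"Frontier / Summit ~= {frontier_flops / summit_flops:.0f}x")  # prints 7x
```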

Tom’s team is focusing on creating the nuclear reactor simulation application. Each application is built around a challenge problem so improvements in computational ability can be tested. For Tom, this means modeling small modular reactors (SMRs).

The challenges of modeling new reactor technology (20:09)
20:09-27:33 (Tom explains the difficulty of modeling new technology. He also describes why simulations have replaced experimentation.)

Q. Is it easier to model new technology or has this been a challenge?
(20:28) A. The short answer is, it’s complicated. Simple nuclear reactor models of existing designs are run on laptops by vendors and utilities to optimize and understand operating conditions. These are then correlated with operational and experimental data collected from facilities around the US over the years, and those correlations are used to fine-tune the simple models.

With a new design, there is no historical experimental or operational data for comparison. The models then require a higher degree of confidence in the underlying physics and calculations. How difficult that is depends on the design. For an existing large design such as a PWR900, phenomena like departure from nucleate boiling can be difficult to simulate simply because the reactor is so large, requiring an enormous amount of computation. The new SMR designs are about a quarter of the size of a PWR, so simulations involve far fewer computational unknowns. But because the system is smaller, it has features that can increase the difficulty of modeling, such as natural circulation. These designs have no pump to drive coolant through the core; the coolant is instead driven by natural circulation, meaning Tom’s team must accurately predict the fluid flow as well as the interaction between the fuel and the flow.
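For a sense of what "predicting the flow" involves in a natural-circulation system, here is a minimal, textbook-style estimate in Python: the buoyancy head from the hot/cold density difference is balanced against a lumped friction loss around the loop. This is a generic illustration, not one of Tom's models, and every parameter value below is a made-up placeholder.

```python
# Rough single-phase natural-circulation estimate:
# buoyancy driving head (g * beta * dT * H) balances a lumped loss (K * v^2 / 2).
# All values are placeholders for illustration, not SMR design data.
import math

g = 9.81          # gravitational acceleration, m/s^2
beta = 3.0e-4     # thermal expansion coefficient of water, 1/K (rough value)
delta_T = 30.0    # core temperature rise, K (placeholder)
H = 5.0           # height between core and heat sink, m (placeholder)
K = 10.0          # lumped loss coefficient around the loop (placeholder)

velocity = math.sqrt(2.0 * g * beta * delta_T * H / K)
print(f"estimated loop velocity ~ {velocity:.2f} m/s")
```

A hand estimate like this gives only an order of magnitude; the high-resolution simulations Tom describes are what provide the confidence needed when no operational data exists.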

All new designs lack operational data, and there are no experimental testing facilities for them, forcing researchers to fill in the data gaps with high-resolution modeling rather than experiments. Simulations and modeling have replaced experiments because experiments today are much more expensive to run than they were 50 or 60 years ago.

Optimizing designs and reducing costs with modeling (27:34)
27:34-36:20 (Tom explains that modeling and simulations are used to optimize designs and ultimately reduce costs. He uses the CRUD example to explain this point.)

Q. Why do we need models and simulations?
(27:58) A. There is a strong safety and regulatory component. The Nuclear Regulatory Commission (NRC) must first certify a design before a facility can become licensed and operational. Modeling also extends the collective knowledge base. Engineering science is focused on understanding and optimizing the design process: engineers want to run higher-resolution simulations faster and more accurately to understand what happens under different constraints and scenarios.

CRUD (Chalk River Unidentified Deposits) is an example of where modeling and simulation has helped the nuclear industry. This chemical buildup forms on the outside of the fuel cladding, impacting fuel performance over a fuel cycle. Fuel cycles originally carried highly conservative margins, operating only for 9-12 months. The CASL project (Consortium for Advanced Simulation of Light Water Reactors), a DoE project, was the original Energy Innovation Hub. As part of it, Tom’s team worked to understand whether coupled chemistry, neutronics, and flow calculations could be used to model and predict the chemical processes that form CRUD deposits. The more we learn about fuel behavior over a cycle, the more unnecessary conservatism can be removed from those margins, meaning reactors can safely operate for longer, reducing costs.

Replacing experiments with simulations (36:21)
36:21-42:24 (Tom explains that Oak Ridge focuses on providing simulated data to design firms to replace the need for experimentation.)

Q. Do you ever model what a plant could do if safety regulations were not in place?
(37:04) A. When developing high-performance tools for validation, Tom’s team must be careful to balance what they are doing against the regulatory frameworks in place. If a certified design depended on supercomputer results, the NRC could require every design simulation presented to it to be run on that supercomputer, of which there is only one in the US. Tom’s team therefore works to ensure that designs can be certified in a way that does not depend on Oak Ridge’s supercomputer. The goal of simulation is to replace experiments and provide data to design firms, enabling them to benchmark their own workflows in the absence of experimental data. While Oak Ridge’s computational power is much greater than industry’s, Tom notes that today’s supercomputers are tomorrow’s desktops: the technology will be widely available in a few years, enabling design firms to run their own simulations in the not-too-distant future.

Restrictions of modeling SMR designs (42:25)
42:25-49:29 (Tom explains the legal restrictions facing Oak Ridge when it comes to modeling SMR designs. He also explains the need for more industry-lab partnerships.)

Q. Why can’t SMR models be submitted to regulators without needing a physical test to acquire a license to build?
(43:16) A. As a national laboratory, Oak Ridge is not building reactors and must avoid conflicts of interest, so it is not part of any company’s design process. The lab does help industry, however, and works with companies designing advanced reactors by assisting them with calculations and models. There are many legal and proprietary hurdles Tom’s team must navigate, such as not having access to the exact SMR designs. Because Oak Ridge is an open science lab, its work is made public, meaning a design firm’s intellectual property could otherwise end up visible to its competitors. Additionally, Tom’s team is trying to push the boundary of what is possible. Companies sometimes want to incorporate Oak Ridge’s newest capabilities into their designs, but they are often better served working with what already exists, because the NRC would require designs that rely on those new capabilities to be simulated on Summit, and getting time on Summit requires a proposal through a highly competitive process.

Oak Ridge is trying to establish a looser coupling with industry partners to help improve designs. Oak Ridge does occasionally obtain specific industry data to build direct models, but this is not a regular occurrence. Tom believes it should happen more frequently in the nuclear sector. He gives the example of how the aerospace industry has embraced high-performance computing and adopted computational iteration to develop new designs. Tom hopes to see more informal partnerships between labs and industry to push the entire nuclear industry forward.

A future of SMRs (49:30)
49:30-52:45 (Tom describes the future he would like to see for the nuclear industry, focusing on SMRs to reduce cost.)

Q. What do you see as the future of nuclear?
(50:02) A. Tom sees a continued emphasis on smaller, more portable reactor designs. With SMRs, a single site could handle the building, safety, and capital costs of core construction. The core could then be transferred to and installed at an existing plant. Tom believes this prefabrication approach works best because capital costs are a major problem in the nuclear industry. Tom would also like to see smaller, more agile cores in new reactor designs, which means more modeling and simulation can be used to increase confidence in the operating parameters and constraints.
