For as long as scientists have been trying to understand the behavior of the electrically charged fourth state of matter known as plasma, there have been equations and calculations, and the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) has been a leader in both for 75 years. Through continuous innovation, PPPL remains a force in developing critical fusion computer programs that model plasma's behavior. Those models are leading to a greater understanding of plasma, enabling engineers and physicists to design fusion machines with increased speed and efficiency. And these computer tools are an investment by the federal government that has only increased in value over time.
PPPL scientists use codes to solve a range of fusion challenges. Some problems deal with fine-grained views of plasma, like the movement of individual particles that can cause the plasma to escape its confining magnetic fields. Other challenges focus on larger views, like simulating the movement of heat throughout the plasma as a whole or tailoring the behavior of fusion plasma in real time. At the largest scale, codes now simulate entire devices - allowing engineers to design and optimize fusion systems without building costly prototypes.
For these technical challenges and more, PPPL develops codes that are pushing the development of fusion energy strongly into the future. Harnessing and advancing these capabilities will be key to achieving the goals of DOE's Fusion Science and Technology Roadmap to close gaps on the critical path toward fusion energy. And with the DOE's launch of the Genesis Mission, a major initiative to accelerate scientific discovery and enhance national security using artificial intelligence (AI), PPPL is in a strong position to help lead.
Generations of PPPL scientists and engineers enabled the plasma physics breakthroughs that shape the field today. Building on that foundation, PPPL continues to deliver scientific advancement, technologies and partnerships that are defining the future of plasma science and fusion.
Challenge 1: Improving fusion system performance and design using AI
The quick growth of AI computing power and sophistication has led to tangible results in fusion, and PPPL has been at the forefront of the effort. Scientists have long been searching for ways to predict the occurrence of plasma instabilities that can quench fusion reactions so that they can take action to prevent them. Today, PPPL scientists are tackling that problem by writing codes incorporated into AI systems. Those systems can now make accurate predictions, allowing the machine's control system to make adjustments and preserve the plasma's stability.
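At its core, the prediction step described above is a supervised-learning problem: map diagnostic signals to an instability warning early enough for the control system to act. The sketch below is a minimal, hypothetical illustration of that idea using a from-scratch logistic regression on synthetic data; the "signals," labels and thresholds are invented and do not come from PPPL's actual control systems.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "diagnostic" features, e.g., a normalized current drift and an
# edge-fluctuation amplitude. Entirely made up for illustration.
n = 400
X = rng.normal(size=(n, 2))
# Label a shot "unstable" when a noisy linear combination of the signals is large.
y = (1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Logistic regression trained by plain gradient descent.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y)) / n
    b -= lr * np.mean(p - y)

# Accuracy on the training shots; a real system would hold out test shots
# and, crucially, predict far enough ahead to allow a control response.
pred = (sigmoid(X @ w + b) > 0.5).astype(float)
accuracy = np.mean(pred == y)
print(f"training accuracy: {accuracy:.2f}")
```

Production systems use far richer models and real-time diagnostic streams, but the structure — learn from past shots, then flag trouble before it happens — is the same.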
VIDEO: What does AI have to do with fusion?
Egemen Kolemen, a Princeton University professor and staff research physicist at PPPL, and his team have created AI systems that were used on the Korea Superconducting Tokamak Advanced Research (KSTAR) device in South Korea and the DIII-D tokamak run by General Atomics in San Diego.

Recent experiments on these two machines have shown that the AI systems can help create and sustain plasma with minimal energy loss and no unwanted energy bursts at the edge, a notable achievement.
At the same time, PPPL engineers are using AI to create digital surrogates, improving the design of stellarators, a type of fusion machine invented at PPPL that looks more like a cruller than a doughnut. By developing computer codes that can learn from past stellarator design experience, scientists and engineers can more quickly derive the shapes of magnets that might best boost stellarator plasma performance. Such a method would be far more efficient than running calculations for each potential configuration.
"Because magnet coil shapes dominantly determine the machine's performance, computation has an outsized return on investment for stellarators," said Michael Churchill, head of digital engineering. "That's why we are spending so much time on this effort." Private companies are interested in this approach, and some, like the startup Thea Energy, have been working with PPPL to improve their designs.
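The surrogate idea described above can be reduced to a simple pattern: run the expensive simulation at a handful of design points, fit a cheap model to the results, then use that model to score many candidate designs almost instantly. The sketch below is a toy illustration under invented assumptions — a one-parameter "coil shape," a made-up performance function standing in for a real physics simulation, and a polynomial fit standing in for a learned surrogate.

```python
import numpy as np

def expensive_simulation(coil_param):
    # Stand-in for a costly physics simulation: maps a single coil-shape
    # parameter to a made-up "confinement quality" score.
    return np.sin(3.0 * coil_param) + 0.5 * coil_param**2

# Run the "simulation" at a handful of design points...
train_x = np.linspace(-1.0, 1.0, 15)
train_y = expensive_simulation(train_x)

# ...and fit a cheap polynomial surrogate to those results.
surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=6))

# The surrogate can now score thousands of candidate designs instantly,
# instead of running the full simulation for each one.
candidates = np.linspace(-1.0, 1.0, 5000)
best = candidates[np.argmax(surrogate(candidates))]

# Spot-check the surrogate against the "truth" at the chosen design.
err = abs(surrogate(best) - expensive_simulation(best))
print(f"best candidate: {best:.3f}, surrogate error there: {err:.4f}")
```

Real stellarator design involves dozens of coil parameters and physics objectives, which is exactly why learning from past design experience pays off so heavily.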
VIDEO: What is a digital twin?
Such projects require a strong community of interest. This is why PPPL recently established the AI for Science Group, which accelerates discoveries, enhances data analysis and develops innovative solutions to complex problems by integrating AI into scientific workflows. This interdisciplinary initiative, which includes the AI4Fusion effort, opens new avenues across domains beyond fusion and plasma physics, including materials science, electromanufacturing and aerosol science. The group enables advances in computer science, data science and translational research across these areas.
The Lab is also actively involved in AI-focused national initiatives, such as the DOE's Genesis Mission and its STELLAR-AI fusion-focused computing platform. It can take months to run a single high-fidelity computer simulation or to train an AI system capable of designing an ideal fusion machine using existing infrastructure. STELLAR-AI is designed to reduce that timeline by using AI. The platform connects computing resources directly to experimental devices, including PPPL's National Spherical Torus Experiment-Upgrade (NSTX-U), which is scheduled to go live in 2026, allowing researchers to analyze data as experiments occur.
"The Genesis platform is an integrated, ambitious system that will bring together the various unique DOE assets: experimental and user facilities, the supercomputers, data archives and, importantly, the AI models," said Shantenu Jha, head of PPPL's computational sciences and a member of the executive council of the Genesis Mission. "We're proud to be part of it."
Challenge 2: Predicting whether fusion plasma will be stable
Operating a magnetic fusion system entails making sure the plasma stays within the magnetic cage that confines it. PPPL helps solve this challenge by advancing the development of magnetohydrodynamic (MHD) programs, which assume that plasma is a continuous, electricity-conducting fluid. One of the purposes of these codes is to calculate whether a particular plasma shape will be stable.
Stability is key because if the pressure of the plasma pushing against the magnetic fields confining it becomes too strong, or if the electric currents flowing through the plasma become too powerful, the plasma can deteriorate, making fusion impossible.
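The pressure-versus-field balance described above is often summarized by the plasma "beta," the ratio of plasma pressure to magnetic pressure, and a widely used empirical rule of thumb, the Troyon limit, caps how large beta can safely get for a given plasma current, minor radius and magnetic field. The sketch below is a simplified back-of-the-envelope check — not how an MHD code like M3D-C1 actually assesses stability — and the sample machine parameters are invented.

```python
MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, T*m/A

def plasma_beta(pressure_pa, b_field_t):
    """Ratio of plasma pressure to magnetic pressure (dimensionless)."""
    return 2.0 * MU0 * pressure_pa / b_field_t**2

def troyon_beta_limit(current_ma, minor_radius_m, b_field_t, beta_n=2.8):
    """Empirical Troyon limit on beta, returned as a fraction (not percent)."""
    return beta_n * current_ma / (minor_radius_m * b_field_t) / 100.0

# Invented example parameters, loosely tokamak-scale.
pressure = 2.0e5      # volume-averaged plasma pressure, Pa
b_field = 5.0         # toroidal magnetic field, T
current = 10.0        # plasma current, MA
minor_radius = 2.0    # m

beta = plasma_beta(pressure, b_field)
limit = troyon_beta_limit(current, minor_radius, b_field)
print(f"beta = {beta:.4f}, Troyon limit = {limit:.4f}, within limit: {beta < limit}")
```

MHD codes go far beyond this kind of scalar estimate: they solve for the full three-dimensional evolution of the plasma to determine which configurations remain stable and which do not.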
Calculating stability is a challenging computational task. "Fusion plasmas have highly complex behavior across an enormous range of length and timescales, from millimeters to meters and from nanoseconds to hundreds of seconds," said Nate Ferraro, deputy head of theory. "This behavior is too hard to simulate in full detail, even with the largest computers in the world. Part of the challenge is knowing which simplifying assumptions can be made so a model can make accurate predictions at a reasonable computational cost. This is where an understanding of physical principles is critical."
Specifically, PPPL maintains a code known as M3D-C1, which was developed by PPPL principal research physicist Stephen Jardin in 2004 and is continually updated with new techniques and capabilities. "We use M3D-C1 to examine large-scale instabilities that affect the entire plasma," said Ferraro. "We want to understand these instabilities at the design phase. If they occur in a fusion power plant, the plasma will cool; the fusion reactions will stop; and the facility could be damaged."
Challenge 3: Boosting the development of the private fusion industry
M3D-C1 is aiding the development of commercial fusion energy. Commonwealth Fusion Systems used it to conduct extensive simulations when designing its demonstration fusion device, SPARC, and is currently using the code to calculate how and whether the plasma will experience disruptions - sudden events in which the plasma escapes the magnetic fields holding it.
VIDEO: A computer simulation produced by M3D-C1
The video shows a computer simulation produced by M3D-C1, a PPPL MHD code that solves equations describing plasma as an electrically conducting fluid composed of ions and electrons. This code is primarily used to calculate the equilibrium, stability and dynamics of fusion plasmas. (Render credit: Nate Ferraro / PPPL)
Companies use codes like M3D-C1 because they cannot always rely on past experiences with smaller devices. Large-scale devices, like the demonstration fusion power plants they want to build, may have different behaviors. But since no such devices yet exist, there is no experimental data to examine for guidance.
"Our simulations can help give confidence to investors," said Greg Hammett, a principal research physicist. "If you can use computer codes to predict that you can build a fusion power plant smaller than you thought and get the same performance, you can build it more quickly and potentially save billions of dollars."
Private companies from around the world continue to come to PPPL for help. "These companies want to work with us because of our extensive expertise with these codes," Ferraro said. "Our codes can produce the high-fidelity simulations they need to optimize and validate their designs. These codes take specialized skills and years of investment to produce, and private industry depends on the public fusion program to develop them."
Challenge 4: Analyzing large-scale performance
Scientists use PPPL codes to tackle another fusion problem - simulating the movement of heat through the plasma to analyze the plasma's performance at large scales. The PPPL code known as TRANSP, one of a class of programs called transport codes, is now one of the most frequently used fusion codes around the world. Among other facilities, scientists use it to interpret experimental results produced by the DIII-D tokamak operated by General Atomics for DOE.
"TRANSP simulates how a tokamak plasma forms, heats up and spins, following the flow of heat, particles and momentum across the plasma while accounting for heating sources like neutral beams and radio waves, as well as losses and radiation," said Alexei Pankin, a computational scientist and manager of the program. "Unlike other codes, TRANSP focuses on the slower evolution of the plasma, providing an overall picture."
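The kind of slow profile evolution described above can be illustrated with the simplest possible transport model: heat diffusing across a one-dimensional "radius," with a hot core and a cool edge. This toy finite-difference sketch is not TRANSP's actual model — the diffusivity, grid and profile are all invented — but it shows the basic structure of a transport calculation.

```python
import numpy as np

# Toy 1D heat diffusion: T_new = T + dt * chi * d2T/dx2 on a radial grid.
nx = 50
x = np.linspace(0.0, 1.0, nx)   # normalized "radius", 0 = core, 1 = edge
dx = x[1] - x[0]
chi = 1.0                        # invented thermal diffusivity
dt = 1e-4                        # below the explicit stability limit dx**2 / (2*chi)

T = 1.0 - x**2                   # peaked "core" temperature, cool at the edge

for _ in range(200):
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T = T + dt * chi * lap
    T[0] = T[1]                  # symmetric core boundary (zero gradient)
    T[-1] = 0.0                  # cold edge held fixed

core_temp = T[0]
print(f"core temperature after diffusion: {core_temp:.3f}")
```

A real transport code adds heating sources such as neutral beams and radio waves, losses and radiation, and couples heat to particle and momentum transport - but the same march-the-profiles-forward-in-time structure applies.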
TRANSP's widespread adoption is no accident. The current version is a direct descendant of the original, which was developed in the 1970s by PPPL scientists Richard Hawryluk, Douglas McCune and Robert Goldston, and improvements continue to this day. TRANSP was the first code to model the entire fusion plasma in a tokamak, from the core to the edge, incorporating both the movement of heat and large-scale instabilities. And while other groups around the world have used it as a starting point to develop their own codes, the original ideas and techniques came from PPPL.
Challenge 5: Building fusion systems more economically
Researchers use another type of computer program known as a kinetic code to understand the velocities of plasma particles. The resulting information allows researchers to understand plasma turbulence, which carries heat out of the plasma and makes it harder to sustain fusion reactions. If researchers can increase understanding of turbulence, they can build smaller fusion power plants and lower construction costs.
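What distinguishes a kinetic code is that it tracks the distribution of particle velocities rather than just fluid averages like pressure. As a minimal illustration of that viewpoint - not XGC's method - the sketch below samples particle velocities from a Maxwellian (thermal) distribution and recovers the temperature from the velocity spread, in normalized units where the Boltzmann constant and particle mass are 1.

```python
import numpy as np

rng = np.random.default_rng(42)

# In a thermal plasma, each velocity component is Gaussian with variance
# k_B * T / m. In these normalized units (k_B = m = 1), the per-component
# velocity variance equals the temperature.
temperature = 2.0
n_particles = 200_000
v = rng.normal(0.0, np.sqrt(temperature), size=(n_particles, 3))

# Recover the temperature from the sampled velocities: T = <v_i^2>,
# averaged here over all particles and all three components.
t_est = np.mean(v**2)
print(f"temperature recovered from particle velocities: {t_est:.3f}")
```

Kinetic codes like XGC evolve distributions of this kind self-consistently with electromagnetic fields, which is what lets them capture turbulence that fluid models miss.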
PPPL scientists recently used an in-house kinetic code to make a significant finding that boosts confidence in the performance of ITER, an experimental fusion facility now under assembly in southern France. ITER is designed to demonstrate the production and control of a sustained fusion power source for hundreds of seconds at a power plant-relevant scale. The code, known as the X-Point Included Gyrokinetic Code (XGC), began with an initial version written primarily by Choongseok Chang, a managing principal research physicist, about 20 years ago and has been continually improved and expanded to add new capabilities, allowing it to run on new computer architectures.
Unlike most codes at the time, XGC was designed to simulate how turbulence interacts with the X-point, a place in the plasma at the transition point between the core, where magnetic field lines close on themselves, and the scrape-off layer, where field lines meet the wall. Chang believed simulating these multiscale interactions was crucial to understanding fusion plasma and designing better tokamaks.
In 2015, DOE funded research allowing Chang to use XGC to simulate the flow of heat in the area of the tokamak called the divertor, which acts as an exhaust system that removes heat and particles. Previous calculations indicated that an immense amount of heat would focus on an area of the divertor just a few centimeters across, potentially damaging internal components and leading to downtime for repairs.
But Chang and his team used XGC to show that, in fact, the heat and particles would strike an area 12 times as wide as predicted. This finding from XGC showed that ITER could operate with fewer constraints than previously thought, which is good news for its operations.
VIDEO: An XGC simulation
The video shows an XGC simulation of how plasma from the pedestal region crosses the nominal last confinement surface into the divertor plasma region. The long, thin lobes fluctuate in time and space. (Render credit: Seung-Hoe Ku / PPPL, on the DOE's Summit computer at Oak Ridge National Laboratory; image courtesy of Dave Pugmire and Jong Youl Choi / Oak Ridge National Laboratory)
The helpful results bolstered PPPL's reputation as a computing powerhouse. "When other teams want to know something about the fundamental physics of plasma boundaries that their codes can't explain, they come to us," Chang said.
Standing on the shoulders of giants, delivering what comes next
The U.S. fusion program has its roots at PPPL, which established many of the scientific and engineering foundations of plasma physics and fusion energy. Today, PPPL continues to lead in plasma physics, advancing cutting-edge research in an expanding range of fields that are shaping the future of energy, technology and discovery.
PPPL's role in shaping the future of fusion has been intentional. The fundamental physics equations describing plasma were developed by theoretical physicists whom founder and Princeton University astronomy professor Lyman Spitzer gathered at PPPL for that purpose in the 1950s and 1960s, virtually creating the field of plasma physics from scratch. But those equations were too hard to solve except in very simple cases. So soon after, PPPL began pioneering sophisticated computer programs based on those equations that could model plasma behavior.

Lyman Spitzer was an astronomy professor and the founder of the Princeton Plasma Physics Laboratory. (Photo courtesy of PPPL Archives)
Since then, theorists have aimed to understand plasma more accurately by continually updating their equations and codes. Because even today's biggest, fastest computers cannot solve the full equations that describe the range of plasma behavior in a fusion device, theorists are constantly formulating equations and approximations that computers can solve and that model the behavior accurately enough for research purposes.
"PPPL has always recognized the importance of computing," said Hammett. "In fact, one of the first people to model plasma behavior using computer simulations was John Dawson, a member of PPPL's Theory Department in the 1960s. Though he could simulate only 100 interacting particles - today we can simulate 100 trillion particles - his simulations helped lead to an understanding of a key method for heating plasmas. It was a groundbreaking achievement." Dawson's career contributions were so significant that the American Physical Society renamed the Award for Excellence in Plasma Physics after Dawson in 2007. Several PPPL researchers have also won this award, which recognizes a recent outstanding achievement in plasma physics research.
PPPL's success in developing codes is rooted in a history of innovation. "By writing computer programs that would calculate stability with greater precision than before, PPPL scientists in the 1960s, 1970s and 1980s helped enable the design of better tokamaks," Hammett said. "For example, early plasma cross sections looked like a circle. But computer programs showed that stability would improve if the plasma were shaped so its cross section resembled the letter 'D.' Recently, codes and experiments are showing how a reverse-D shape might be even better."
VIDEO: Learn about Lyman Spitzer
Uniquely qualified to write plasma codes that can change the world
National labs have the capabilities to conduct research that no one else can. Researchers at PPPL can devote time to long-term projects using large-scale facilities not available elsewhere. "National labs are particularly well suited to doing risky research on extended timelines," said Ammar Hakim, a PPPL principal research physicist. "But labs like ours have long histories of doing things like this. We are in a unique position to do research that other places may not be able to do."
Writing these bespoke plasma codes requires specialized knowledge encompassing plasma physics, applied mathematics, numerical algorithms and programming skills. The number of people who have all these qualifications is small, Hakim said. "But PPPL has a history of bringing these people together to accomplish amazing things."
PPPL's established role as an originator of plasma physics and one of the most significant innovators in creating computer codes positions the Laboratory as a critical node in worldwide fusion physics research. "Scientists want to use what we learn from experiments to try to predict what plasma in fusion devices will do and then build new devices," said Felix Parra Diaz, head of theory. "Using the fundamental computer codes developed at PPPL allows us to achieve this goal."