
Simulations still can't predict exactly when an earthquake will strike, but with the processing power of modern exascale supercomputers, they can now predict how one will unfold and how much damage it is likely to cause.
Imagine: a colossal earthquake strikes the California coast along the San Andreas Fault, one of the world's most seismically active regions. Scientists have long predicted this so-called "Big One." But instead of chaos, there's calm, thanks in part to an advanced early warning system that gave people enough notice to take cover inside specially engineered, quake-resistant structures.
The Department of Energy's Office of Cybersecurity, Energy Security, and Emergency Response, or CESER, is supporting a project led by David McCallen, a senior research scientist at Lawrence Berkeley National Laboratory who is working to make this potential future a reality - one in which earthquakes are met with preparedness, not panic.
McCallen is leading a team of researchers from Berkeley and Oak Ridge national laboratories to develop the most advanced simulations to date for studying earthquake dynamics.
The simulations reveal in stunning new detail how geological conditions influence earthquake intensity and, in turn, how those complex ground motions directly impact buildings and infrastructure. The data is already being shared with the broader earthquake science and engineering communities to deepen the understanding of seismic behavior, guide the design of earthquake-resistant infrastructure and improve emergency response.
"Our goal is to model earthquakes from beginning to end and track the seismic waves as they propagate through the Earth," said McCallen, who also leads the Critical Infrastructure Initiative at Berkeley Lab. "We want to understand how those waves interact with buildings and critical energy infrastructure to assess their vulnerability so they can be as prepared as possible before the next earthquake strikes."
The research began in 2017 as part of the Exascale Computing Project, or ECP, DOE's largest-ever software research, development and deployment initiative. The project charged experts with designing applications and solutions for complex scientific problems that were impossible to solve with the computing capabilities that existed prior to ECP.
Traditional approaches to studying earthquake ground motions have relied on rough estimates based on data from past events because, until now, scientists lacked the computational power to model earthquakes in specific locations with sufficient fidelity.
Through ECP, McCallen and his team developed EQSIM, the Earthquake Simulation code. EQSIM allows researchers to see how seismic waves interact with different soil compositions and surface topographies, such as mountains and valleys, that can either amplify or dampen an earthquake's energy. The ground motion simulations can then be applied to buildings and critical infrastructure, such as water and electric utility providers, to see how those structures will respond to seismic activity and where they are likely to fail.
"We've advanced the ability to do these computations tremendously," McCallen said. "Instead of using empirical data from past events like we've had to do up to this point, the exascale simulations are allowing us to develop a much better picture of what these regional distributions of ground motions look like.
"This is something we've not been able to see before, and what we're seeing is that ground motion behavior is far more complex and dynamic than we thought."
Location, location, location
What most people might find surprising is that in some cases, smaller earthquakes can actually cause more damage than larger ones - it all depends on the underlying geological conditions, McCallen says.

The intense shaking during an earthquake, known as ground motion, is shaped by three key geological factors: 1) fault type - how tectonic plates shift against each other and the manner in which an earthquake fault ruptures; 2) rock and soil composition - whether the ground is solid or fractured, hard or soft; and 3) surface topography - including mountains, valleys and even buildings. All these factors influence the strength and behavior of seismic waves.
To better understand how earthquakes behave across different geologies, EQSIM is currently being used to model earthquake activity in three major U.S. fault zones: the San Francisco Bay Area, the Los Angeles Basin and the New Madrid region in the eastern Midwest.

The San Francisco Bay Area has dozens of fault lines, including the Hayward Fault - arguably one of the most dangerous in America. The Hayward Fault is of particular interest to McCallen, given that his office at Berkeley Lab sits right next to it.
South of the Bay Area is the Los Angeles Basin, one of the country's most notorious hot spots for seismic activity. The magnitude 6.7 Northridge earthquake of 1994 remains one of the most destructive earthquakes in U.S. history. The greater Los Angeles region also includes the San Andreas Fault, which many researchers believe could undergo another major rupture at any time. Often referred to as "The Big One," the anticipated earthquake is hypothesized to be magnitude 7.8 or greater.
A lesser-known but, in terms of historic damage, equally dangerous area is the New Madrid region, which contains large active fault lines that cut across Kentucky, Illinois, Tennessee and Arkansas.

To this day, the New Madrid earthquakes of 1811 and 1812 remain the most destructive earthquakes in U.S. history to strike east of the Rocky Mountains. The magnitude 7 to 8 earthquakes were so powerful they were felt across more than 1 million square miles and as far north as Canada.
"The earthquakes caused so much geologic upheaval that residents reported the earthquake actually forced the Mississippi River to flow backward for a period of a few hours," McCallen said.
According to McCallen, eastern earthquakes can in some cases be worse than western earthquakes - not because the West is more mountainous, but because the soil and rocks in the East are more competent - meaning they are more resistant to deformation and have fewer cracks and less damage - which allows seismic energy to be transmitted farther and more efficiently.
"These are exactly the kinds of things we need to understand," said McCallen. "Fortunately, back in the 1800s, the area was sparsely populated. But if a large earthquake were to happen again today, Nashville, Memphis and Louisville could all be impacted."
Calculating quakes at the exascale
From their offices in California, the EQSIM team members can log into the Frontier supercomputer in Oak Ridge, Tennessee - just a few hours east of New Madrid. Managed by the Oak Ridge Leadership Computing Facility, or OLCF, Frontier is the world's most powerful supercomputer for open science.
Powered by AMD graphics processing units, Frontier has a peak performance of 2 exaflops - more than a billion-billion calculations per second - making it roughly 1,000 times faster than the previous generation of petascale systems.
Each of the three simulated regions spans hundreds of kilometers and is built from tens of thousands of pieces of geologic data. The models divide the terrain into grid zones a few meters in size, with each grid zone representing the local geologic structure at that point in space. The largest simulations can include up to 500 billion grid points, enabling them to capture an extraordinary level of geological and structural detail.
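As a rough illustration of that scale, the back-of-the-envelope sketch below shows how a regional domain discretized at meter-scale spacing quickly reaches hundreds of billions of grid points. The domain dimensions and spacings are hypothetical examples chosen for illustration, not EQSIM's actual model parameters.

```python
# Illustrative only: estimate grid-point counts for a box-shaped regional
# domain at uniform grid spacing. The numbers below are hypothetical and
# are not EQSIM's actual model parameters.

def grid_points(length_km: float, width_km: float, depth_km: float,
                spacing_m: float) -> int:
    """Number of grid points for a rectangular domain at uniform spacing."""
    nx = int(length_km * 1000 / spacing_m)
    ny = int(width_km * 1000 / spacing_m)
    nz = int(depth_km * 1000 / spacing_m)
    return nx * ny * nz

# A domain spanning a couple hundred kilometers and tens of kilometers deep.
for spacing in (20.0, 10.0, 5.0):   # grid spacing in meters
    n = grid_points(length_km=200, width_km=100, depth_km=25, spacing_m=spacing)
    print(f"{spacing:>4.0f} m spacing -> {n / 1e9:8.1f} billion grid points")
```

Halving the grid spacing multiplies the point count by a factor of eight, which is why meter-scale resolution over regions this large was out of reach before exascale machines.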
"The incredible computing power behind the simulations really allows us to see the hot spots of the seismic waves and where the energy gets directed through the different layers of rock and soil," McCallen said. "We can see clearly how and where the waves can stack up and how those ground motions translate into building risk and damage. And we can see they are exceedingly different at each location."
As seismic waves pass through different structures, the simulations consistently show that shorter, more rigid buildings - one to two stories tall - are most vulnerable to fast, high-frequency shaking in regions with solid rock and compact soil. In contrast, taller, more flexible buildings face greater risk in areas with soft soil and fractured geology, where low-frequency waves can grow in intensity and shake for longer periods.
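One way to see why height matters is a common engineering rule of thumb: a building's fundamental period grows with its number of stories, roughly 0.1 seconds per story, and shaking does the most damage when its dominant frequency lines up with the building's natural frequency. The sketch below is an illustrative calculation based on that rule of thumb; it is not part of EQSIM.

```python
# Illustrative only: a common rule of thumb estimates a building's
# fundamental period as roughly 0.1 seconds per story (T ~ 0.1 * N).
# Resonance is strongest when the dominant frequency of the ground
# motion is close to the building's natural frequency (f = 1 / T).

def natural_frequency_hz(stories: int) -> float:
    period_s = 0.1 * stories      # rough fundamental period in seconds
    return 1.0 / period_s         # natural frequency in hertz

for stories in (2, 10, 40):
    f = natural_frequency_hz(stories)
    print(f"{stories:>2}-story building: ~{f:.2f} Hz natural frequency")

# ~5 Hz for a 2-story building, which pairs with the fast, high-frequency
# shaking typical of firm ground; ~0.25 Hz for a 40-story tower, which
# pairs with the slow, long-period waves that linger in soft basins.
```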
The 1980 earthquake near Lawrence Livermore National Laboratory is an example of the latter. Less than an hour away from Berkeley, Livermore Lab is located in the Livermore Valley, a sedimentary basin surrounded by a craggy mountain range.
"It caused tremendous damage at the lab," McCallen said. "At that time, nothing was anchored down. There were no ground motion instruments on site and no forensic capabilities for mitigating earthquakes. However, it turned out to be a watershed event for DOE."
The earthquake led to the creation of DOE Standard 1020, a directive that established uniformly rigorous hazard and risk evaluations for all DOE facilities. The directive requires each site to review and reanalyze its strategy for mitigating seismic hazards and ground motion risks every 10 years or as new information becomes available.
"As scientists and engineers, we have to determine what future earthquake ground motions could occur, both in terms of amplitude as well as frequency content," McCallen said. "Then we have to determine how those motions impact infrastructure, which is exactly what we designed EQSIM to do. When we look at simulations in areas like the Livermore Valley, we can see the energy tends to propagate and get trapped in those basins and shake them like a bowl of jelly. You can see the waves continue to resonate long after the earthquake ruptures."
The simulations are massive. The output from a typical simulation that follows 90 seconds of physical time is a little over 3 petabytes, similar in size to around 750,000 feature-length films or 1.5 trillion pages of text.
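Those comparisons follow from simple arithmetic. Assuming roughly 4 gigabytes per feature-length film and about 2 kilobytes per page of plain text - illustrative assumptions, not figures from the research team - the numbers work out as follows:

```python
# Back-of-the-envelope check of the 3-petabyte comparison. The per-film
# and per-page sizes are illustrative assumptions, not figures reported
# by the research team.

output_bytes = 3e15   # ~3 petabytes from one 90-second simulation
film_bytes = 4e9      # ~4 GB for a feature-length film
page_bytes = 2e3      # ~2 KB for a page of plain text

print(f"Equivalent films: {output_bytes / film_bytes:,.0f}")   # ~750,000
print(f"Equivalent pages: {output_bytes / page_bytes:,.0f}")   # ~1.5 trillion
```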
"There's no way we could have run these simulations before exascale and GPU-based systems." McCallen said. "But now we're able to do these calculations on a relatively routine basis."
EQSIM is highly scalable and was able to run on all 9,402 Frontier nodes, but peak performance for selected applications required only 3,600 nodes. The code is also optimized to run on multiple computing architectures, including Berkeley Lab's Perlmutter supercomputer, operated by the National Energy Research Scientific Computing Center, or NERSC.
To manage the vast amount of data, simulation results from Frontier were transferred to the OLCF's Andes computing cluster for temporary storage and postprocessing analysis. Compression algorithms were used to reduce the size of the data so that it could be transmitted to Berkeley Lab via the Energy Sciences Network, or ESnet, DOE's high-performance science network that connects all the national laboratories and user facilities.
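As a minimal, self-contained sketch of the idea - using Python's built-in zlib purely as a stand-in, since the story does not specify which compression algorithms the team uses - the snippet below compresses a synthetic waveform array and reports how much smaller it becomes before a hypothetical transfer:

```python
# Minimal sketch of compressing simulated waveform data before network
# transfer. zlib is a generic stand-in; the team's actual compression
# algorithms and tools are not specified here.
import zlib
import numpy as np

# A synthetic, smoothly varying "ground motion" record (stand-in data).
t = np.linspace(0.0, 90.0, 1_000_000)                 # 90 s of samples
signal = np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 30.0)

raw = signal.astype(np.float32).tobytes()
compressed = zlib.compress(raw, level=9)

ratio = len(raw) / len(compressed)
print(f"Raw:        {len(raw) / 1e6:.1f} MB")
print(f"Compressed: {len(compressed) / 1e6:.1f} MB ({ratio:.1f}x smaller)")
```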
"Not only are the simulations helping to analyze the structural performance of DOE research facilities, but we're also making efficient use of the entire DOE computational ecosystem to do it," McCallen said.
From code to capability
The success of the Frontier simulations has already opened the door to direct applications.
"There are very few groups that can do these types of computations," McCallen said. "So, we asked ourselves how we can best maximize the utility of these motions and get them into the hands of the people that are doing the research and designing the structures."
In collaboration with the Pacific Earthquake Engineering Research Center, or PEER - a consortium of universities based at the University of California, Berkeley - the team is developing an open-source web tool that allows engineers and building designers to access and use the regional simulated ground motions to test different building and infrastructure designs against simulated earthquakes across thousands of locations.
"At the PEER Center, we see this collaboration as a transformative opportunity to bridge cutting-edge simulations with real-world engineering practice," said Khalid Mosalam, director of the PEER Center and the Taisei professor of civil engineering at UC Berkeley.
"The scale and resolution of the EQSIM data offers unprecedented insight into regional seismic behavior and allows us to quantify uncertainties more rigorously and develop performance-based design strategies grounded in science," Mosalam said. "By making these simulated ground motions accessible through open tools, we are empowering the earthquake engineering community with knowledge that was previously unattainable - ultimately leading to safer, more resilient infrastructure."
The EQSIM team is also working with the UC Berkeley Seismology Lab and emergency management agencies to use the simulation data to enhance seismic instruments and improve emergency response systems.
"The best thing about these simulations is that we don't have to wait for the next 'Big One' to strike to understand how it will impact us," McCallen said. "If anyone needs information about a 7.5 earthquake in these critical areas, we can provide them with the comprehensive data that is being generated."
This research is supported by DOE's Office of Cybersecurity, Energy Security, and Emergency Response, which leads the Department's efforts to strengthen the security and resilience of U.S. energy infrastructure against all threats and hazards. The research supports that mission, in part, by providing a more detailed understanding of regional ground motion complexity that helps assess risks to energy systems and critical infrastructure. For more information, visit energy.gov/ceser/office-cybersecurity-energy-security-and-emergency-response.
The OLCF, NERSC and ESnet are DOE Office of Science user facilities.
This story is also featured on OLCF's website.
UT-Battelle manages ORNL for DOE's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE's Office of Science is working to address some of the most pressing challenges of our time. For more information, visit energy.gov/science .