Using the Frontier supercomputer at the Department of Energy's Oak Ridge National Laboratory, researchers from the Georgia Institute of Technology have performed the largest direct numerical simulation (DNS) of turbulence in three dimensions, attaining a record resolution of 35 trillion grid points. Tackling such a complex problem required the exascale (1 billion billion or more calculations per second) capabilities of Frontier, the world's most powerful supercomputer for open science.
The team's results, published in the Journal of Fluid Mechanics, offer new insights into the underlying properties of the turbulent fluid flows that govern the behavior of a variety of natural and engineered phenomena, from ocean and air currents to combustion chambers and airfoils. Improving our understanding of turbulent fluctuations can lead to practical advances in many areas, including more accurate weather prediction and more efficient vehicle design.
"Turbulence has long been recognized as a grand challenge problem for both science and computing. Resolution is the key, and this research is about pursuing advances in the fundamental understanding of turbulence by employing high-resolution simulations with the right parameters. This work will have numerous implications for computer modeling and practical applications in many disciplines in which the flow of air, water or other fluids plays an important role," said P. K. Yeung, the Georgia Tech professor of aerospace engineering who leads the project.
Among the many challenging problems in fluid dynamics, turbulence stands out because of its disorderly fluctuations over a wide range of scales in both time and space. Adding to the complexity is that turbulent flows often occur in a wide variety of geometries that modify the physics of the flow. Picture the differences between water churning over a rough riverbed vs. water flowing through a smooth pipe.
However, the small-scale motions of turbulence possess a considerable degree of statistical universality, regardless of the overall flow geometry. This concept of small-scale universality is thought to hold more strongly as the range of scales widens and the turbulence becomes more intense. But such dynamics were difficult to simulate on computer systems prior to Frontier.
"Understanding the fine-scale properties of turbulence in a simplified geometry amenable to fast computation can provide great benefits for understanding and modeling turbulence. The latter, in turn, requires immense computational power, and Frontier is among the very best," Yeung said.
Attaining new scales of insight on Frontier
On Frontier, Yeung and his team were the first in the world to simulate turbulence in 3D at a resolution of 32,768 grid points in each dimension, exceeding 35 trillion grid points in total. (In computer simulations, grid points represent specific locations where variables are calculated; the more grid points, spaced closer together, the more accurate the results.)
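The grid-point total follows directly from the per-dimension resolution, since the domain is a cube. A quick arithmetic check:

```python
# Grid points per dimension in the Frontier DNS (32,768 = 2^15).
n = 32_768

# Total points in a cubic 3D grid: n^3.
total = n ** 3
print(f"{total:,} grid points")  # 35,184,372,088,832 -- roughly 35 trillion
```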
"This is a scale that exceeds the capacity of any other machine in the world," Yeung said. Additionally, they were able to simulate flows at a very high Reynolds number - 2,500 - which provides a higher degree of physical fidelity than possible in prior work. The Reynolds number measures the ratio of inertial forces (that tend to keep moving) to viscous forces (that oppose motion due to internal friction). A flow with a low Reynolds number tends to be slower and smoother, such as paint being poured from a bucket, whereas a flow with a high Reynolds number will likely be more turbulent, such as rainfall within storms.
"The scale of the simulations on Frontier has reached such a point that we are within reach of experiments as far as the range of scales that can be simulated numerically or can be made to happen in the laboratory," Yeung said. "We're at a point where we can say that the numerical simulations' results are very reliable, and they can allow us to settle some of the hypotheses about turbulence. We can test fundamental theories to get some idea about how we can make corrections - because all turbulent theories for phenomena this complex are inevitably imperfect."
An important question is how large the largest turbulent fluctuations can be in fully developed turbulence. Extreme events, in which intense fluctuations are rare and localized in time and space, can have major consequences but are often not sufficiently accounted for in classical theories. Examples include extreme weather (such as EF-5 tornadoes and record rainfall), local pockets of high air contamination and instabilities that can lead to sporadic auto-extinction in internal combustion engines.
"In turbulence, fluctuations of significance are observed in many flow properties. Even small fluctuations can have tremendous consequences. They are somewhat random, but fluctuations are still subject to physical laws, and they occur in time and space. So, instead of attempting to calculate how much rain will fall a week from now, we say, 'The probability that rain will fall next week is X percent.' We convert the question from a deterministic one to a stochastic or statistical one. We're interested in finding out the probability distributions," Yeung said.
The team's paper provides a definitive assessment of the difference between the probability distributions of energy dissipation (or how effectively energy is converted from the kinetic energy of the bulk flow to small-scale fluctuations and heat) and those of enstrophy (a measure related to the intensity of localized swirling, or vorticity), both of which govern the local details of fluid motion. Understanding that difference can aid in making predictions for the behavior of turbulent flows, such as in extreme weather.
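In standard notation (a sketch of the conventional definitions, not equations quoted from the paper), the local dissipation rate is built from the strain-rate tensor and the enstrophy from the vorticity:

```latex
% Energy dissipation rate: kinetic energy converted to heat by viscosity,
% with \nu the kinematic viscosity and s_{ij} the strain-rate tensor.
\epsilon = 2\nu\, s_{ij} s_{ij}, \qquad
s_{ij} = \tfrac{1}{2}\left(\frac{\partial u_i}{\partial x_j}
       + \frac{\partial u_j}{\partial x_i}\right)

% Enstrophy: squared magnitude of the vorticity \omega = \nabla \times u,
% measuring the intensity of local swirling motion.
\Omega = \omega_i \omega_i, \qquad
\boldsymbol{\omega} = \nabla \times \mathbf{u}
```

Both quantities have the same mean behavior in homogeneous turbulence, which is why differences in the tails of their probability distributions are the scientifically interesting part.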
The study's results also show that, even at the highest resolutions and during the most extreme turbulence, many classical scaling laws still hold, including the "dissipative anomaly": the idea that the average energy dissipation rate becomes nearly independent of fluid viscosity at high Reynolds numbers. At the same time, the present simulations confirm that corrections accounting for the intermittent nature of small-scale turbulence are stronger than commonly assumed.
To achieve these results on Frontier, Yeung and his team implemented a simulation protocol called "multiresolution independent simulation" with an allocation of compute time from DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.
This approach involved running multiple short, high-resolution bursts on top of longer but lower-resolution simulations. By carefully "upgrading" the resolution over short times and then averaging over many of these segments, they managed to study the smallest scales of turbulence without needing to simulate the whole flow for a long time. Yeung's doctoral students, Rohini Uma-Vaideswaran and Daniel Dotson, have been using the data to advance machine learning and visualization to illuminate the intricacies of 3D turbulent flow.
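The protocol described above can be sketched in outline. This is a hypothetical illustration of the idea (short, refined bursts spawned from a long coarse run, with small-scale statistics averaged over the bursts); the function names and the `refine`/`evolve` steps are placeholders, not the team's actual solver:

```python
import random  # stand-in for a real flow solver


def evolve(field, steps):
    """Placeholder: advance the flow field by `steps` time steps."""
    return [v + random.gauss(0, 0.01) for v in field]


def refine(field, factor):
    """Placeholder: interpolate the coarse field onto a finer grid."""
    return [v for v in field for _ in range(factor)]


def small_scale_stat(field):
    """Placeholder small-scale statistic, here a mean squared value."""
    return sum(v * v for v in field) / len(field)


def multiresolution_run(field, n_segments, coarse_steps, fine_steps, factor):
    """Run a long coarse simulation; periodically spawn a short,
    high-resolution burst and accumulate its small-scale statistics."""
    stats = []
    for _ in range(n_segments):
        field = evolve(field, coarse_steps)    # long, cheap coarse segment
        burst = refine(field, factor)          # "upgrade" the resolution
        burst = evolve(burst, fine_steps)      # short, expensive burst
        stats.append(small_scale_stat(burst))  # sample the finest scales
    return sum(stats) / len(stats)             # average over many bursts
```

The design point is that the expensive high-resolution work is confined to brief bursts, so the smallest scales can be sampled without carrying the full cost of a long fine-grid simulation.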
Yeung's current work conducting computer simulations of turbulence comes roughly 50 years after this field of research began and more than 30 years since he began studying it himself. Efficient use of Frontier has now taken the simulations to a Reynolds number comparable to that of experiments, with the advantage of providing incredible detail, albeit in a simplified geometry.
Some of the data from Yeung's team are now publicly available online at the Johns Hopkins Turbulence Database, which receives funding from the National Science Foundation. According to Charles Meneveau, JHTDB principal investigator, the 35-trillion-grid-point DNS dataset prepared by the Georgia Tech team is already attracting significant interest and will likely be leveraged in many future publications by JHTDB users.
Frontier is managed by the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility located at ORNL.
UT-Battelle manages ORNL for DOE's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE's Office of Science is working to address some of the most pressing challenges of our time. For more information, visit energy.gov/science. -Coury Turczyn