ORNL Plans Quantum-HPC Software Stack Study

Illustration of the Frontier supercomputer integrating quantum components
ORNL researchers proposed a software architecture that would integrate emerging quantum computers with the world's fastest supercomputing systems, such as ORNL's exascale machine Frontier. Credit: Jason Smith/ORNL, U.S. Dept. of Energy

A new study by researchers at the Department of Energy's Oak Ridge National Laboratory traces a blueprint for a software architecture that would integrate emerging quantum computers with the world's fastest supercomputing systems.

Finding an effective approach to pair the two distinct computing platforms has become a prime focus for scientists seeking to tap quantum computing's potential power. The computational capacity of quantum computers, still an emerging computing technology, could ultimately exceed that of classical computers for applications such as scientific modeling.

"The goal is to promote rapid advancement of this coming convergence," said Amir Shehata, an ORNL software engineer and lead author of the study. "Similar efforts are underway in Europe and Japan. We don't expect ours to be the final version of the software framework, but we want to get ahead of the curve and to drive development with as many people as possible participating. For this approach to succeed, we'll need a flexible software stack that couples these two technologies in a robust but modular way so we won't have to retool the entire stack every time we get a new quantum machine."

Key innovations of the ORNL study include:

  • A unified resource management system that efficiently coordinates quantum and classical resources
  • A flexible quantum programming interface that abstracts hardware-specific details
  • A quantum platform management interface that simplifies the integration of various quantum hardware systems
  • A comprehensive tool chain for quantum circuit optimization and execution

An ORNL study published last year examined potential strategies to integrate quantum computing and high-performance computing, or HPC. Shehata's study builds on that effort's findings to establish more concrete guidelines for putting that integration into practice.

"Our previous study focused on the foundations for integration," said Rafael Ferreira da Silva, the leader of ORNL's Workflows and Ecosystem Services Group and a co-author of both studies. "This study focuses on the software architecture and design, along with what we need to do to translate these strategies into reality."

Quantum computing relies on quantum bits, or qubits, to store information. Qubits, unlike the binary bits used in classical computing, can simultaneously exist in more than one state via superposition, a quantum mechanical effect that allows combinations of physical values to be encoded on a single object. That dynamic enables a wider range of possible values - more like a dial with finely tuned, to-the-decimal-point settings than a binary on-off switch.
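The "dial" analogy can be made concrete with a small sketch. This is purely illustrative arithmetic, not code from the study: a single qubit's state is a pair of amplitudes whose squared magnitudes give the measurement probabilities, so the state can sit anywhere between the two binary outcomes.

```python
import math

# Illustrative sketch only: a qubit's state as two amplitudes
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# An equal superposition of the |0> and |1> states:
alpha = beta = 1 / math.sqrt(2)

# Measurement probabilities follow from the squared amplitudes,
# so intermediate "dial settings" are possible, unlike a binary bit.
p0 = abs(alpha) ** 2  # probability of reading 0, here ~0.5
p1 = abs(beta) ** 2   # probability of reading 1, here ~0.5
print(p0, p1)
```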

The next frontier

Researchers have theorized the expanded computational range enabled by quantum technology could provide new and more efficient ways to solve complex problems, such as high-resolution digital simulations. The team compared the potential boost in compute power to the breakthrough speeds achieved by combining CPUs with GPUs. That powerhouse combination broke the exascale barrier in 2022 when the Frontier supercomputer at ORNL's Oak Ridge Leadership Computing Facility achieved speeds of more than 1 quintillion calculations per second.

"We're talking ultimately about another exponential increase in the scale of problems we could tackle," said Tom Beck, a co-author of the two studies and head of the Science Engagement Section at ORNL's National Center for Computational Sciences . "The current generation of quantum computers can in theory utilize hundreds of qubits. Frontier at its top speeds could theoretically model the equivalent of only about 50 or 60 of those qubits because each added qubit in a quantum model would double the computing demand on Frontier. Harnessing this quantum advantage could be a tremendous accelerator that would really pump up our problem-solving capacity, especially as newer and more powerful quantum computers continue to be developed and refined."

The main obstacle to realizing quantum computing's potential so far has been the relatively high error rate caused by the delicate nature of qubits. Researchers have tested various solutions, but the industry hasn't settled on a standard protocol. The final medium for encoding qubits also remains a moving target, with various systems employing neutral atoms, trapped ions, superconductors and other materials.

"That's why we have to make sure the software framework will be malleable enough that it can be adjusted for whatever final version of quantum computing technology emerges," Shehata said. "Quantum computers will continue to evolve, so our framework must do the same."

The study provides a design for a comprehensive tool chain for quantum circuit optimization and execution, allowing programmers to produce performance-portable hybrid applications.

Shehata's proposed framework would be used for a quantum computer deployment on the same site as a classical HPC system such as Frontier. A quantum controller would connect the two machines and act as a kind of interpreter device, translating between quantum and classical computations. The team proposes a specific quantum platform management interface that would simplify this integration and translation, making a variety of combinations easy to deploy.
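The platform management idea can be sketched as an adapter layer. The class and method names below are illustrative assumptions, not the study's actual API: each vendor backend would implement one small interface, and the rest of the classical software stack would program against it, so swapping quantum hardware wouldn't require retooling the whole stack.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of a quantum platform management interface;
# all names here are assumptions for illustration, not the paper's API.
class QuantumPlatform(ABC):
    """Adapter that each quantum hardware vendor implements once."""

    @abstractmethod
    def submit(self, circuit: str) -> str:
        """Send a circuit to the quantum controller; return a job id."""

    @abstractmethod
    def result(self, job_id: str) -> dict:
        """Fetch measurement counts for a completed job."""

class SimulatedPlatform(QuantumPlatform):
    """Toy backend standing in for real quantum hardware."""

    def __init__(self):
        self._jobs = {}

    def submit(self, circuit: str) -> str:
        job_id = f"job-{len(self._jobs)}"
        # A real backend would translate and execute; we return
        # canned counts for a Bell-state-style circuit.
        self._jobs[job_id] = {"00": 512, "11": 512}
        return job_id

    def result(self, job_id: str) -> dict:
        return self._jobs[job_id]

backend = SimulatedPlatform()
jid = backend.submit("H 0; CX 0 1; MEASURE")
print(backend.result(jid))  # {'00': 512, '11': 512}
```

The design choice mirrored here is the one the article describes: hardware-specific translation is confined to the adapter, so new machine types plug in without changes above the interface.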

Most of the software would operate on the classical side, and the quantum machine would act more as an accelerator than an equal partner to the supercomputer. An algorithm would prioritize data traffic, and various applications would monitor speeds and other vital signs. The unified resource management system developed by the team would aid in this coordination.

"To build the necessary software ecosystem for these two platforms, we need reliable measures of performance," Shehata said. "We don't want bottlenecks created by the applications. That means we'll want a scheduling algorithm that directs data traffic and makes sure we're getting the best results out of the quantum machine."

Quantum computing continues to develop rapidly, and its final form may be a long time coming - and might look nothing like current iterations. The team said they're prepared for that possibility too. The study proposes a flexible quantum programming interface that abstracts hardware-specific details, allowing future designs to be incorporated without fundamentally changing the programming model.

Support for this research came from the DOE Advanced Scientific Computing Research program and from ORNL's Laboratory Directed Research and Development program. The OLCF is a DOE Office of Science user facility at ORNL.

UT-Battelle manages ORNL for DOE's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE's Office of Science is working to address some of the most pressing challenges of our time. For more information, visit energy.gov/science. - Matt Lakin
