Toward Quantum Computers That Tolerate No Mistakes

What the research is about

Imagine ripples spreading across a calm lake after raindrops fall, and the way ripples from different drops overlap as they travel outward. This image helps us picture how a quantum computer handles information.

Unlike conventional computers, which process digital data as "0 or 1," quantum computers can process information in an in-between state where it is "both 0 and 1." These quantum states behave like waves: they can overlap, reinforcing one another or canceling one another out. In computations that exploit this property, states that lead to the correct answer are amplified, while states that lead to wrong answers are suppressed.
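
To make the wave picture a little more concrete, the sketch below (plain Python with NumPy, a generic illustration rather than anything from the study) tracks the two amplitudes of a single qubit: a Hadamard operation puts it into the "both 0 and 1" state, and a second Hadamard makes the paths leading to 1 cancel out while the paths leading to 0 reinforce.

```python
import numpy as np

# A qubit state is a pair of complex amplitudes (amp_0, amp_1);
# the squared magnitudes give the probabilities of reading 0 or 1.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate turns |0> into an equal mix of |0> and |1>,
# i.e. a "both 0 and 1" state.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0
print(np.abs(superposed) ** 2)   # [0.5 0.5]: both outcomes equally likely

# Applying Hadamard again makes the two paths to "1" cancel
# (destructive interference) and the paths to "0" reinforce.
back = H @ superposed
print(np.abs(back) ** 2)         # [1. 0.]: the unwanted outcome is suppressed
```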

Thanks to this interference between waves, a quantum computer can sift through many candidate answers at once. Our everyday computers take time because they evaluate each candidate one by one. Quantum computers, by contrast, can narrow down the answer in a single sweep, earning them the reputation of "dream machines" that could solve in an instant problems that might take hundreds of years on today's computers.

The key challenge is whether a quantum computer can maintain the "both 0 and 1" state while performing operations. This state is extremely delicate and can collapse under the slightest disturbance. Small changes in the environment, such as tiny temperature fluctuations, faint electromagnetic noise from nearby equipment, or minute vibrations, can easily disrupt the computation. That is why a foundational technology called quantum error correction is essential: it detects errors caused by noise and repairs them.
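
The article does not describe the team's code construction, so purely as a generic illustration, here is the textbook three-qubit bit-flip code simulated classically: parity checks (the "syndrome") reveal which bit was disturbed without looking at the encoded value directly, which is the basic pattern quantum error correction follows.

```python
import random

def syndrome(bits):
    """Parity checks comparing bit pairs (0,1) and (1,2).
    In a real device these are measured without reading the data itself."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

# Which single bit-flip each syndrome points to (None = no error).
LOOKUP = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(bits):
    flip = LOOKUP[syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

# Encode a logical 0 as 000, hit it with one random bit-flip, then repair it.
encoded = [0, 0, 0]
encoded[random.randrange(3)] ^= 1   # noise flips one bit
print(correct(encoded))             # always recovers [0, 0, 0]
```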

Over many years, researchers have developed numerous quantum error-correction methods, but every approach has hit an accuracy ceiling it could not break through. In theory, there is an ultimate bound on performance, known as the hashing bound, yet conventional methods could not come close to it. As a result, they could not deliver the extremely high accuracy needed to run large-scale quantum computers, creating a major barrier to practical applications.
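
For readers who want the hashing bound as a number: in the standard textbook case of a depolarizing channel, where each qubit suffers an X, Y, or Z error with total probability p, the bound says roughly 1 − H₂(p) − p·log₂3 logical qubits can be protected per physical qubit, where H₂ is the binary entropy. This is the general formula, not anything specific to the new study; a small sketch:

```python
import math

def hashing_bound_rate(p):
    """Hashing-bound rate for a depolarizing channel with error probability p:
    R = 1 - H2(p) - p*log2(3), where H2 is the binary entropy.
    A positive R means reliable encoding at that rate is possible in principle."""
    if p == 0:
        return 1.0
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h2 - p * math.log2(3)

for p in (0.01, 0.05, 0.10, 0.15):
    print(f"p = {p:.2f}  ->  achievable rate ≈ {hashing_bound_rate(p):.3f}")
```

The rate falls to zero near p ≈ 0.19, which is why codes that operate close to this bound can, in principle, tolerate substantial noise.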

Why this matters

A team led by Associate Professor Kenta Kasai at the Institute of Science Tokyo (Science Tokyo) has developed a new quantum error-correction method that comes extremely close to the theoretical limit, the hashing bound.

Conventional quantum computer designs have had a built-in flaw that lets errors creep in as information is processed. Because of this, some errors inevitably remained, no matter how favorable the conditions. The research team developed a new mechanism that eliminates this source of error, successfully pushing the computational accuracy of quantum computing to nearly the theoretical limit.

A major feature of both the new mechanism and the newly developed error-correction method is speed. Traditional approaches require heavy computation to fix errors, making them impractical for large-scale quantum computers. With the new method, however, the time needed for error-correction computation barely increases even as the number of components grows. By achieving both "ultimate accuracy" and "ultra-fast computational efficiency," the team has removed a major obstacle that has stood in the way of large-scale, practical quantum computers.

What's next

This achievement brings practical quantum computing much closer. Until now, large-scale quantum computation involving millions of qubits (quantum bits) was often seen as little more than a dream. This research brings that dream within reach.

If this technology is established, it will remove one of the biggest barriers to building large-scale quantum computers. A future in which quantum technology plays a vital role in areas that underpin society, such as drug discovery, more secure cryptographic communication, and climate prediction, will feel more realistic than ever.

Comment from the researcher

When working on error-correcting codes, I often come across small places where things do not behave quite as well as they could.

Noticing those details and adjusting the design step by step is how the work gradually moves forward.

Progress in research rarely comes from dramatic breakthroughs alone. More often, it grows out of careful observation, patience, and a willingness to reexamine assumptions-even those that are popular or widely accepted.

In that sense, curiosity is not about chasing what happens to attract attention. It begins with sensing that something is slightly off, and taking the time to understand why. From there, new ideas tend to emerge naturally.

(Kenta Kasai: Associate Professor, Department of Information and Communications Engineering, School of Engineering, Science Tokyo)


