One Step Closer to Reducing Quantum Computing’s Notoriously Troublesome Errors

While still far from enabling practical quantum computing, new research demonstrates an impressive reduction in error rates for a simple 2-qubit system


This dilution refrigerator houses the quantum computer behind IBM Q, a cloud-based platform that lets researchers and other companies experiment with the technology.
Photo: Graham Carlow/IBM Research

Most of the popular coverage of quantum computing gives the impression that this technology is poised for an imminent breakthrough, one that will revolutionize the world of computing. Of course, reality is far less dramatic.

Sure, sophisticated research into quantum computing is going on, as it has for decades. But more sober assessments, like the one the U.S. National Academies of Sciences, Engineering, and Medicine put out in December, suggest that the chances of quantum computers breaking Internet encryption, or performing similar quantum calculations of practical interest, anytime soon are slim.

Still, a development published last week helps boost the chance that quantum computing could one day be made useful. The work appeared in the prestigious journal Physical Review Letters. It demonstrated something that’s long been seen as key to making quantum computing work: reducing error rates through redundancy.

You see, the hardware used to encode quantum bits—qubits—in today’s prototype quantum computers is prone to all sorts of noise, which quickly corrupts the stored information. These prototypes have thus been called noisy intermediate-scale quantum computers, or NISQ computers. And so far, they serve no practical purpose other than as vehicles for research.

It’s been known for decades that specially designed error-correction schemes could, at least in theory, reduce error rates in quantum computations. But those schemes are based on certain assumptions about the character of the noise involved, and it’s not certain whether these assumptions hold for real NISQ machines. Researchers are just beginning to explore that question.

The two researchers behind that work, Robin Harper of the University of Sydney and Steven Flammia, who is affiliated with both the University of Sydney and Yale University, showed that a specific scheme can reduce error rates on a prototype quantum computer: one of IBM's machines, which researchers can experiment with remotely through the cloud-based IBM Quantum Experience, or IQX.

Details of how they accomplished that go way above my head, but let me sketch out as best I can the general strategy.

Error correction in quantum computing relies on redundancy, just as it does for conventional computing. Were there no errors, computation could be done using the raw physical qubits that the hardware offers. But in the face of noise, which is always present to some degree, you have to combine multiple physical qubits to make one logical qubit, which then becomes less prone to error.
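To get a rough feel for why redundancy helps, here is a minimal sketch, in Python, of the classical analogue: encode one logical bit as three noisy physical copies and decode by majority vote. This is only an illustration of the redundancy idea, not the quantum code Harper and Flammia used, and the 6 percent error rate is borrowed from the figure quoted later in this article.

```python
import random

def noisy_copy(bit, p):
    """Return the bit, flipped with probability p (a noisy physical copy)."""
    return bit ^ 1 if random.random() < p else bit

def logical_error_rate(p, trials=200_000):
    """Encode one logical bit as three physical copies; decode by majority vote."""
    errors = 0
    for _ in range(trials):
        logical = 0
        copies = [noisy_copy(logical, p) for _ in range(3)]
        decoded = 1 if sum(copies) >= 2 else 0
        if decoded != logical:
            errors += 1
    return errors / trials

if __name__ == "__main__":
    p = 0.06  # rough per-operation error rate quoted later in the article
    print(f"physical error rate:           {p:.3f}")
    print(f"logical error rate (majority): {logical_error_rate(p):.4f}")
    # The majority vote fails only when two or three of the copies flip:
    print(f"analytic expectation:          {3 * p**2 * (1 - p) + p**3:.4f}")
```

Quantum error correction cannot literally copy a qubit, because of the no-cloning theorem, so quantum codes instead spread a logical state across several entangled physical qubits. But the basic intuition, that redundancy plus a decoding rule can suppress errors, carries over.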

Harper and Flammia did that for a system that involved just two logical qubits. The error-correction scheme they used required four physical qubits. It improved error rates in two ways. One was to replace the largest source of noise in the physical circuit, a two-qubit operation called a controlled-NOT (or CNOT) gate, with a virtual CNOT gate that was more reliable. The other tactic was to detect when errors likely occurred so that those runs could be thrown out.
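That second tactic amounts to what's often called post-selection: run the circuit many times, check for tell-tale signs of an error, and keep only the runs that pass the check. Here is a toy Monte Carlo sketch, my own illustration rather than the circuit from the paper, of why that helps. It assumes independent bit-flip errors on four physical qubits at a 6 percent rate and a single parity check that flags any odd number of flips.

```python
import random

def simulate(p=0.06, n_qubits=4, trials=200_000):
    """Toy model: independent bit-flip errors on four physical qubits per run,
    with a single parity check that flags any odd number of flips."""
    bad = kept = kept_bad = 0
    for _ in range(trials):
        flips = [random.random() < p for _ in range(n_qubits)]
        corrupted = any(flips)          # the run's answer is wrong if anything flipped
        detected = sum(flips) % 2 == 1  # the parity check catches odd-weight errors
        bad += corrupted
        if not detected:                # post-selection: keep only runs that pass
            kept += 1
            kept_bad += corrupted
    print(f"error rate over all runs:   {bad / trials:.3f}")
    print(f"error rate among kept runs: {kept_bad / kept:.3f}")
    print(f"fraction of runs kept:      {kept / trials:.3f}")

if __name__ == "__main__":
    simulate()
```

The trade-off is throughput: the flagged runs are simply discarded, so the lower error rate is paid for with fewer usable results.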

These tactics reduced error rates from about 6 percent, which is what this machine offered when just two physical qubits were used for the computation, to about 0.6 percent. While such an order-of-magnitude improvement is impressive, the researchers emphasize that their experiment does not account for what are known as state-preparation and measurement errors. As a result, the computation as a whole still falls short of a key level, the fault-tolerance threshold, below which more complex computations become possible because errors can be corrected faster than they are created.
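To see why crossing that threshold matters, here's a back-of-the-envelope sketch, again my own illustration rather than anything from the paper. In a concatenated code, the logical error rate after one level of encoding scales roughly like c·p², for some constant c set by the code and circuitry, so adding levels keeps helping only if the physical error rate p sits below about 1/c. The value c = 100 below is an arbitrary choice for illustration.

```python
def concatenate(p, c=100.0, levels=4):
    """Heuristic threshold picture: each level of concatenation maps p -> c * p**2,
    so errors shrink level by level only when p is below the threshold 1 / c."""
    rates = [p]
    for _ in range(levels):
        rates.append(min(1.0, c * rates[-1] ** 2))  # clamp: it's a probability
    return rates

if __name__ == "__main__":
    print("below threshold (p = 0.006):", ["%.1e" % r for r in concatenate(0.006)])
    print("above threshold (p = 0.06): ", ["%.1e" % r for r in concatenate(0.06)])
```

Below the threshold, each added level of encoding buys a rapid drop in the error rate; above it, extra redundancy just adds more opportunities for error.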

When will the day come when the fault-tolerance threshold is breached in real computations in real quantum-computing machines? That’s unclear. If it happens, though, it’ll probably be demonstrated first in a very small system of just a few logical qubits, as was done in this recently reported advance.
