Novel Error Correction Code Opens a New Approach to Universal Quantum Computing

A British scientist working in Australia has found a way to apply a three-dimensional code to a two-dimensional framework for quantum error correction

Illustration: Shutterstock

Government agencies and universities around the world—not to mention tech giants like IBM, Google, and Microsoft—are vying to be the first to answer a trillion-dollar quantum question: How can quantum computers reach their vast potential when they are still unable to consistently produce results that are reliable and free of errors? 

Every aspect of these exotic machines—their fragility and engineering complexity; their preposterously sterile, low-temperature operating environments; their complicated mathematics; and their notoriously shy quantum bits (qubits), which flip if an operator so much as winks at them—is a potential source of error. It says much for the ingenuity of scientists and engineers that they have found ways to detect and correct these errors and have quantum computers working to the extent that they do—at least long enough to produce limited results before errors accumulate and quantum decoherence of the qubits kicks in.

“My approach to suppressing errors could free up a lot of the hardware from error correction and will allow the computer to get on with doing useful stuff.”

When it comes to correcting errors arising during quantum operations, an error-correction method known as the surface code has drawn a lot of research attention. That’s because of its robustness and the fact that it’s well suited to being set out on a two-dimensional plane (which makes it amenable to being laid down on a chip). The surface code uses the phenomenon known as entanglement (quantum connectivity) to enable single qubits to share information with other qubits on a lattice layout. The benefit: When qubits are measured, they reveal errors in neighboring qubits.
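The idea that measuring some qubits reveals errors in their neighbors can be sketched with a toy classical analogue. The real surface code uses entangled ancilla qubits to measure multi-qubit stabilizers; the hypothetical simplification below just computes parities of neighboring bits, but it shows the key trick: two overlapping checks firing at once pinpoint the shared bit that flipped, without ever reading the data directly.

```python
def syndrome(data):
    """Parity of each overlapping neighbor pair (the 'checks')."""
    return [data[i] ^ data[i + 1] for i in range(len(data) - 1)]

data = [0, 0, 0, 0]                  # error-free data register
assert syndrome(data) == [0, 0, 0]   # all checks satisfied

data[1] ^= 1                         # a single bit-flip error on bit 1
assert syndrome(data) == [1, 1, 0]   # checks 0 and 1 both fire; the error
                                     # sits on their shared bit, so it can
                                     # be located and corrected
```

In the quantum version the checks are measurements of entangled ancillas on the lattice, but the logic of decoding the pattern of fired checks is the same.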

For a quantum computer to tackle complicated tasks, error-correction codes need to be able to perform quantum gate operations; these are small logic operations carried out on qubit information that, when combined, can run algorithms. Classical computing analogues would be AND gates, XOR gates, and the like. 
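The XOR analogy is exact on the computational basis states: the two-qubit CNOT gate maps |a, b⟩ to |a, a XOR b⟩. A minimal sketch with NumPy (an illustrative check, not part of Brown's scheme):

```python
import numpy as np

# CNOT as a 4x4 matrix acting on two-qubit basis states |a, b>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def basis(a, b):
    """Column vector for the basis state |a, b>."""
    v = np.zeros(4)
    v[2 * a + b] = 1
    return v

# On basis states, CNOT behaves exactly like a classical XOR:
for a in (0, 1):
    for b in (0, 1):
        assert np.array_equal(CNOT @ basis(a, b), basis(a, a ^ b))
```

Unlike a classical XOR gate, of course, CNOT also acts on superpositions of these states, which is where its computational power comes from.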

Physicists divide the quantum gate operations needed for universal computing into two classes, distinguished by their different mathematical properties: Clifford gates and non-Clifford gates. The Clifford gate set alone is not universal; it must work in combination with magic-state distillation—a purification protocol that consumes multiple noisy quantum states to perform non-Clifford gate operations.
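The standard mathematical test for the distinction is conjugation: a Clifford gate maps Pauli operators to Pauli operators, while a non-Clifford gate such as the T gate does not. A short NumPy check (a textbook illustration, not taken from Brown's paper):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
S = np.diag([1, 1j])                       # a Clifford gate (phase gate)
T = np.diag([1, np.exp(1j * np.pi / 4)])   # the non-Clifford T gate

def conj(U, P):
    """Conjugate Pauli P by gate U."""
    return U @ P @ U.conj().T

# Clifford: S maps the Pauli X to the Pauli Y.
assert np.allclose(conj(S, X), Y)

# Non-Clifford: T maps X to (X + Y)/sqrt(2), which is not any Pauli
# operator, even up to a phase.
TXTd = conj(T, X)
paulis = (X, Y, Z)
phases = (1, -1, 1j, -1j)
assert not any(np.allclose(TXTd, c * P) for P in paulis for c in phases)
```

It is this escape from the Pauli group that makes gates like T hard to protect with codes such as the surface code, and why they are usually implemented via distilled magic states instead.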

“Without magic-state distillation or its equivalent, quantum computers are like electronic calculators without the division button; they have limited functionality,” says Benjamin Brown, a researcher at the ARC Centre of Excellence for Engineered Quantum Systems (EQUS) at the University of Sydney’s School of Physics. “However, the combination of Clifford and non-Clifford gates can be prohibitive because it eats up so much of a quantum computer’s resources that there’s little left to deal with the problem at hand.”

To overcome this problem, Brown has developed a new type of non-Clifford-gate error-correcting method that removes the need for overhead-heavy distillation. A paper he published on this development appeared in Science Advances on 22 May. 

“Given it is understood to be impossible to use two-dimensional code like the surface code to do the work of a non-Clifford gate, I have used a three-dimensional code and applied it to the physical two-dimensional surface code scheme using time as the third dimension,” explains Brown. “This has opened up possibilities we didn’t have before.”

This diagram illustrates the staged progression of one surface code being slid underneath the other two surface codes over time. The three codes interact during each step as the bottom code is passed under the other two to produce the two-dimensional gate. Image: University of Sydney

The non-Clifford gate uses three overlapping copies of the surface code that interact locally over a period of time. This is carried out by taking thin slices of the 3D code and collapsing them down into a 2D space. The process is repeated on the fly with the help of just-in-time gauge fixing, a procedure for stacking the two-dimensional slices onto a chip while dealing with errors as they occur. Over time, the three surface codes replicate the three-dimensional code that can perform the non-Clifford gate operation.

“I’ve shown this to work theoretically, mathematically,” says Brown. “The next step is to simulate the code and see how well it works in practice.”

Michael Beverland, a senior researcher at Microsoft Quantum, commented on the research: “Brown’s paper explores an exciting, exotic approach to perform fault-tolerant quantum computation. It points the way towards potentially achieving universal quantum computation in two spatial dimensions without the need for distillation—something many researchers thought was impossible.”

Brown notes that reducing errors in quantum computing is one of the biggest challenges facing scientists before machines capable of solving useful problems can be built. “My approach to suppressing errors could free up a lot of the hardware from error correction and will allow the computer to get on with doing useful stuff.”

This post was updated on 3 June 2020. 
