
Moving Chips Closer to Cold Qubits

Conventional electronics join quantum circuits in the deep freeze

[Image: two people holding electronics. Photo: Microsoft]

Three of the biggest companies making quantum computers today—Google, Intel, and Microsoft—are betting on supercooled devices operating at close to absolute zero. But there’s a problem: These cold cathedrals of quantum computing cannot tolerate the extra heat given off by the conventional computing chips that control them.

This means the classical and quantum-computing components must be separated, despite their marriage by design. The control chips usually reside at room temperature on top of the quantum-computing stack, while the quantum bits (qubits) remain in the coldest depths of dilution refrigerators. The dilution fridges use a mixture of helium-3 and helium-4 isotopes to supercool the environment, lowering temperatures from a baseline of 4 kelvins (–269.15 ºC) at the top to about 10 millikelvins at the bottom.

Cables running up and down the hardware stack connect each qubit with its control chip and other conventional computing components higher up. Such unwieldy setups, with just dozens of qubits, would become an "engineering nightmare" if scaled up to the number of qubits necessary for practical quantum computing, says Fabio Sebastiano, research lead for the quantum-computing division at QuTech, in Delft, Netherlands. He compared the approach to trying to connect each of the 10 million pixels in a smartphone camera to its readout electronics using 1-meter cables.

That is why these three tech giants have been developing either qubits that operate at warmer temperatures or control chips that operate at colder temperatures—while minimizing heat from power dissipation. The companies hope to shrink the operating temperature difference and possibly unite classical and quantum-computing components in the same integrated chips or packages.

This article appears in the May 2021 print issue as “Fast, Cold, and Under Control.”


Quantum Error Correction: Time to Make It Work

If technologists can’t perfect it, quantum computers will never be big

[Illustration: Chad Hagen]

Dates chiseled into an ancient tombstone have more in common with the data in your phone or laptop than you may realize. They both involve conventional, classical information, carried by hardware that is relatively immune to errors. The situation inside a quantum computer is far different: The information itself has its own idiosyncratic properties, and compared with standard digital microelectronics, state-of-the-art quantum-computer hardware is more than a billion trillion times as likely to suffer a fault. This tremendous susceptibility to errors is the single biggest problem holding back quantum computing from realizing its great promise.

Fortunately, an approach known as quantum error correction (QEC) can remedy this problem, at least in principle. A mature body of theory built up over the past quarter century now provides a solid theoretical foundation, and experimentalists have demonstrated dozens of proof-of-principle examples of QEC. But these experiments still have not reached the level of quality and sophistication needed to reduce the overall error rate in a system.
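The core idea behind error correction, quantum or classical, is redundancy: spread one logical bit of information across several physical carriers so that a fault on any single carrier can be outvoted. The sketch below is a toy *classical* repetition code, not the quantum scheme itself (QEC must protect quantum states without directly measuring them, which this example does not attempt); the error probability `p = 0.05` and trial count are illustrative values chosen here, not figures from the article.

```python
import random

def noisy_copy(bit, p):
    """Flip a bit with probability p, modeling a faulty physical carrier."""
    return bit ^ (random.random() < p)

def send_with_repetition(bit, p):
    """Encode one logical bit as three physical copies, corrupt each
    independently, then decode by majority vote - the classical
    ancestor of quantum error correction."""
    copies = [noisy_copy(bit, p) for _ in range(3)]
    return int(sum(copies) >= 2)

random.seed(0)
p = 0.05          # per-copy fault probability (illustrative)
trials = 100_000

raw_errors = sum(noisy_copy(0, p) for _ in range(trials))
coded_errors = sum(send_with_repetition(0, p) for _ in range(trials))

print(raw_errors / trials)    # close to p, i.e. about 0.05
print(coded_errors / trials)  # close to 3p^2 - 2p^3, i.e. below 0.01
```

Majority voting fails only when two or more of the three copies are corrupted, so the logical error rate drops from roughly p to roughly 3p² — but only if p is already small enough, which is exactly why the quality threshold mentioned above matters.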
