
How Much Power Will Quantum Computing Need?

The new 1,000-qubit machine installed at Google's Quantum AI Lab spends most of its power on keeping cool

3 min read
Photo: D-Wave Systems

Google’s Quantum AI Lab has installed the latest generation of what D-Wave Systems describes as the world’s first commercial quantum computers. Quantum computing could potentially solve certain problems much faster than today’s classical computers while using comparatively little power to perform the calculations. Yet the energy efficiency of quantum computing remains a mystery.

For now, D-Wave’s machines can scale up the number of quantum bits (qubits) they use without significantly increasing their power requirements. That’s because D-Wave’s quantum computing hardware relies on a specialized design consisting of metal niobium loops that act as superconductors when chilled to a frigid 15 millikelvin (−273 °C). Much of the D-Wave hardware’s power consumption—slightly less than 25 kilowatts for the latest machine—goes toward running the refrigeration unit that keeps the quantum processor cool. The quantum processor itself requires a comparative pittance.

“The operation of the quantum processor itself requires remarkably little power—only a tiny fraction of a microwatt—which is essentially negligible in comparison to the power needs of the refrigerator and servers,” says Colin Williams, director of business development & strategic partnerships at D-Wave Systems.

The new 1,000-qubit D-Wave 2X machine installed at Google’s lab has about double the qubits of its predecessor, the D-Wave Two machine. But the minimal amount of power used by the quantum processor means that “the total system power will still remain more or less constant for many generations to come” even as the quantum processor scales up to thousands of qubits, Williams says.

D-Wave can currently get away with this because the same “cryostat” unit that uses so many kilowatts of power would still be sufficient to cool much larger quantum processors than the ones currently in use. 

“It would be similar if you attached a large cooling device to your PC that uses many kilowatts of power—you would barely see an increase in power consumption when going to larger systems, since the power is dominated by the large cooling infrastructure,” says Matthias Troyer, a computational physicist at ETH Zurich.

The ability to scale up a D-Wave machine’s computing capabilities without increasing its power consumption may sound promising. But it actually doesn’t say much about the power efficiency of quantum computing compared with classical computing. Today’s D-Wave machines perform about as well as a high-end PC on certain specific tasks, but they use far more power because of their extreme cooling requirements. (High-end computing cores require just tens of watts of power.)
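A rough back-of-envelope calculation makes the gap concrete. The ~25-kilowatt figure comes from the article; the 50-watt classical core is an illustrative value chosen from the “tens of watts” range mentioned above:

```python
# Back-of-envelope comparison of raw power draw (illustrative numbers).
dwave_power_w = 25_000   # ~25 kW, mostly the cryostat (figure from the article)
pc_core_power_w = 50     # a high-end classical core: tens of watts (assumed value)

ratio = dwave_power_w / pc_core_power_w
print(f"The D-Wave system draws roughly {ratio:.0f}x the power of one classical core")
```

Under these assumptions, the quantum machine draws on the order of 500 times the power of a single classical core performing comparably on the same tasks.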

“While the ‘flat power requirement’ is a good statement to make for marketing, it is unclear at the moment what the true power needs are once the device is optimized and scaled up,” Troyer says. “Right now they need orders of magnitude more power than competing classical technology.”

However, it’s not exactly a fair comparison, Troyer says. “On the power side, they are currently losing,” he says. But the D-Wave machine “is not engineered to be power saving. It may pay off again at some point.”

Scott Aaronson, a theoretical computer scientist at MIT and a D-Wave critic, seemed bemused by the idea of D-Wave having a power advantage of any sort. Referring to D-Wave’s reliance on a cryogenic cooler, he wrote in an email: “It’s amusing chutzpah to take such a gigantic difficulty and then present it as a feature.” He pointed out that D-Wave might need an even more power-hungry cooling system to create lower temperatures that improve its quantum processors’ chances of a “speedup” advantage over classical computing in the future.

D-Wave’s quantum annealing machines represent just one possible computer architecture for quantum computing. They’re designed to solve a specialized set of “optimization problems” rather than act as universal logic-gate quantum computers. (The latter would be super-fast versions of today’s classical “gate-model” computers.) Google’s Quantum AI Lab has invested in both D-Wave’s machines and in exploring development of universal logic-gate quantum computers.

In the end, Troyer expects that power requirements for quantum computing will probably be “linearly proportional” to the number of qubits and their couplings, as well as proportional to the number of times operators must run and recool the system before it finds the solution.
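Troyer’s expectation can be sketched as a toy power model: a fixed refrigeration overhead plus terms that grow linearly with qubit count and with the anneal-and-recool repetition rate. Every coefficient below is an invented placeholder, chosen only to show why the cryostat dominates at today’s scales:

```python
# Toy power model in the spirit of Troyer's expectation. All coefficients
# here are assumptions for illustration, not measured values.
def total_power_w(n_qubits, repetitions_per_s,
                  base_cryo_w=25_000,   # fixed refrigeration overhead (from the article)
                  per_qubit_w=1e-9,     # control/readout cost per qubit (assumed)
                  per_run_j=1e-6):      # energy per anneal-and-recool cycle (assumed)
    return (base_cryo_w
            + n_qubits * per_qubit_w
            + repetitions_per_s * per_run_j)

# With these assumptions the cryostat dominates even as qubit counts grow:
print(total_power_w(1_000, 1_000))      # ~25 kW
print(total_power_w(1_000_000, 1_000))  # still ~25 kW
```

The linear terms only matter once qubit counts or repetition rates become enormous relative to the fixed cooling cost, which is why the total system power stays roughly flat for now.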

Quantum computing’s big advantages probably won’t begin to emerge until engineers build machines with many thousands or possibly millions of qubits. That’s still a ways off even for D-Wave, which has chosen to scale up the number of qubits in its processors fairly quickly. Most quantum computing researchers have opted for a much slower approach of building quantum computing devices with just several qubits or tens of qubits, because of major challenges in correcting for qubit errors and maintaining coherence across the system.

Still, both D-Wave and independent quantum computing labs share the same general goal of building machines that can exploit the “spooky” physics of quantum mechanics. Quantum computers could potentially perform far more simultaneous calculations than classical machines. If quantum computers can beat classical computers in terms of “time to solution,” they could also prove more power-efficient at the end of the day.

“If a quantum device can solve a problem with much better [time to solution] scaling than classical computing, it would also win on power,” Troyer says.
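The logic behind Troyer's point is that energy to solution equals power draw times time to solution. A sketch with purely illustrative scaling exponents and the power figures used earlier in the article shows how a power-hungry machine can still win on energy:

```python
# Energy to solution = power x time to solution. A device that draws far
# more power can still win on energy if its time-to-solution scales much
# better with problem size. The exponents below are assumptions, not
# measured scalings for any real device.
def energy_to_solution_j(power_w, time_to_solution_s):
    return power_w * time_to_solution_s

n = 60  # problem size (arbitrary illustrative value)
classical_j = energy_to_solution_j(power_w=50,      # one classical core
                                   time_to_solution_s=2 ** (n / 2))
quantum_j = energy_to_solution_j(power_w=25_000,    # cryostat-dominated system
                                 time_to_solution_s=2 ** (n / 4))

print(classical_j > quantum_j)  # True under these assumed scalings
```

Despite drawing 500 times the power, the hypothetical quantum device uses less total energy here because its (assumed) time-to-solution advantage grows exponentially with problem size.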

