Quantum Computing’s Hard, Cold Reality Check

Hype is everywhere, skeptics say, and practical applications are still far away


The quantum computer revolution may be further off and more limited than many have been led to believe. That’s the message coming from a small but vocal set of prominent skeptics in and around the emerging quantum computing industry.

Quantum computers have been touted as a solution to a wide range of problems, including financial modeling, optimizing logistics, and accelerating machine learning. Some of the more ambitious timelines proposed by quantum computing companies have suggested these machines could be tackling real-world problems within just a few years. But there's growing pushback against what many see as unrealistic expectations for the technology.

Meta’s LeCun—Not so fast, qubit

Meta's head of AI research, Yann LeCun, recently made headlines after pouring cold water on the prospect of quantum computers making a meaningful contribution in the near future. Speaking at a media event celebrating the 10-year anniversary of Meta's Fundamental AI Research team, he said the technology is "a fascinating scientific topic," but that he was less convinced of "the possibility of actually fabricating quantum computers that are actually useful."

While LeCun is not an expert in quantum computing, leading figures in the field are also sounding a note of caution. Oskar Painter, head of quantum hardware for Amazon Web Services, says there is a "tremendous amount of hype" in the industry at the moment and "it can be difficult to filter the optimistic from the completely unrealistic."

A fundamental challenge for today’s quantum computers is that they are very prone to errors. Some have suggested that these so-called “noisy intermediate-scale quantum” (NISQ) processors could still be put to useful work. But Painter says there’s growing recognition that this is unlikely and quantum error-correction schemes will be key to achieving practical quantum computers.

“We found out over the last 10 years that many things that people have proposed don’t work. And then we found some very simple reasons for that.”
—Matthias Troyer, Microsoft

The leading proposal involves spreading information over many physical qubits to create “logical qubits” that are more robust, but this could require as many as 1,000 physical qubits for each logical one. Some have suggested that quantum error correction could even be fundamentally impossible, though that is not a mainstream view. Either way, realizing these schemes at the scale and speeds required remains a distant goal, Painter says.
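To get a feel for the scale this implies, here is a back-of-envelope sketch. The 1,000-to-1 ratio is the upper-end figure cited above; the 10,000-logical-qubit machine size matches the hypothetical fault-tolerant computer considered later in this article:

```python
# Back-of-envelope: physical-qubit overhead of quantum error correction.
physical_per_logical = 1_000   # upper-end estimate cited in the article
logical_qubits = 10_000        # scale of the hypothetical machine below

physical_qubits = physical_per_logical * logical_qubits
print(f"{physical_qubits:,} physical qubits")  # 10,000,000 physical qubits
```

Ten million high-quality physical qubits is several orders of magnitude beyond any processor built to date, which is why Painter's timeline below stretches to a decade or more.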

“Given the remaining technical challenges in realizing a fault-tolerant quantum computer capable of running billions of gates over thousands of qubits, it is difficult to put a timeline on it, but I would estimate at least a decade out,” he says.

Microsoft—Clarity, please

The problem isn't just one of timescales. In May, Matthias Troyer, a technical fellow at Microsoft who leads the company's quantum computing efforts, co-authored a paper in Communications of the ACM suggesting that the number of applications where quantum computers could provide a meaningful advantage was more limited than some would have you believe.

“We found out over the last 10 years that many things that people have proposed don’t work,” he says. “And then we found some very simple reasons for that.”

The main promise of quantum computing is the ability to solve problems far faster than classical computers, but exactly how much faster varies. There are two applications where quantum algorithms appear to provide an exponential speedup, says Troyer. One is factoring large numbers, which could make it possible to break the public key encryption the internet is built on. The other is simulating quantum systems, which could have applications in chemistry and materials science.

Quantum algorithms have been proposed for a range of other problems, including optimization, drug design, and fluid dynamics. But touted speedups don't always pan out: sometimes the gain is merely quadratic, meaning the number of steps the quantum algorithm needs to solve a problem scales as the square root of the number its classical counterpart needs.

Troyer says these gains can quickly be wiped out by the massive computational overhead incurred by quantum computers. Operating a qubit is far more complicated than switching a transistor and is therefore orders of magnitude slower. This means that for smaller problems, a classical computer will always be faster, and the point at which the quantum computer gains a lead depends on how quickly the complexity of the classical algorithm scales.
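Troyer's argument can be sketched with a toy model. The per-operation times below are illustrative assumptions, not figures from his paper: with a quadratic speedup, the quantum machine only pulls ahead once the classical workload exceeds the square of the per-operation slowdown ratio.

```python
# Toy crossover model for a quadratic quantum speedup.
# Assumed (illustrative) per-operation times, not figures from the paper:
t_classical = 1e-9   # seconds per classical op (~1 GHz transistor logic)
t_quantum = 1e-3     # seconds per logical quantum op (hypothetical)

# Classical: N ops take N * t_classical.
# Quantum (quadratic speedup): sqrt(N) ops take sqrt(N) * t_quantum.
# Quantum wins when sqrt(N) * t_quantum < N * t_classical,
# i.e. when N > (t_quantum / t_classical) ** 2.
crossover_N = (t_quantum / t_classical) ** 2   # (1e6)**2 = 1e12 ops
classical_time = crossover_N * t_classical     # runtime at the crossover

print(f"crossover at N ≈ {crossover_N:.0e} classical ops")
print(f"classical runtime at crossover ≈ {classical_time:.0f} s")
```

Under these assumed numbers, the quantum computer only wins on problems that already keep a classical chip busy for more than about 1,000 seconds, and any additional overhead from error correction pushes that crossover further out still.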

Operating a qubit is far more complicated than switching a transistor and is therefore orders of magnitude slower.

Troyer and his colleagues compared a single Nvidia A100 GPU against a fictional future fault-tolerant quantum computer with 10,000 "logical qubits" and gate times much faster than today's devices. Troyer says they found that a quantum algorithm with a quadratic speedup would have to run for centuries, or even millennia, before it could outperform a classical one on problems big enough to be useful.

Another significant barrier is data bandwidth. Qubits’ slow operating speeds fundamentally limit the rate at which you can get classical data in and out of a quantum computer. Even in optimistic future scenarios this is likely to be thousands or millions of times slower than classical computers, says Troyer. That means data-intensive applications like machine learning or searching databases are almost certainly out of reach for the foreseeable future.

The conclusion, says Troyer, was that quantum computers will only really shine on small-data problems with exponential speedups. "All the rest is beautiful theory, but will not be practical," he adds.

The paper didn't make much of an impact in the quantum community, says Troyer, but many of Microsoft's customers were grateful to get some clarity on realistic applications for quantum computing. He says they've seen a number of companies downsize or even shut down their quantum computing teams, particularly in the finance sector.

Aaronson—Welcome, skeptics

These limitations shouldn’t really be a surprise to anyone who has been paying close attention to quantum computing research, says Scott Aaronson, a professor of computer science at the University of Texas at Austin. “There are these claims about how quantum computing will revolutionize machine learning and optimization and finance and all these industries, where I think skepticism was always warranted,” he says. “If people are just now coming around to that, well then, welcome.”

While he also thinks practical applications are still a long way off, recent progress in the field has actually given him cause for optimism. Earlier this month, researchers from quantum computing startup QuEra and Harvard demonstrated that they could use a 280-qubit processor to generate 48 logical qubits, far more than previous experiments have managed. "This was definitely the biggest experimental advance maybe for several years," says Aaronson.

“When you say quantum is going to solve all the world’s problems, and then it doesn’t, or it doesn’t right now, that creates a little bit of a letdown.”
—Yuval Boger, QuEra

Yuval Boger, chief marketing officer at QuEra, is keen to stress that the experiment was a lab demonstration, but he thinks the results have caused some to reassess their timescales for fault-tolerant quantum computing. At the same time, though, he says they have also noticed a trend of companies quietly shifting resources away from quantum computing.

This has been driven, in part, by growing interest in AI since the advent of large language models, he says. But he agrees that some in the industry have exaggerated the near-term potential of the technology, and says the hype has been a double-edged sword. “It helps get investments and get talented people excited to get into the field,” he says. “But on the other hand, when you say quantum is going to solve all the world’s problems, and then it doesn’t, or it doesn’t right now, that creates a little bit of a letdown.”

Even in the areas where quantum computers look most promising, the applications could be narrower than initially hoped. In recent years, papers from researchers at scientific software company Schrödinger and a multi-institutional team have suggested that only a limited number of problems in quantum chemistry are likely to benefit from quantum speedups.

Merck KGaA—Lovely accelerator, sometimes

It’s also important to remember that many companies already have mature and productive quantum chemistry workflows that operate on classical hardware, says Philipp Harbach, global head of group digital innovation at German pharma giant Merck KGaA, in Darmstadt, Germany (not to be confused with the American company Merck).

“In the public, the quantum computer was portrayed as if it would enable something not currently achievable, which is inaccurate,” he says. “Primarily, it will accelerate existing processes rather than introducing a completely disruptive new application area. So we are evaluating a difference here.”

Harbach's group has been investigating the relevance of quantum computing to Merck's work for about six years. While NISQ devices may have uses for certain highly specialized problems, they've concluded that quantum computing will not have a significant impact on industry until fault tolerance is achieved. Even then, how transformative that impact could be really depends on the specific use case and products a company is working on, says Harbach.

Quantum computers shine at providing accurate solutions to problems that become intractable for classical computers at larger scales. That could be very useful for some applications, such as designing new catalysts, says Harbach. But most of the chemistry problems Merck is interested in involve screening large numbers of candidate molecules very quickly.

“Most problems in quantum chemistry do not scale exponentially, and approximations are sufficient,” he says. “They are well behaved problems, you just need to make them faster with increased system size.”

Nonetheless, there can still be cause for optimism, says Microsoft’s Troyer. Even if quantum computers can only tackle a limited palette of problems in areas like chemistry and materials science, the impact could still be game-changing. “We talk about the Stone Age and the Bronze Age, and the Iron Age, and the Silicon Age, so materials have a huge impact on mankind,” he says.

The goal of airing some skepticism, Troyer says, is not to diminish interest in the field, but to ensure that researchers are focused on the most promising applications of quantum computing with the greatest chance of impact.


UPDATE: 28 Dec. 2023: This story was updated to clarify the fact that Matthias Troyer, quoted above, has seen downsizing in the financial sector but not the life sciences sector—both of which Spectrum had originally, erroneously reported.
