Google’s New Quantum Algorithm May Actually Be Useful

The quantum echoes algorithm shows promise for NMR

Charles Q. Choi is a contributing editor for IEEE Spectrum.

Close-up of a quantum computer's high-density interconnects.

Google Quantum AI’s Willow chip is a superconducting quantum computing chip with 105 qubits.

Google Quantum AI

Critics of quantum computers have argued that the supposed advantage these machines have over regular computers has often relied on tests involving pointless tasks. Now, the team at Google Quantum AI has developed a new algorithm that the company says may let quantum computers outperform classical supercomputers on a task that could help discover better drugs, polymers, catalysts, and battery components.

The researchers caution that their early experiments on this task have not yet demonstrated a “quantum advantage” over classical computers, as this preliminary work examined only small molecules that are relatively easy to analyze. However, they say these results do raise hopes that further progress can realize practical benefits.

The Quantum Advantage Controversy

The computational power of a quantum computer can grow exponentially with the number of qubits, the components it links together. In 2019, Google’s 54-qubit Sycamore quantum computer performed a calculation in 200 seconds that the company estimated would take Summit, the world’s most powerful supercomputer at that time, 10,000 years. Similarly, in 2024, Google’s 105-qubit Willow chip performed a benchmark computation in under five minutes that would take Frontier, the fastest supercomputer then, 10 septillion (or 10²⁵) years.
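To get a feel for why that exponential scaling overwhelms classical machines, consider a back-of-the-envelope calculation (ours, not Google’s): simulating an n-qubit computer naively means storing 2^n complex amplitudes. A short Python sketch makes the point:

    # Back-of-the-envelope: memory to store a full n-qubit state vector.
    # An n-qubit state has 2**n complex amplitudes; at 16 bytes per
    # complex128 amplitude, the requirement doubles with every qubit.

    def state_vector_bytes(n_qubits: int) -> int:
        """Bytes needed to hold 2**n complex128 amplitudes."""
        return (2 ** n_qubits) * 16

    for n in (30, 54, 105):
        print(f"{n:3d} qubits -> {state_vector_bytes(n) / 2**30:.3e} GiB")

    # 30 qubits fit in a laptop's RAM (16 GiB); Sycamore's 54 qubits
    # already need about 256 PiB; Willow's 105 qubits are beyond any
    # conceivable classical memory.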

One problem with the Google team’s claims of a quantum advantage over regular computers has been their use of a benchmark called random circuit sampling. Researchers at Google first had their quantum computers generate sets of results from a series of randomly chosen quantum operations. They then statistically analyzed how well these results matched what one would expect from perfect quantum computers.
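For the curious, the statistical check Google used for random circuit sampling is known as linear cross-entropy benchmarking. Below is a minimal Python sketch of the idea (our illustration; the function and variable names are ours, and computing the ideal distribution is itself only tractable classically for modest qubit counts):

    # Sketch of linear cross-entropy benchmarking (XEB): score device
    # samples by how often they land on bitstrings the ideal circuit
    # makes likely. fidelity ~ 2**n * mean(p_ideal(sample)) - 1.

    import numpy as np

    def xeb_fidelity(n_qubits: int,
                     ideal_probs: np.ndarray,
                     samples: np.ndarray) -> float:
        """ideal_probs: length-2**n ideal output distribution.
        samples: sampled bitstring indices from the device."""
        return (2 ** n_qubits) * ideal_probs[samples].mean() - 1.0

    # A perfect device scores near 1; uniform noise scores near 0.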

Google’s quantum computers performed better than a conventional supercomputer running classical versions of this task. However, random circuit sampling has no known real-world applications, raising criticism over whether it was a useful standard. In addition, the random nature of the task meant researchers could not confirm whether different quantum computers would generate the same results, casting further doubt on the benchmark.

Now Google scientists have developed a quantum algorithm they have nicknamed “quantum echoes.” When the team executed the algorithm on 65 of Willow’s qubits, it performed roughly 13,000 times as fast as the algorithm’s best classical counterpart running on Frontier. Moreover, they say it’s the first quantum algorithm to show a verifiable quantum advantage on quantum computers—two quantum processors running the new algorithm in parallel can get the same results.

“The key aspect of verification is that it can lead to applications,” says Thomas O’Brien, a staff research scientist at Google Quantum AI. “If I can’t prove to you that the data is correct, how can I do anything with it?”

Close-up of Google Quantum AI's Willow chip.

Scientists at Google Quantum AI have used their Willow quantum-computing chip to run the quantum echoes algorithm.

Google Quantum AI

The Quantum Echoes Algorithm

When the new algorithm runs, it first performs a series of operations on a quantum computer—for instance, simulating a molecule to predict its behavior. The algorithm next perturbs one of the qubits used to carry out those operations. Then, it runs this series of operations in reverse. Finally, it compares the results of both the forward and backward operations.
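In spirit, the procedure resembles what physicists call a Loschmidt echo. Here is a toy state-vector sketch in Python (our illustration, with a small random unitary standing in for the actual simulation circuit; this is not Google’s implementation) of the forward-perturb-reverse comparison:

    # Toy echo: run a circuit U forward, flip ("perturb") one qubit,
    # run U in reverse (its conjugate transpose), then compare the
    # final state with the starting one.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 3                                   # toy system: 3 qubits
    dim = 2 ** n

    # Random unitary standing in for the forward operations.
    U, _ = np.linalg.qr(rng.normal(size=(dim, dim))
                        + 1j * rng.normal(size=(dim, dim)))

    # Pauli-X "butterfly" perturbation on the first qubit.
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    I = np.eye(2, dtype=complex)
    B = np.kron(np.kron(X, I), I)

    psi0 = np.zeros(dim, dtype=complex)
    psi0[0] = 1.0                           # start in |000>

    # Forward, perturb, reverse; then compare with the start.
    echo = np.vdot(psi0, U.conj().T @ B @ U @ psi0)
    print(abs(echo) ** 2)   # how far below 1 this falls measures
                            # how widely the perturbation spread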

How is this approach helpful? One problem that conventional supercomputers face when simulating a molecule is modeling the interactions between every part of a compound, a problem that grows increasingly difficult the larger a molecule gets. When this new algorithm applies a slight perturbation to a qubit, this disturbance generates a noticeable impact even on faraway qubits, reminiscent of the so-called butterfly effect, O’Brien notes. By comparing the forward and backward series of operations, the new algorithm can see the effects of this perturbation throughout the molecule and so model the molecule as a whole.

Nobel laureate Michel Devoret, Google Quantum AI’s chief scientist of quantum hardware, notes that Willow’s large number of qubits and its low error rate of about 0.1 percent were key to successfully performing this algorithm. For instance, demonstrating quantum advantage using random circuit sampling in 2019 required that only 0.1 percent of the data gathered be correct, whereas doing so with the new algorithm demanded that no more than 0.1 percent of the data be wrong, O’Brien says.
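A rough compounding calculation (ours, with assumed operation counts) shows why a 0.1 percent error rate is so demanding: errors multiply across every operation in a circuit, so 99.9 percent fidelity per operation decays quickly with depth.

    # Rough arithmetic (assumed operation counts, not from the paper):
    # with 99.9% fidelity per operation, whole-circuit fidelity decays
    # roughly as 0.999 ** num_ops.
    for num_ops in (100, 1_000, 7_000):
        print(f"{num_ops:5d} ops -> fidelity ~ {0.999 ** num_ops:.4f}")
    # ~100 ops: ~0.90; ~1,000 ops: ~0.37; ~7,000 ops: ~0.001, on the
    # order of the 0.1 percent signal level cited for the 2019
    # random-circuit-sampling experiment.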

O’Brien notes the new algorithm may get a better picture of one part of a molecule than another, depending on which of the qubits modeling the molecule is perturbed. Future research may try systematically perturbing different qubits running these simulations “to build up many measurements of different molecular distances, which I can use to properly map out my molecule,” he says.
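Such a sweep is easy to picture with the toy echo from earlier. Here is a self-contained Python sketch (again our illustration, not Google’s code) that perturbs each qubit in turn and records the echo signal:

    # Toy sweep: perturb each qubit in turn and record the echo,
    # the kind of systematic scan O'Brien describes.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 3
    dim = 2 ** n
    U, _ = np.linalg.qr(rng.normal(size=(dim, dim))
                        + 1j * rng.normal(size=(dim, dim)))
    psi0 = np.zeros(dim, dtype=complex)
    psi0[0] = 1.0

    X = np.array([[0, 1], [1, 0]], dtype=complex)
    for k in range(n):
        ops = [np.eye(2, dtype=complex)] * n
        ops[k] = X                          # butterfly on qubit k
        B = ops[0]
        for op in ops[1:]:
            B = np.kron(B, op)
        echo = np.vdot(psi0, U.conj().T @ B @ U @ psi0)
        print(f"perturb qubit {k}: echo = {abs(echo)**2:.4f}")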

A Possible Path to Quantum Advantage

In a new study in the journal Nature, Google says a potential real-world application for this algorithm (formally known as an out-of-time-order correlator) may lie in nuclear magnetic resonance (NMR) spectroscopy, which is essentially magnetic resonance imaging (MRI) for molecules and materials. For instance, in experiments that coupled the new algorithm with simulations of molecules on up to 15 of Willow’s qubits, the researchers could generate accurate, precise models of molecular structures, findings scheduled to appear 22 October on the arXiv preprint server.

“As quantum computing continues to advance, such simulations could potentially enhance NMR,” says Ashok Ajoy, an assistant professor of chemistry at the University of California, Berkeley. “While things are still at an early stage, this methodology could yield broad applications in the future, given NMR’s wide use across chemistry, biology, and materials science.”

O’Brien cautions that they have currently only combined their algorithm with relatively small molecular simulations. These results “aren’t beyond classical yet,” O’Brien says. Further improvements to avoid or correct errors may help achieve quantum advantage on practical applications, he notes.

All in all, “we continue to be optimistic that within five years, we will see real-world applications that are only possible with quantum computers,” says Hartmut Neven, the founder and manager of Google Quantum AI.

This story was updated on 22 October 2025 to remove an incorrect statement that Ashok Ajoy was unaffiliated with the research published in Nature.
