Measuring the progress of quantum computers can prove tricky in the era of “noisy” quantum computing technology. One concept, known as “quantum volume,” has become a favored measure among companies such as IBM and Honeywell. But not every company or researcher agrees on its usefulness as a yardstick in quantum computing.
In an ideal world, researchers could measure progress in quantum computing based on the number of quantum bits (qubits) in each system. But noise in the form of heat or electromagnetic sources constantly threatens to disrupt the computations among the fragile qubits, which makes it hard to reliably measure a quantum computer’s capabilities based only upon the total number of qubits. That is why IBM researchers proposed the concept of quantum volume as a more reliable measure during this imperfect stage of quantum computing technology.
“Think of quantum volume as the average of worst-case circuits run on any quantum computer,” says Jay Gambetta, a research fellow and vice president in quantum computing at IBM. “The result means that if this ‘worst case’ is possible, quantum volume is a measure of the circuits’ quality; the higher the quality, the more complex circuits can be run on a quantum computer.”
More specifically, IBM’s team defines quantum volume as 2 to the power of the size of the largest circuit with equal width and depth that can pass a certain reliability test involving random two-qubit gates, says Daniel Lidar, director of the Center for Quantum Information Science and Technology at the University of Southern California in Los Angeles. The circuit’s size can be measured either by its width (the number of qubits) or by its depth (the number of layers of gates), given that width and depth are equal in this case.
That means a 6-qubit quantum computing system would have a quantum volume of 2 to the power of 6, or 64—but only if the qubits were relatively free of noise and the potential errors that can accompany such noise. (This is why the reliability test matters for the quantum volume definition.)
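The arithmetic behind the definition is straightforward. Here is a minimal sketch (the function name is illustrative, not part of any benchmarking library), assuming the system passes the reliability test at its full qubit count:

```python
def quantum_volume(n: int) -> int:
    """Quantum volume as 2**n, where n is the size (width == depth)
    of the largest square circuit that passes the reliability test."""
    return 2 ** n

# A 6-qubit system clean enough to pass the test at size 6:
print(quantum_volume(6))  # 64
```

In practice, noise usually means the largest passing circuit is smaller than the total qubit count, so n reflects the machine's effective size, not its raw size.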
Lidar, who was not involved with coining the quantum volume concept, sees it as a useful measure for today’s quantum computers that are described as Noisy Intermediate-Scale Quantum (NISQ) technology. “Such a metric is an excellent way to capture the performance of NISQ-era quantum computers, which define the era where noise still plays an important limiting factor in attaining high circuit depth with reliable performance,” Lidar says.
Since IBM began publicizing the term more widely in late 2019, quantum volume has come up a number of times in the quantum computing papers and press releases of IBM and other companies such as Honeywell. But at least one tech company CEO is already floating the idea that the end of quantum volume’s usefulness might be in sight.
While discussing IonQ’s latest quantum computing developments in an Ars Technica interview, CEO Peter Chapman talked about how improvements in the reduction of noise could effectively lead to a high-fidelity, 32-qubit system with a quantum volume of approximately 4 million. Within 18 months, he suggested, quantum volume numbers could grow so large that researchers might need to rethink the definition of quantum volume to retain its usefulness.
But Lidar disagrees that quantum volume is headed for relatively swift obsolescence. He points out that quantum volume numbers grow that fast only because of the “2 to the power” part of the definition. In fact, he adds, IBM did not even define quantum volume with the “2 to the power” part in its first paper on the subject [PDF] back in 2017. “This is purely an artifact of this definition,” Lidar says.
The simplest solution would be to define quantum volume in accordance with the largest number of qubits or gates, instead of using “2 to the power” of those numbers, Lidar says.
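To see how much the exponent inflates the headline number, consider the figures from the IonQ discussion above (a rough sketch; the exact circuit size is an assumption inferred from the reported quantum volume):

```python
import math

# Under the current definition, a system passing the reliability test
# at size 22 already reports a quantum volume of 2**22, about 4 million:
print(2 ** 22)  # 4194304

# Lidar's suggested alternative reports the circuit size itself -- the
# base-2 logarithm of the exponential figure -- so the same machine
# would simply score a 22:
print(int(math.log2(2 ** 22)))  # 22
```

On the linear scale, progress from one machine to the next stays readable instead of doubling the metric with every extra qubit of effective circuit size.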
Not everyone sees quantum volume as hugely important or necessary for benchmarking the progress of quantum computing. It’s not clear if distilling quantum computing progress into a single measure is even necessary, says Scott Aaronson, a computer scientist and director of the Quantum Information Center at the University of Texas at Austin. He raised this and other questions in a blog post titled “Turn Down the Quantum Volume.”
“It's basically just one possible ‘gross consumer index of quantum computer awesomeness,’ among countless alternatives that could be defined,” Aaronson says.
For practical purposes, it’s mostly big quantum computing industry players such as IBM that are currently concerned about quantum volume, says Javad Shabani, an assistant professor of physics and chair of the Shabani Lab at New York University. That’s because he and other academic researchers generally don’t have hardware access to such large quantum computing systems, even if more companies are offering cloud-based access to such systems for programming purposes.
Still, Shabani sees quantum volume as a useful concept that defines quantum computing progress in a more meaningful way than simply counting qubits. Like Lidar, he suggests that quantum volume will remain relevant as long as noise remains a limiting factor for quantum computers—whether that is the case for the next five years or the next decade or more.
“If you can make a logical qubit that basically has no noise, then slowly this quantum volume thing will go away naturally,” Shabani says.
Jeremy Hsu has been working as a science and technology journalist in New York City since 2008. He has written on subjects as diverse as supercomputing and wearable electronics for IEEE Spectrum. When he’s not trying to wrap his head around the latest quantum computing news for Spectrum, he also contributes to a variety of publications such as Scientific American, Discover, Popular Science, and others. He is a graduate of New York University’s Science, Health & Environmental Reporting Program.