Quantum Computing for Dummies

New guide helps beginners run quantum algorithms on IBM’s quantum computers over the cloud


An image of the inside of an IBM quantum computer.

UPDATE 21 Apr. 2024: The drive for more and more accessible quantum computing algorithms and applications has continued with the intensity of a field still in its earliest days. Which hardware implementation of qubits and quantum logic gates will define the field’s universal operations remains a broadly unsettled question. Superconducting systems and other low-temperature circuits remain as popular in 2024 as they were at the time of the story below. But other paradigms, including silicon spin and neutral-atom computing, have their applications and advocates too.

Spectrum remains on the lookout not only for prospective new frontiers of quantum computing but also for outposts of hype and overblown promise, which we have been skewering as well. Earlier this year, Google and the XPrize organization established a $5 million prize to find the most promising new applications for qubit-based computing. The Chinese tech giants Alibaba and Baidu, though, seem to have seen enough: Earlier this year, they closed down their quantum computing operations. That said, Beijing, along with Washington, Brussels, and plenty of other governments and labs around the world, presses on in the drive to outstrip the conventional computer bit. As Sam Howell from the Center for a New American Security told Spectrum in our Alibaba story, a consensus is emerging that “Maintaining quantum capability is really strategically important, and there is still a desire to move it forward.” —IEEE Spectrum

Original story of 3 July 2022 follows:

Quantum computers may one day rapidly find solutions to problems no regular computer might ever hope to solve, but there are vanishingly few quantum programmers when compared with the number of conventional programmers in the world. Now a new beginner’s guide aims to walk would-be quantum programmers through the implementation of quantum algorithms over the cloud on IBM’s publicly available quantum computers.

Whereas classical computers switch transistors either on or off to represent data as ones or zeroes, quantum computers use quantum bits, or “qubits,” which because of the peculiar nature of quantum physics can exist in a state called superposition, in which they are both 1 and 0 at the same time. This essentially lets each qubit perform two calculations at once. And the more qubits that are quantum-mechanically linked, or entangled (see our explainer), within a quantum computer, the more its computational power grows, scaling exponentially with the number of entangled qubits.
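The arithmetic behind superposition and entanglement can be sketched with plain state vectors. The following NumPy example is an illustration written for this article, not code from the guide: it applies a Hadamard gate to |0⟩ to build an equal superposition, then entangles two qubits into a Bell state with a CNOT gate.

```python
import numpy as np

# Single-qubit basis state |0> and the Hadamard gate.
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Superposition: H|0> has equal amplitude on 0 and 1, so a
# measurement yields each outcome with probability 1/2.
plus = H @ ket0
probs = plus**2          # amplitudes are real here
print(probs)             # [0.5 0.5]

# Entanglement: a CNOT on |+>|0> produces the Bell state
# (|00> + |11>)/sqrt(2): the two qubits' outcomes always agree.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(plus, ket0)
print(bell)              # approx. [0.707 0 0 0.707]
```

Note that simulating n qubits this way requires a state vector of 2^n complex amplitudes, which is one intuition for why entangled qubits are hard for classical machines to track.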

Currently, quantum computers are noisy intermediate-scale quantum (NISQ) platforms, meaning they have at most a few hundred qubits, all of them prone to errors. Still, quantum processors are widely expected to grow in both qubit count and qubit quality, with the aim of achieving a quantum advantage: finding answers to problems that no classical computer could ever solve.

Although the field of quantum programming started in the 1990s, it has to date drawn only a small community. “Programming quantum computers may seem like a great challenge, requiring years of training in quantum mechanics and related disciplines,” says the guide’s senior author, Andrey Lokhov, a theoretical physicist at Los Alamos National Laboratory, in New Mexico. “Additionally, the field is dominated by physics and algebraic notations that at times present unnecessary entry barriers for mainstream computer and mathematically trained scientists.”

Now, with their new guide, Lokhov and his colleagues hope to help pave the way “for the upcoming quantum-computing revolution,” he says. “We believe that our guide fills a missing space in the field of quantum computation, introducing nonexpert computer scientists, physicists, and engineers to quantum algorithms and their implementations on real-world quantum computers.”

The new guide explains the basics of quantum computing and quantum programming, including quantum algorithms.

“Very much like how classical algorithms describe a sequence of instructions that need to be executed on a classical computer, a quantum algorithm represents a step-by-step procedure, where each of the steps needs to be performed on a quantum computer,” Lokhov says. “However, the term ‘quantum algorithm’ is usually reserved for algorithms that contain inherently quantum operations, such as quantum superposition or quantum entanglement, which turn out to be computationally powerful.”


To implement such quantum operations on quantum computers, quantum programs are represented as circuits describing a sequence of elementary operations, called gates, that are applied on a set of qubits. One major difference between quantum and classical programming lies in a central principle of quantum mechanics—when it comes to measuring a quantum program’s results, the process is inherently probabilistic, or subject to random variation.
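That probabilistic readout can be made concrete with a short sketch, again written for this article rather than taken from the guide. Per the Born rule, each basis state is observed with probability equal to its squared amplitude, so a program is run many times ("shots") and the results are collected as a histogram:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# State vector for the Bell state (|00> + |11>)/sqrt(2).
state = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Born rule: outcome k occurs with probability |amplitude_k|^2.
probs = np.abs(state) ** 2

# Repeating the circuit builds a histogram of outcomes, which is
# how results are actually read out from a quantum computer.
shots = 10_000
outcomes = rng.choice(["00", "01", "10", "11"], size=shots, p=probs)
counts = {s: int((outcomes == s).sum()) for s in ["00", "01", "10", "11"]}
print(counts)   # roughly half '00', half '11', no '01' or '10'
```

A single shot gives only one random bit string; it is the statistics over many shots that reveal the state the circuit prepared.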

“Our guide aims to explain the basic principles of quantum programming, which are quite different from classical programming, with straightforward algebra that makes understanding the underlying fascinating quantum-mechanical principles optional,” Lokhov says. “We have received positive feedback from many scientists—beginners in the field—who were able to quickly familiarize themselves with the basics of quantum programming using our guide.”

The new guide provides the minimal knowledge needed to start implementing and running quantum algorithms right away. It covers 20 standard quantum algorithms, among them Shor’s algorithm for factoring integers and Grover’s algorithm for searching databases.
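To give a flavor of one of those algorithms, here is a classical state-vector simulation of Grover’s search on 2 qubits, a sketch written for this article (the marked index is an arbitrary choice for illustration, and a real run would use quantum gates rather than matrices). With 4 items, a single oracle-plus-diffusion iteration concentrates all probability on the marked item:

```python
import numpy as np

# Grover search over 4 items; "marked" is the item we want to find.
n_items = 4
marked = 2

# Start in the uniform superposition over all 4 basis states.
state = np.full(n_items, 1 / np.sqrt(n_items))

# Oracle: flip the sign of the marked item's amplitude.
oracle = np.eye(n_items)
oracle[marked, marked] = -1

# Diffusion operator 2|s><s| - I: inversion about the mean amplitude.
s = np.full(n_items, 1 / np.sqrt(n_items))
diffusion = 2 * np.outer(s, s) - np.eye(n_items)

# One Grover iteration suffices for 2 qubits.
state = diffusion @ (oracle @ state)
probs = np.abs(state) ** 2
print(int(np.argmax(probs)), round(float(probs[marked]), 3))  # 2 1.0
```

For larger databases of N items, roughly sqrt(N) such iterations are needed, which is the source of Grover’s quadratic speedup over classical search.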

“In addition, our review covers the most successful hybrid quantum-classical algorithms, such as the quantum approximate optimization algorithm, as well as classical tools that are useful for certifying the performance of quantum algorithms, such as quantum tomography,” Lokhov says. “Hence, the guide surveys a combination of quantum, classical, and hybrid algorithms that are foundational for the field of quantum computing.”

The guide then walks quantum programmers through implementing these algorithms over the cloud on IBM’s publicly available quantum computers, such as its 5-qubit IBMQX4. The guide discusses the results of these implementations and explains the differences between simulator runs and runs on the actual hardware.
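The gap between simulator and hardware results can be illustrated with a toy readout-error model. The sketch below is an assumption made for this article, not IBM’s actual noise model: an ideal sampler returns only the two correct Bell-state outcomes, while a "hardware-like" sampler flips each measured bit independently with 3 percent probability, letting the forbidden outcomes leak into the histogram:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
shots = 8192

# Ideal (simulator-like) sampling from the Bell state (|00>+|11>)/sqrt(2).
state = np.array([1, 0, 0, 1]) / np.sqrt(2)
ideal = rng.choice(4, size=shots, p=np.abs(state) ** 2)

# Toy hardware model (an assumption, not a real device's noise model):
# each measured bit flips independently with probability p_flip.
p_flip = 0.03
flips = rng.random((shots, 2)) < p_flip
bits = np.stack([(ideal >> 1) & 1, ideal & 1], axis=1) ^ flips
noisy = bits[:, 0] * 2 + bits[:, 1]

def histogram(sample):
    labels = ["00", "01", "10", "11"]
    return {labels[k]: int((sample == k).sum()) for k in range(4)}

ideal_counts = histogram(ideal)    # only '00' and '11' appear
noisy_counts = histogram(noisy)    # '01' and '10' leak in
print("simulator:", ideal_counts)
print("hardware-like:", noisy_counts)
```

Real devices suffer richer errors than this (gate errors, decoherence, crosstalk), but the qualitative lesson is the same: hardware histograms are a smeared version of the simulator’s.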

Lokhov notes that currently, in order to show that a new quantum algorithm works efficiently, one needs to give a mathematical proof. In contrast, in classical computing, many efficient algorithms were discovered heuristically—that is, by trial and error, or by loosely defined rules—with theoretical guarantees coming much later. The hope is that new quantum algorithms may get discovered in a similar fashion the more quantum programmers there are.

“We believe that our guide could be useful for introducing more scientists to quantum computing and for inviting them to experiment with the forthcoming quantum computers with larger numbers of qubits,” Lokhov says.

The guide appeared online in March in the ACM Transactions on Quantum Computing. You can find the code and implementations that accompany the guide at https://github.com/lanl/quantum_algorithms.

This article appears in the September 2022 print issue as “Quantum Computing for Classical Programmers.”
