Quantum Computing for Dummies

New guide helps beginners run quantum algorithms on IBM’s quantum computers over the cloud

The inside of an IBM quantum computer.
IBM

Quantum computers may one day rapidly find solutions to problems no regular computer might ever hope to solve, but there are vanishingly few quantum programmers when compared with the number of conventional programmers in the world. Now a new beginner’s guide aims to walk would-be quantum programmers through the implementation of quantum algorithms over the cloud on IBM’s publicly available quantum computers.

Whereas classical computers switch transistors either on or off to symbolize data as ones or zeroes, quantum computers use quantum bits, or “qubits,” which because of the peculiar nature of quantum physics can exist in a state called superposition where they are both 1 and 0 at the same time. This essentially lets each qubit perform two calculations at once. The more qubits are quantum-mechanically linked, or entangled (see our explainer), within a quantum computer, the greater its computational power can grow, in an exponential fashion.
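The arithmetic behind superposition is simple enough to sketch in a few lines of plain Python. This is a toy classical simulation for illustration only, not code from the guide: a qubit is just a pair of amplitudes, and the Hadamard gate is the standard operation that puts a basis state into an equal superposition.

```python
import math, random

# A qubit's state is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1.
zero = (1.0, 0.0)  # the definite state |0>

def hadamard(state):
    """Apply the Hadamard gate, which turns |0> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    """Measurement collapses the superposition: outcome 0 with probability |a|^2."""
    a, b = state
    return 0 if random.random() < abs(a) ** 2 else 1

plus = hadamard(zero)          # amplitudes (0.707..., 0.707...)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(plus)] += 1
print(counts)                  # roughly 5,000 of each outcome
```

Before measurement the qubit carries both amplitudes at once; measurement forces a single classical answer, which is why the outcome statistics, not any single run, reveal the superposition.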

Current quantum computers are noisy intermediate-scale quantum (NISQ) platforms, meaning they have a few hundred qubits at most, and error-prone ones at that. Still, quantum processors are widely expected to grow in both qubit count and qubit quality, with the aim of achieving a quantum advantage: the ability to find answers to problems that no classical computer could ever solve.

Although the field of quantum programming started in the 1990s, it has to date drawn only a small community. “Programming quantum computers may seem like a great challenge, requiring years of training in quantum mechanics and related disciplines,” says the guide’s senior author, Andrey Lokhov, a theoretical physicist at Los Alamos National Laboratory, in New Mexico. “Additionally, the field is dominated by physics and algebraic notations that at times present unnecessary entry barriers for mainstream computer and mathematically trained scientists.”

Now, with their new guide, Lokhov and his colleagues hope to help pave the way “for the upcoming quantum-computing revolution,” he says. “We believe that our guide fills a missing space in the field of quantum computation, introducing nonexpert computer scientists, physicists, and engineers to quantum algorithms and their implementations on real-world quantum computers.”

The new guide explains the basics of quantum computing and quantum programming, including quantum algorithms.

“Very much like how classical algorithms describe a sequence of instructions that need to be executed on a classical computer, a quantum algorithm represents a step-by-step procedure, where each of the steps needs to be performed on a quantum computer,” Lokhov says. “However, the term ‘quantum algorithm’ is usually reserved for algorithms that contain inherently quantum operations, such as quantum superposition or quantum entanglement, which turn out to be computationally powerful.”

To implement such quantum operations on quantum computers, quantum programs are represented as circuits describing a sequence of elementary operations, called gates, that are applied on a set of qubits. One major difference between quantum and classical programming lies in a central principle of quantum mechanics—when it comes to measuring a quantum program’s results, the process is inherently probabilistic, or subject to random variation.
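The circuit picture above can be sketched classically: gates become small matrices, a circuit becomes an ordered list of them, and measurement becomes random sampling. This is an illustrative simulation in plain Python, not the guide's code:

```python
import math, random

# Single-qubit gates as 2x2 matrices; a circuit is an ordered list of gates.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate
X = [[0, 1],
     [1, 0]]                                   # quantum NOT gate

def apply(gate, state):
    """Multiply the gate matrix into the (a, b) amplitude vector."""
    a, b = state
    return (gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b)

def run(circuit, shots=1000):
    """Execute the gate sequence on |0>, then measure `shots` times."""
    state = (1.0, 0.0)
    for gate in circuit:
        state = apply(gate, state)
    counts = {0: 0, 1: 0}
    for _ in range(shots):
        counts[0 if random.random() < abs(state[0]) ** 2 else 1] += 1
    return counts

print(run([X]))      # deterministic: every shot reads 1
print(run([X, H]))   # probabilistic: roughly half 0, half 1
```

The second circuit shows the probabilistic principle in action: the program itself is a fixed sequence of gates, yet repeated runs yield a distribution of outcomes rather than a single answer.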

“Our guide aims to explain the basic principles of quantum programming, which are quite different from classical programming, with straightforward algebra that makes understanding the underlying fascinating quantum-mechanical principles optional,” Lokhov says. “We have received positive feedback from many scientists—beginners in the field—who were able to quickly familiarize themselves with the basics of quantum programming using our guide.”

The new guide provides the minimal knowledge needed to start implementing and running quantum algorithms right away. It covers 20 standard quantum algorithms, among them Shor’s algorithm for factoring integers and Grover’s algorithm for searching databases.
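To give a flavor of one of those algorithms, here is a bare-bones classical simulation of Grover’s search over four items. With four items, a single Grover iteration (oracle sign-flip plus reflection about the mean) drives the marked item’s probability to 1. It is a simplified illustration, not the guide’s implementation:

```python
import math

def grover_search(marked, n_items=4):
    """One Grover iteration over `n_items` amplitudes; for 4 items a
    single iteration amplifies the marked item to probability 1."""
    # Start in the uniform superposition over all items.
    amps = [1 / math.sqrt(n_items)] * n_items
    # Oracle: flip the sign of the marked item's amplitude.
    amps[marked] = -amps[marked]
    # Diffusion: reflect every amplitude about the mean amplitude.
    mean = sum(amps) / n_items
    amps = [2 * mean - a for a in amps]
    # Read out the measurement probabilities.
    return [abs(a) ** 2 for a in amps]

print(grover_search(marked=2))  # [0.0, 0.0, 1.0, 0.0]
```

For N items, roughly √N such iterations are needed, which is the source of Grover’s quadratic speedup over the ~N/2 lookups an unstructured classical search expects to make.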

“In addition, our review covers the most successful hybrid quantum-classical algorithms, such as the quantum approximate optimization algorithm, as well as classical tools that are useful for certifying the performance of quantum algorithms, such as quantum tomography,” Lokhov says. “Hence, the guide surveys a combination of quantum, classical, and hybrid algorithms that are foundational for the field of quantum computing.”

The guide then walks quantum programmers through implementing these algorithms over the cloud on IBM’s publicly available quantum computers, such as its 5-qubit IBMQX4. The guide discusses the results of the implementation and explains differences between the simulator and the actual hardware runs.
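Those simulator-versus-hardware differences come largely from noise. The sketch below mimics the effect in plain Python by randomly flipping measurement outcomes; the 5 percent readout-error rate is an invented figure for illustration, not a measured property of IBM’s devices:

```python
import random

def sample(shots, p0, readout_error=0.0):
    """Sample a qubit whose ideal probability of reading 0 is `p0`;
    `readout_error` flips each outcome, mimicking noisy hardware."""
    counts = {0: 0, 1: 0}
    for _ in range(shots):
        bit = 0 if random.random() < p0 else 1
        if random.random() < readout_error:
            bit ^= 1  # readout noise flips the recorded result
        counts[bit] += 1
    return counts

# An ideal simulator vs. a noisy device, for a state that should always read 1:
print(sample(1000, p0=0.0))                      # simulator: all 1s
print(sample(1000, p0=0.0, readout_error=0.05))  # "hardware": ~5% stray 0s
```

Comparing the two histograms is essentially what the guide does when it contrasts simulator runs with runs on the real chips.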

Lokhov notes that currently, in order to show that a new quantum algorithm works efficiently, one needs to give a mathematical proof. In contrast, in classical computing, many efficient algorithms were discovered heuristically—that is, by trial and error, or by loosely defined rules—with theoretical guarantees coming much later. The hope is that new quantum algorithms may get discovered in a similar fashion the more quantum programmers there are.

“We believe that our guide could be useful for introducing more scientists to quantum computing and for inviting them to experiment with the forthcoming quantum computers with larger numbers of qubits,” Lokhov says.

The guide appeared online in March in the ACM Transactions on Quantum Computing. You can find the code and implementations that accompany the guide at https://github.com/lanl/quantum_algorithms.

This article appears in the September 2022 print issue as “Quantum Computing for Classical Programmers.”
