IBM Simulates a 56-Qubit Machine

A supercomputer surpasses the proposed limit of using conventional machines to simulate quantum computers

3 min read
A close-up image of the jumble of wires and cables within a quantum computer that IBM is building.
Image: IBM Research

Quantum computers can, in theory, vastly outperform conventional computers using components known as qubits. Now IBM says it has simulated a 56-qubit quantum computer on an old-fashioned supercomputer, a task some had previously suggested was beyond the capabilities of conventional machines.

These findings do not mean that Google and others should abandon their quantum computer projects, the researchers add. Instead, they suggest that conventional supercomputers could help make sure quantum computers actually work by double-checking their results.

Classical computers flick transistors either on or off to symbolize data as ones and zeroes. In contrast, quantum computers use quantum bits or qubits that, because of the surreal nature of quantum physics, can be in a state of superposition where they can essentially act as both 1 and 0.
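That superposition can be made concrete with a little linear algebra. The sketch below (a minimal illustration, not IBM's simulator) represents a single qubit as a pair of complex amplitudes whose squared magnitudes give the measurement probabilities:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a unit vector of two
# complex amplitudes (a, b) over the basis states |0> and |1>, where
# |a|^2 + |b|^2 = 1. Measuring yields 0 with probability |a|^2, 1 with |b|^2.
ket0 = np.array([1, 0], dtype=complex)      # definitely 0
ket1 = np.array([0, 1], dtype=complex)      # definitely 1
superposition = (ket0 + ket1) / np.sqrt(2)  # acts as both 0 and 1

probs = np.abs(superposition) ** 2
print(probs)  # [0.5 0.5] -- equal chance of measuring 0 or 1
```

Until it is measured, the qubit genuinely carries both amplitudes at once, which is what lets quantum algorithms work on many values simultaneously.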

The superpositions that qubits adopt let them each perform two calculations at once. If two qubits are quantum-mechanically linked, or entangled, they can help perform 2^2 or four calculations simultaneously; three qubits, 2^3 or eight calculations; and so on. In principle, a quantum computer with 300 qubits could perform more calculations in an instant than there are atoms in the visible universe.
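The doubling-per-qubit scaling is visible in how the joint state is stored: entangling qubits combines their state vectors by tensor product, so n qubits require 2^n amplitudes. A small sketch using a two-qubit Bell state (an illustrative example, not from the IBM paper):

```python
import numpy as np

# Two qubits entangled in a Bell state. The joint state lives in a
# 2**2 = 4 dimensional space; in general, n qubits need 2**n amplitudes,
# which is why 300 qubits would outstrip any classical description.
ket00 = np.kron([1, 0], [1, 0]).astype(complex)  # |00>
ket11 = np.kron([0, 1], [0, 1]).astype(complex)  # |11>
bell = (ket00 + ket11) / np.sqrt(2)              # entangled superposition

print(len(bell))  # 4 amplitudes for 2 qubits, i.e. 2**2
```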

"Eventually quantum computers will get to so many qubits that conventional computers can't catch up," says study lead author Ed Pednault at IBM's Thomas J. Watson Research Center in Yorktown Heights, New York.

Previous research suggested that at roughly 50 qubits, quantum computers would achieve "quantum supremacy," solving problems beyond the practical limits of conventional machines, in terms of either computational complexity, available memory, or both. By the end of 2017, Google hopes to make a 49-qubit chip in a push toward quantum supremacy.

Now IBM says it has simulated a 56-qubit quantum computer using the Vulcan supercomputer at Lawrence Livermore National Laboratory in California. The scientists detailed their findings Oct. 16 on the arXiv preprint server.

Whereas a 56-qubit quantum computer can theoretically perform 2^56 operations simultaneously, IBM's accomplishment involved dividing this task into 2^19 slices that each essentially consisted of 2^36 operations. This strategy meant the researchers only needed about 3 terabytes of memory for their simulated quantum computer. In contrast, earlier in 2017, a 45-qubit simulation at the Swiss Federal Institute of Technology in Zurich required 500 terabytes of memory.
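The memory arithmetic behind those figures is easy to check. Assuming the standard bookkeeping of 16 bytes per double-precision complex amplitude (an assumption for illustration; the paper's exact accounting may differ), a full state vector quickly becomes hopeless, while each slice stays small:

```python
# Back-of-the-envelope memory for a full n-qubit statevector at
# 16 bytes per complex amplitude, versus the per-slice cost of
# IBM's partitioning strategy.
def full_statevector_tib(n_qubits: int) -> float:
    return (2 ** n_qubits) * 16 / 2 ** 40

print(full_statevector_tib(45))  # 512.0 TiB, in line with the Zurich run
print(full_statevector_tib(56))  # 1048576.0 TiB -- far too big to store whole

# Splitting the 56-qubit task into 2**19 slices of roughly 2**36
# amplitudes keeps each piece down to a few terabytes:
print((2 ** 36) * 16 / 2 ** 40)  # 1.0 TiB per slice, the order of IBM's ~3 TB
```

The trade-off is time: the slices must be processed one after another (or across many nodes), which is part of why the simulation is so much slower than real hardware.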

"During our work, we were able to consistently accomplish things we thought impossible a week earlier," says study senior author Robert Wisnieff at IBM's Thomas J. Watson Research Center. "It was like having your birthday every day you came to work."

The IBM researchers stressed these kinds of simulated quantum computers are not meant to replace quantum computers. For instance, whereas a perfect 56-qubit quantum computer can perform the experiments "in 100 microseconds or less, we took two days, so a factor of a billion times slower," Wisnieff says.

Instead of deflating experiments aimed at achieving quantum supremacy, such as Google's, "this sort of simulation is actually necessary to verify the type of experiment they're planning," says theoretical computer scientist Scott Aaronson at the University of Texas at Austin, who did not take part in this research.

By simulating quantum computers, conventional supercomputers can double-check the results of actual quantum computers to see if they are working properly. "We're at the point where we're able to start fabricating machines on the order of 50 qubits, but we know they're not at all ideal in their behavior," Wisnieff says. If the results from quantum computers fail to match those of simulations, researchers know they may have something to fix.
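One way to picture that double-checking (a hedged sketch of the general idea, not IBM's actual verification procedure) is to compare the output distribution measured on hardware against the exact distribution a classical simulation predicts; the probabilities below are made-up placeholders:

```python
import numpy as np

# Total variation distance between two probability distributions:
# 0 means identical, 1 means completely disjoint.
def total_variation_distance(p, q):
    return 0.5 * float(np.sum(np.abs(np.asarray(p) - np.asarray(q))))

simulated = [0.5, 0.0, 0.0, 0.5]      # ideal Bell-state outcome probabilities
measured = [0.47, 0.02, 0.03, 0.48]   # hypothetical noisy hardware results

dist = total_variation_distance(simulated, measured)
print(dist)  # 0.05 -- close agreement; a large distance would flag a problem
```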

Simulated quantum computers can also help researchers explore the best applications for actual quantum computers by finding out which problems they solve better than conventional machines, Wisnieff says. Simulating errors in quantum computers could also help scientists figure out the causes of problems in actual quantum computers, he adds.

It remains uncertain what the limit is for how many qubits conventional machines can simulate. "At this point, we don't know exactly how far we can go," Wisnieff says. "But it's just a matter of time before quantum computers ultimately win."


Why Functional Programming Should Be the Future of Software Development

It’s hard to learn, but your code will produce fewer nasty surprises

11 min read
A plate of spaghetti made from code
Shira Inbar

You’d expect the longest and most costly phase in the lifecycle of a software product to be the initial development of the system, when all those great features are first imagined and then created. In fact, the hardest part comes later, during the maintenance phase. That’s when programmers pay the price for the shortcuts they took during development.

So why did they take shortcuts? Maybe they didn’t realize that they were cutting any corners. Only when their code was deployed and exercised by a lot of users did its hidden flaws come to light. And maybe the developers were rushed: time-to-market pressure all but guarantees that software ships with more bugs than it otherwise would.
