Largest Quantum Computer Calculation to Date—But Is It Too Little Too Late?

D-Wave continues to push the capabilities of its quantum computer, but are the company's capital reserves dwindling?


After erring on the side of caution, if not outright doubt, when IEEE Spectrum named D-Wave Systems one of its “Big Losers” two years ago, this publication reversed its opinion in June of last year, when Spectrum covered D-Wave’s first big sale of a quantum computer with an article and then a podcast interview with the company's CTO.

In the job of covering nanotechnology, one sometimes develops a more hopeful perspective on the potential of emerging technologies. Basic research that may lead to applications such as quantum computers gets pushed further up the development cycle than perhaps it should. So I have been following D-Wave's developments for at least the last seven years with a bit more credence than Spectrum had earlier offered the company.

In the continuing expansion of the company’s credibility in the development of quantum computers, D-Wave Systems has published a paper [pdf] in which it demonstrates an 84-qubit calculation of Ramsey numbers, which are notoriously difficult to compute.

As the paper, posted to the arXiv preprint server hosted by Cornell University, states: “This computation is the largest experimental implementation of a scientifically meaningful quantum algorithm that has been done to date.”

The D-Wave researchers were able to complete this calculation in 270 milliseconds. This is a far cry from the much-ballyhooed ability of a quantum computer to factor the number 15.
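To see why Ramsey numbers are so hard to pin down, consider the smallest nontrivial case: the Ramsey number R(3,3) = 6 means that any two-coloring of the edges of a complete graph on six vertices must contain a single-colored triangle, while some coloring on five vertices avoids one. The brute-force classical check below (a minimal illustrative sketch, not the adiabatic quantum method D-Wave's paper describes) enumerates every coloring, and already at this tiny size it must examine tens of thousands of cases; the search space grows so explosively that larger Ramsey numbers remain unknown.

```python
from itertools import combinations, product

def has_mono_triangle(n, coloring):
    # coloring maps each edge (i, j), i < j, to color 0 or 1.
    # Return True if any triangle has all three edges the same color.
    for a, b, c in combinations(range(n), 3):
        if coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]:
            return True
    return False

def every_coloring_has_triangle(n):
    # Exhaustively try all 2^(n choose 2) edge colorings of K_n.
    edges = list(combinations(range(n), 2))
    for bits in product([0, 1], repeat=len(edges)):
        if not has_mono_triangle(n, dict(zip(edges, bits))):
            return False  # found a coloring with no monochromatic triangle
    return True

# R(3,3) = 6: K_5 admits a triangle-free coloring, K_6 does not.
print(every_coloring_has_triangle(5))  # False
print(every_coloring_has_triangle(6))  # True
```

Even this six-vertex case requires checking 2^15 = 32,768 colorings; the exhaustive approach becomes hopeless within a few more vertices, which is what makes a hardware demonstration on this problem class noteworthy.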

But as impressive as this may sound, the blog Next Big Future interviewed D-Wave CTO Geordie Rose just last month, and Rose contends that the papers the company publishes are about two years behind where its research actually stands.

In Brian Wang's interview with D-Wave’s Rose, there's a discussion of the company’s new 512-qubit chip, which should be, according to the company's calculations, 1000 times faster than the 128-qubit chips D-Wave is currently working with.

As we learned from Steven Cherry’s podcast with Rose back in June, D-Wave was able to secure the $10-million sale of its hardware and software support so that Lockheed Martin could tackle some of their more difficult optimization problems.

So, it would seem optimization problems have become the measure by which D-Wave calculates how much more effective increasing the number of qubits on a chip can be in solving problems.

From the Next Big Future piece: “One application of the D-Wave system is for the optimization problem of creating treatment plans for cancer radiation treatment based on a 3D body scan. This treatment plan takes 1 week using the 128-qubit system but minutes with the 512-qubit system.”

While it may seem that D-Wave is on an irreversible upward technological slope, one problem indicated in the Next Big Future interview is that capital may be beginning to dry up.

If so, it would seem almost ironic that after years of not selling anything and attracting a lot of capital, D-Wave would make a $10-million sale and then not be able to get any more funding.

But alas, such is the sometimes topsy-turvy world of applying capital at the right time, and in the right amount, to emerging technologies.

This article was edited on 12 January 2012.


About the Nanoclast blog

IEEE Spectrum’s nanotechnology blog, featuring news and analysis about the development, applications, and future of science and technology at the nanoscale.