Superconductor Logic Goes Low-Power

Energy-efficient superconducting circuits could be key to future supercomputers

Photo: Hypres

Cold Logic: New logic designs could make superconductors superefficient.

Transistor-based semiconductor chips have dominated the computing industry since its start. But a much more exotic, transistor-less option has long been lurking in the wings. Superconducting circuits, which boast resistance-free wires and ultrafast switches, can perform the tasks that silicon-based systems do in a fraction of the time.

Now new logic designs are emerging that suggest superconducting processors could be not only faster but also tens or even hundreds of times as energy efficient as their CMOS cousins. And these processors could provide a much-needed path to the next generation of supercomputers, proponents say.

This next generation, called exaflop computers, would be capable of executing a quintillion (10¹⁸) operations per second, about 1000 times as many as existing computers can. Once thought to be just 5 or 10 years away, they now seem nearly impossible. A recent estimate suggests that an exascale supercomputer built using CMOS technology would consume some 500 megawatts—the output of a modest nuclear power plant. "What everybody's shooting for is to be able to overturn [that] result," says Erik DeBenedictis of Sandia National Laboratories, in Albuquerque. "Now there's a glimmer of light that it might happen."
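The figures above imply a rough energy budget per operation, which is worth spelling out. A minimal back-of-envelope sketch, assuming "operations" can be treated as uniform so that energy per operation is simply power divided by operation rate:

```python
# Back-of-envelope check of the exascale power figures cited above.
# Assumption (mine, not the article's): operations are uniform, so
# energy per operation = total power / operations per second.

EXA_OPS_PER_SEC = 1e18   # one quintillion operations per second
CMOS_POWER_W = 500e6     # ~500 megawatts, the CMOS estimate cited above

joules_per_op = CMOS_POWER_W / EXA_OPS_PER_SEC
print(f"{joules_per_op * 1e12:.0f} pJ per operation")  # 500 pJ per operation

# For scale: at 1/300th the power (the RQL estimate reported later in
# the article), the same machine would draw roughly:
rql_power_mw = CMOS_POWER_W / 300 / 1e6
print(f"~{rql_power_mw:.1f} MW")  # ~1.7 MW
```

This shows why the 500 MW estimate is treated as a showstopper: even at hundreds of picojoules per operation, a quintillion operations per second adds up to power-plant-scale consumption.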

Superconducting circuits have long been an attractive option for ultrafast processors. Chilled down to a few degrees above absolute zero, superconducting logic gates can perform operations in picoseconds with less than a microwatt of power. Simple superconducting logic circuits have been shown to operate at speeds of up to 770 gigahertz.

But the technology has been slow to make its way into complex circuits. Since the early 1990s, most superconducting circuits have been built using a design called rapid single-flux quantum (RSFQ) logic, which relays bits of information in the form of short voltage pulses carried by tiny, speeding vortices of current.

RSFQ has been used to build a number of specialized devices needed for high-throughput and numerically intensive applications, such as communications receivers and signal processing. But the design consumes too much power to be scaled up to processors that could compete with CMOS chips in high-end computers. To distribute current among gates, RSFQ relies on a network of bias resistors that can consume 10 times as much power as superconducting logic uses for computation.

In an attempt to build circuits that are more energy friendly, Quentin Herr and his colleagues at Northrop Grumman Systems Corp., in Baltimore, took aim at these resistors. "We ended up changing almost all the characteristics of the logic," Herr says.

To eliminate the bias resistors, the team members switched the power source of the circuit from DC to AC, which allowed them to replace the resistors with transformers that don't draw power when the circuit isn't performing computations. As in RSFQ, logic pulses stream through the circuit, where they are blocked, passed, or rerouted by sandwiches of superconducting material and insulator called Josephson junctions. The presence or absence of a pulse within a given time period determines whether a bit is a 0 or a 1.

By using AC, the team could send pulses in pairs of opposite voltage—the inverted humps of a sine wave. While the first pulse was used for computation, the second was used to reset the circuit. This allowed the Northrop Grumman engineers to simplify the design, cutting down on the number of Josephson junctions needed in each logic gate. The results of tests on some basic logic circuits were published online in May in the Journal of Applied Physics.

Herr reckons that circuits that use the new design, which his team calls reciprocal quantum logic (RQL), will require 1/300th the power of the most advanced CMOS circuits. Crucially, that estimate includes the power needed to cool the circuits to superconducting temperatures.

Others are betting that improvements to RSFQ logic could produce similar gains in energy efficiency. At Hypres, in Elmsford, N.Y., Oleg Mukhanov and his colleagues have found they could eliminate a huge power loss by simply replacing bias resistors with sets of inductors and Josephson junctions. The team published its approach in June in IEEE Transactions on Applied Superconductivity.

Hypres does not have circuits that can be directly compared to Northrop Grumman's, but the company anticipates that its design will rival RQL in power efficiency. Mukhanov says he expects the new RSFQ design will also be easier to scale up to more complex circuits than the RQL scheme, which requires tighter timing tolerances.

But RQL could still be the better technology when it comes to energy consumption, says Mikhail Dorojevets, an associate professor of electrical and computer engineering at Stony Brook University, in New York. He is working on a project funded by the U.S. Army Research Office to evaluate how well the two new logic families perform when scaled up to simulated 32-bit processors.

Dorojevets says these designs help build a foundation for ultralow-power superconducting processors that could potentially work at speeds of 20 to 50 GHz. But it will be quite some time before such processors make their way into supercomputing facilities. While superconducting logic is making strides, other processor components—particularly memory—require a lot more development.

Limited resources could be the biggest obstacle to superconducting supercomputers. Today's machines take advantage of a large industry that has invested many billions of dollars developing chips for other purposes. "If you choose to build an exascale system starting from scratch with an unproven device technology, you must pay for everything. I don't see an entity or agency with the money and the incentive," says Thomas Theis, a program manager at IBM's Thomas J. Watson Research Center, in Yorktown Heights, N.Y.

"It's a very big and aggressive goal," Herr admits. "But revolutions usually don't happen all at once."
