Oak Ridge Unveils 20-Petaflop ‘Titan’ Supercomputer

New 299,008-core Cray XK7 will simulate nanomagnets, global climate, and turbulent combustion

Titan, ORNL’s new 20-petaflop supercomputer
Oak Ridge National Laboratory

Partially overshadowed by the dislocations of Hurricane Sandy was Oak Ridge National Laboratory’s unveiling of its Titan supercomputer, a 20-petaflop Cray XK7 that will crunch massive numbers to run simulations in materials science, combustion, and, appropriately, climate change. (In the shadow of the storm, it’s interesting to note how many of Titan’s non-weather applications also have environmental implications.)

The system contains 18,688 nodes, each containing a 16-core AMD Opteron 6274 CPU and an NVIDIA Tesla K20 graphics processing unit. The design is 10 times as powerful as ORNL’s previous supercomputer, the Jaguar, but it fits into the same space and uses only a little more power.
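For a rough sense of where the headline numbers come from, the arithmetic below recomputes them from the figures quoted in this article (a back-of-the-envelope sketch; the per-node peak is a simple division, and attributing most of it to the K20 is an inference, not an ORNL figure):

```
#include <cstdio>

int main()
{
    // Figures quoted in the article -- not measurements.
    const long   nodes          = 18688;  // Cray XK7 nodes
    const int    cores_per_node = 16;     // one 16-core Opteron 6274 per node
    const double peak_pflops    = 20.0;   // advertised aggregate peak

    long   cpu_cores       = nodes * cores_per_node;     // 299,008
    double tflops_per_node = peak_pflops * 1e3 / nodes;  // ~1.07 TF per node

    printf("CPU cores:     %ld\n", cpu_cores);
    printf("Peak per node: %.2f teraflops (mostly the K20, by inference)\n",
           tflops_per_node);
    return 0;
}
```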

“Combining GPUs and CPUs in a single system requires less power than CPUs alone, and is a responsible move towards lowering our carbon footprint,” said ORNL associate director Jeff Nichols in the debut announcement. Titan’s 299,008 CPU cores will guide the complex simulations, while the even faster many-core GPUs will handle the details.
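The division of labor Nichols describes is the hybrid-offload pattern the XK7 is built around: host code on the Opteron sets up the problem and decides what to ship to the accelerator, while a data-parallel kernel grinds through the repetitive arithmetic. A minimal CUDA C sketch of that idea (the saxpy-style kernel and every name here are illustrative, not drawn from any Titan application):

```
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// The GPU handles the "details": a data-parallel update y = a*x + y.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;  // one million elements
    const size_t bytes = n * sizeof(float);

    // The CPU "guides" the run: allocate, initialize, choose what to offload.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Ship the heavy inner loop to the GPU.
    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %.1f (expect 5.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```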

It will take a while before Titan finishes acceptance testing. When it goes online, its biggest client will be the Department of Energy’s INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program.

The biggest-iron to-do list includes:

  • Calculating nanoscale magnetic properties and temperature sensitivities of steels, nickel-iron alloys, and advanced permanent magnets using Wang-Landau locally self-consistent multiple scattering (WL-LSMS) methods.
  • Modeling combustion in the turbulent environment of an internal combustion engine—potentially important to improving engine designs that will both conserve fossil fuel resources and reduce greenhouse gas production.
  • Modeling the behavior of neutrons in a nuclear power reactor—part of a study intended to help extend the working lives of aging reactors that still provide about 20% of America’s power. (ORNL says Titan will be able to simulate one fuel-rod service cycle in 13 hours, less than a quarter of the time Jaguar needed.)
  • Simulating the long-term evolution of the world’s climate, helping to anticipate future air quality and the behavior of suspended particles. The simulation will reduce the world to an array of 14 × 14 km cells, “imagining” five years of real time per day of computing time. (Jaguar could simulate just three months in a day of calculation; the quick arithmetic after this list checks both speedup figures.)
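For the curious, the speedup claims in the last two items reduce to simple arithmetic. A quick check, taking the article’s figures at face value and reading “less than a quarter” as exactly one quarter:

```
#include <cstdio>

int main()
{
    // Fuel-rod cycle: Titan needs 13 hours, "less than a quarter" of
    // Jaguar's time, so Jaguar needed at least 4 x 13 = 52 hours.
    const double titan_hours  = 13.0;
    const double jaguar_hours = 4.0 * titan_hours;

    // Climate: Titan simulates 5 years per day of computing; Jaguar
    // managed three months, i.e. 0.25 simulated years per day.
    const double titan_years_per_day  = 5.0;
    const double jaguar_years_per_day = 3.0 / 12.0;

    printf("Jaguar fuel-rod cycle: at least %.0f hours vs Titan's %.0f\n",
           jaguar_hours, titan_hours);
    printf("Climate speedup:       %.0fx\n",
           titan_years_per_day / jaguar_years_per_day);
    return 0;
}
```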