Oak Ridge Unveils 20-Petaflop ‘Titan’ Supercomputer

New 299,008-core Cray XK7 will simulate nanomagnets, global climate, and turbulent combustion

Titan, ORNL’s new 20-petaflop supercomputer
Photo: Oak Ridge National Laboratory

Partially overshadowed by the dislocations of Hurricane Sandy was Oak Ridge National Laboratory’s unveiling of its Titan supercomputer, a 20-petaflop Cray XK7 that will crunch massive numbers to run simulations in materials science, combustion, and, appropriately, climate change. (In the shadow of the storm, it’s interesting to note how many of Titan’s non-weather applications also have environmental implications.)

The system contains 18,688 nodes, each pairing a 16-core AMD Opteron 6274 CPU with an NVIDIA Tesla K20 graphics processing unit, for 299,008 CPU cores in all. The design is 10 times as powerful as ORNL’s previous supercomputer, Jaguar, but it fits into the same space and uses only a little more power.

“Combining GPUs and CPUs in a single system requires less power than CPUs alone, and is a responsible move towards lowering our carbon footprint,” said ORNL associate director Jeff Nichols in the debut announcement. Titan’s 299,008 CPU cores will guide the complex simulations, while the even faster multi-core GPUs will handle the details.
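To make that division of labor concrete, here is a minimal CUDA sketch. It is a hypothetical toy, not any of Titan’s actual simulation codes: the host CPU steers the time-stepping loop and decides how much work to do, while a GPU kernel grinds through the per-cell arithmetic.

    // toy_step.cu -- hypothetical sketch, not ORNL code.
    // The CPU "guides" the run (setup, time stepping); the GPU handles the details.
    #include <cstdio>
    #include <cuda_runtime.h>

    // GPU kernel: advance every cell of a toy 1-D field by one time step.
    __global__ void stepCells(float *field, int n, float dt) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            field[i] -= dt * field[i];  // placeholder physics: simple decay
    }

    int main() {
        const int n = 1 << 20;                         // a million cells
        float *field;
        cudaMallocManaged(&field, n * sizeof(float));  // memory visible to CPU and GPU
        for (int i = 0; i < n; ++i) field[i] = 1.0f;   // CPU sets the initial state

        for (int step = 0; step < 100; ++step)         // CPU-side control loop
            stepCells<<<(n + 255) / 256, 256>>>(field, n, 0.01f);
        cudaDeviceSynchronize();                       // wait for the GPU to finish

        printf("cell 0 after 100 steps: %f\n", field[0]);
        cudaFree(field);
        return 0;
    }

Built with nvcc, the same pattern scales up: a handful of CPU cores can keep thousands of GPU threads busy, which is the point of pairing each Opteron with a K20.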

It will take a while before Titan finishes acceptance testing. When it goes online, its biggest client will be the Department of Energy’s INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program.

The biggest-iron to-do list includes:

  • Calculating nanoscale magnetic properties and temperature sensitivities of steels, nickel-iron alloys, and advanced permanent magnets using Wang-Landau locally self-consistent multiple scattering (WL-LSMS) methods.
  • Modeling combustion in the turbulent environment of an internal combustion engine—potentially important to improving engine designs that will both conserve fossil fuel resources and reduce greenhouse gas production.
  • Modeling the behavior of neutrons in a nuclear power reactor—part of a study intended to help extend the working lives of aging reactors that still provide about 20% of America’s power. (ORNL says Titan will be able to simulate one fuel-rod service cycle in 13 hours, less than a quarter of the time Jaguar needed.)
  • Simulating the long-term evolution of the world’s climate, helping to anticipate future air quality and the behavior of suspended particles. The simulation will reduce the world to an array of 14 × 14 km cells, “imagining” five years of real time per day of computing time. (Jaguar could simulate just three months in a day of calculation; a back-of-the-envelope check of those figures appears after this list.)
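That last item invites a quick sanity check. The sketch below is back-of-the-envelope arithmetic, not figures from ORNL’s announcement: it assumes Earth’s surface area of roughly 510 million square kilometers (a standard figure) and derives the cell count and the Titan-versus-Jaguar throughput ratio from the numbers quoted above.

    // grid_math.cu -- back-of-the-envelope arithmetic only (host code, hypothetical).
    #include <cstdio>

    int main() {
        const double earthSurfaceKm2 = 510.0e6;       // ~510 million km^2, standard figure
        const double cellAreaKm2     = 14.0 * 14.0;   // one 14 x 14 km grid cell
        const double titanYearsPerDay  = 5.0;         // Titan: five years per day
        const double jaguarYearsPerDay = 3.0 / 12.0;  // Jaguar: three months per day

        printf("surface cells: ~%.1f million\n", earthSurfaceKm2 / cellAreaKm2 / 1e6);
        printf("throughput gain: ~%.0fx\n", titanYearsPerDay / jaguarYearsPerDay);
        return 0;
    }

Built with nvcc, it prints roughly 2.6 million surface cells and about a 20-fold gain in simulated time per day of computing.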