Tianhe-2 Continues Reign As World’s Best Supercomputer

Little change since last year in ranking of world's most powerful machines

Photo: Imaginechina/AP Photo

Since June 2013, Top 500’s list of supercomputers has been consistently topped by Tianhe-2, at the National Super Computer Center in Guangzhou, China. The latest list, released today, reaffirms the Chinese supercomputer’s supremacy, with a performance of 33.86 petaflops, or 33.86 thousand trillion floating point operations per second.
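As a quick sanity check on the units, one petaflop is 10<sup>15</sup> floating point operations per second, so Tianhe-2’s figure works out to roughly 3.4 × 10<sup>16</sup> operations per second. A minimal sketch in Python:

```python
# One petaflop/s = 10**15 floating point operations per second,
# i.e. a thousand trillion (1,000 * 1,000,000,000,000).
PETAFLOP = 10**15

# Tianhe-2's LINPACK score as reported on the Top 500 list.
tianhe2 = 33.86 * PETAFLOP

print(f"{tianhe2:.3e} flop/s")  # about 3.386e+16 flop/s
```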

In fact, an extended pause in supercomputer performance growth means there was very little change among the world’s top 10 supercomputers. The only new entry, at number 10, was a Cray CS-Storm system clocking in at 3.57 petaflops, based at an undisclosed U.S. government site.

The U.S. still retains its position as the country with the most systems on the list: 231, to be exact. However, that number continues to trend downward, from 233 in June of this year and 265 last November. The number of Asian systems also dropped, from 132 to 120. European systems, on the other hand, increased to 130, up from 116 in June.

Tianhe-2 (also known as Milky Way-2) consists of 16,000 nodes that each contain two Intel Xeon Ivy Bridge processors and three Xeon Phi processors, adding up to a total of 3.12 million computing cores. “Most of the features of the system were developed in China,” Top 500 editor Jack Dongarra said in a June 2013 statement. “They are only using Intel for the main compute part. The interconnect, operating system, front-end processors, and software are mainly Chinese.”

Coming in at number two was Titan, a 17.59 petaflops system located at the U.S. Department of Energy’s (DOE) Oak Ridge National Laboratory. Sequoia, installed at DOE’s Lawrence Livermore National Laboratory, is again the number three system, achieving 17.17 petaflops. Like Tianhe-2, both computers also retained their rankings for the fourth consecutive time.

However, these two labs are hoping for the number one spot in 2017. A consortium headed by IBM is getting US $325 million to build two systems expected to exceed 100 petaflops each. They will be based on a new “data-centric” supercomputer architecture that IBM is developing.

Top 500 releases the list of top supercomputers twice a year. A more detailed analysis of the latest rankings will be released tomorrow. Below is the top 10, ranked by performance (measured using the LINPACK Benchmark):

  1. Tianhe-2 (MilkyWay-2) (33.86 petaflops)
  2. Titan (17.59 petaflops)
  3. Sequoia (17.17 petaflops)
  4. K computer (10.51 petaflops)
  5. Mira (8.59 petaflops)
  6. Piz Daint (6.27 petaflops)
  7. Stampede (5.17 petaflops)
  8. JUQUEEN (5.01 petaflops)
  9. Vulcan (4.29 petaflops)
  10. Cray CS-Storm (3.57 petaflops) 
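For illustration, the published order above follows directly from the LINPACK figures. This sketch uses only the names and numbers listed here (not the actual Top 500 data), and checks that sorting by performance reproduces the ranking:

```python
# Top-10 entries from the list above: (name, petaflops).
top10 = [
    ("Tianhe-2 (MilkyWay-2)", 33.86),
    ("Titan", 17.59),
    ("Sequoia", 17.17),
    ("K computer", 10.51),
    ("Mira", 8.59),
    ("Piz Daint", 6.27),
    ("Stampede", 5.17),
    ("JUQUEEN", 5.01),
    ("Vulcan", 4.29),
    ("Cray CS-Storm", 3.57),
]

# Sorting by LINPACK score, descending, yields the published order.
ranked = sorted(top10, key=lambda entry: entry[1], reverse=True)
assert ranked == top10
print(ranked[0][0])  # Tianhe-2 (MilkyWay-2)
```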

Quantum Error Correction: Time to Make It Work

If technologists can’t perfect it, quantum computers will never be big

Chad Hagen

Dates chiseled into an ancient tombstone have more in common with the data in your phone or laptop than you may realize. They both involve conventional, classical information, carried by hardware that is relatively immune to errors. The situation inside a quantum computer is far different: The information itself has its own idiosyncratic properties, and compared with standard digital microelectronics, state-of-the-art quantum-computer hardware is more than a billion trillion times as likely to suffer a fault. This tremendous susceptibility to errors is the single biggest problem holding back quantum computing from realizing its great promise.

Fortunately, an approach known as quantum error correction (QEC) can remedy this problem, at least in principle. A mature body of theory built up over the past quarter century now provides a solid theoretical foundation, and experimentalists have demonstrated dozens of proof-of-principle examples of QEC. But these experiments still have not reached the level of quality and sophistication needed to reduce the overall error rate in a system.
