A Middle East Supercomputer Makes the Top 10 List for the First Time

Saudi Arabia's Shaheen II is the region's first supercomputer to compete in the big leagues

Photo: KAUST

For the first time, a system in the Middle East has earned a Top 10 spot on the Top500.org list of the world's most powerful supercomputers. Shaheen II, located at King Abdullah University of Science and Technology (KAUST), in Saudi Arabia, placed 7th in the semi-annual ranking, the results of which were announced earlier today. Shaheen II is a Cray XC40 system that cranked out 5.536 petaflops on the Linpack benchmark.

Shaheen II replaced Shaheen I, a 16-rack IBM Blue Gene/P system, in April 2015. The new machine has 6,100 sets of 32 processor cores. At KAUST, 25 percent of the university's faculty, students, and researchers rely on Shaheen II, the university said in a press release. The system is used for small- and large-scale scientific research, including global climate projects and visualizations of the brain and DNA.

Shaheen II was the only new addition to the Top 10 from the previous November 2014 list.

At the top of the list, China and the United States battled it out for the number one position, and Tianhe-2 did it again. The supercomputer, developed by the National University of Defense Technology in Guangzhou, China, held its number one title for the fifth consecutive time. No other supercomputer could beat Tianhe-2's maximum measured performance of 33.86 petaflops. The top supercomputer in the United States, Oak Ridge National Laboratory's Titan, remained in the number two spot at 17.59 petaflops.

Top500.org is celebrating its 45th list of high-performance computers. Since June of 1993, supercomputer experts, computational scientists, manufacturers, and others have helped contribute to the list. The supercomputers’ performance is evaluated on the Linpack benchmark.
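To give a sense of what a Linpack number means: the benchmark times the solution of a large dense linear system and divides the known floating-point operation count by the elapsed time. The toy sketch below (an illustration only, not the actual HPL benchmark code, which runs at vastly larger scale across thousands of nodes) applies the same arithmetic to a small NumPy solve.

```python
import time
import numpy as np

# Illustrative only: the real Linpack (HPL) benchmark solves a huge dense
# system A x = b via LU factorization; reported performance is the nominal
# operation count (~2/3 n^3 + 2 n^2 flops) divided by wall-clock time.
n = 1000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)  # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
gflops = flops / elapsed / 1e9
print(f"{gflops:.1f} gigaflops")
```

An ordinary machine manages on the order of gigaflops to tens of gigaflops on a solve like this; Shaheen II's 5.536 petaflops is roughly a million gigaflops, which conveys the scale separating a Top 10 system from a desktop.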

Here are the Top 10 from the 45th Top500 list:

  1. Tianhe-2  (33.86 petaflops)
  2. Titan (17.59 petaflops)
  3. Sequoia (17.13 petaflops)
  4. K Computer (10.51 petaflops)
  5. Mira (8.59 petaflops)
  6. Piz Daint (6.27 petaflops)
  7. Shaheen II (5.54 petaflops)
  8. Stampede (5.17 petaflops)
  9. JUQUEEN (5.01 petaflops)
  10. Vulcan (4.29 petaflops)