Get on the Optical Bus

IBM’s light-powered links overcome the greatest speed bump in supercomputing: interconnect bandwidth

10 min read
Illustration: McKibillo

Sad but true: About three-quarters of the time, your computer processor is doing nothing more than waiting for data—the cybernetic equivalent of twiddling one’s thumbs. It doesn’t matter whether you’ve got the latest processor, surrounded it with high-speed RAM, or lovingly hot-rodded your system with the latest in liquid cooling. Your speed is primarily set not by the processing power you have but by the connections that stand between that processor and the data it needs.

The problem is that data transfer is accomplished by the movement of an electronic signal along old-fashioned copper wires—the same basic phenomenon that a century and a half ago carried news of the U.S. Civil War over telegraph lines. It’s time we saw the light—literally—and stopped shackling ourselves to electrons moving along copper conductors.

New AI Speeds Computer Graphics by Up to 5x

Neural rendering harnesses machine learning to paint pixels

5 min read
Nvidia Instant NeRF uses neural rendering to generate 3D visuals from 2D images. Image: NVIDIA

On 20 September, Nvidia’s vice president of applied deep learning, Bryan Catanzaro, took to Twitter with a bold claim: In certain GPU-heavy games, like the classic first-person puzzle game Portal, seven out of eight pixels on the screen are generated by a new machine-learning algorithm. That’s enough, he said, to accelerate rendering by up to 5x.

This impressive feat is currently limited to a few dozen 3D games, but it hints at the gains neural rendering will soon deliver to everyday consumer electronics.
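As a back-of-the-envelope check on those numbers (a hypothetical cost model, not Nvidia’s published methodology): if only 1 pixel in 8 is rendered the traditional way and the network fills in the rest at some fraction of the full per-pixel rendering cost, the overall speedup follows from a simple ratio.

```python
# Hypothetical cost model: speedup when a fraction of pixels is
# produced by a cheaper neural network instead of full rendering.

def speedup(ml_fraction, ml_relative_cost):
    """Speedup versus rendering every pixel at unit cost.

    ml_fraction      -- share of pixels produced by the network
    ml_relative_cost -- network cost per pixel, relative to full rendering
    """
    rendered_share = 1.0 - ml_fraction          # pixels rendered classically
    total_cost = rendered_share * 1.0 + ml_fraction * ml_relative_cost
    return 1.0 / total_cost

# If the network were free, generating 7 of 8 pixels would give 8x:
print(speedup(7 / 8, 0.0))                  # 8.0
# A per-pixel network cost of roughly 8.6% of full rendering lands
# near the reported "up to 5x":
print(round(speedup(7 / 8, 0.0857), 2))     # 5.0
```

The gap between the naive 8x and the reported 5x is consistent with the neural pass itself costing something; the 8.6 percent figure is back-solved here for illustration, not a measured value.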


Golf Robot Learns To Putt Like A Pro

Watch out, Tiger Woods: Golfi has a mean short game

4 min read

While being able to drive the ball 300 yards might get the fans excited, a solid putting game is often what separates a golf champion from the journeymen. A robot built by German researchers is quickly becoming a master of this short game using a clever combination of classical control engineering and machine learning.

In golf tournaments, players often scout out the greens the day beforehand to think through how they are going to play their shots, says Annika Junker, a doctoral student at Paderborn University in Germany. So she and her colleagues decided to see if giving a robot similar capabilities could help it to sink a putt from anywhere on the green, without assistance from a human.


Fourth Generation Digitizers With Easy-to-Use API

Learn about the latest generation high-performance data acquisition boards from Teledyne

1 min read

In this webinar, we explain the design principles and operation of our fourth-generation digitizers with a focus on the application programming interface (API).

Register now for this free webinar!
