Illustration: Greg Mably
At the center of this will be the idea of digital convergence. That is, taking all the information—books, catalogs, shopping approaches, professional advice, art, movies—and taking those things in their digital form, ones and zeroes, and being able to provide them on demand on a device looking like a TV, a small device you carry around, or what the PC will evolve into.

Microsoft founder Bill Gates is famous for many things, including being the world’s richest person (until recently, anyway) and being the world’s unofficial Alpha Geek. However, in tech circles he’s also known for his long-standing belief that, as he once put it, “any piece of information you want should be available to you.” The idea wasn’t new—as far back as the 1970s, the motto of the Information Industry Association was “Putting Information at Your Fingertips”—but Gates championed it as early as 1989, and he was in a position to do something about it. It remained his overriding goal for the next two decades.

In fact, you could argue that IAYF (as the cooler geeks now call it) has been the goal for the entire tech sector for the past 20 years, particularly since the Internet broke out of its academic cloister and started cavorting in the mainstream. But a funny thing happened between then and now: Quietly and without much fuss, this seemingly futuristic goal has pretty much become a reality. Wondering if that restaurant you see from your car window is any good? Ask your car’s GPS system. Somebody at dinner claims that Dustin Hoffman was in Star Wars? Whip out your iPhone and look it up in the Internet Movie Database—that information is iPhoneable.

Quantum Error Correction: Time to Make It Work

If technologists can’t perfect it, quantum computers will never be big

Illustration: Chad Hagen

Dates chiseled into an ancient tombstone have more in common with the data in your phone or laptop than you may realize. They both involve conventional, classical information, carried by hardware that is relatively immune to errors. The situation inside a quantum computer is far different: The information itself has its own idiosyncratic properties, and compared with standard digital microelectronics, state-of-the-art quantum-computer hardware is more than a billion trillion times as likely to suffer a fault. This tremendous susceptibility to errors is the single biggest problem holding back quantum computing from realizing its great promise.

Fortunately, an approach known as quantum error correction (QEC) can remedy this problem, at least in principle. A mature body of theory, built up over the past quarter century, now provides a solid foundation, and experimentalists have demonstrated dozens of proof-of-principle examples of QEC. But these experiments still have not reached the level of quality and sophistication needed to reduce the overall error rate in a system.
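
To make the idea concrete, here is a minimal sketch, in Python, of the textbook three-qubit bit-flip repetition code, the simplest illustration of how error correction works: redundant encoding plus syndrome measurements that locate an error without disturbing the encoded information. It is not one of the codes this article goes on to describe, and because it handles only bit-flip errors its bookkeeping can be simulated classically; the error probability p and the trial count are arbitrary illustrative values.

```python
import random

# A toy, purely classical simulation of the three-qubit bit-flip repetition
# code -- the textbook warm-up example of quantum error correction, not one
# of the schemes discussed in the article. Each trial encodes one logical
# bit into three physical qubits, applies independent bit flips with
# probability p, measures the two parity-check syndromes, and corrects the
# single qubit the syndrome points to.

def run_trial(p):
    """Return True if the logical bit survives one noisy cycle."""
    flips = [random.random() < p for _ in range(3)]
    # Syndrome extraction: parities of qubit pairs (0,1) and (1,2) locate a
    # single flipped qubit without revealing the logical state itself.
    s01 = flips[0] ^ flips[1]
    s12 = flips[1] ^ flips[2]
    if s01 and s12:
        flips[1] ^= True   # both checks fire: the middle qubit flipped
    elif s01:
        flips[0] ^= True   # only the first check fires: qubit 0 flipped
    elif s12:
        flips[2] ^= True   # only the second check fires: qubit 2 flipped
    # Two or more simultaneous flips fool the majority vote -> logical error.
    return not any(flips)

def logical_error_rate(p, trials=100_000):
    """Estimate the post-correction (logical) error probability."""
    failures = sum(not run_trial(p) for _ in range(trials))
    return failures / trials

if __name__ == "__main__":
    for p in (0.01, 0.05, 0.10):
        print(f"physical error rate {p:.2f} -> logical error rate "
              f"{logical_error_rate(p):.4f}")
```

For small p the simulated logical error rate comes out close to 3p², since two simultaneous flips are needed to fool the majority vote, so a 1 percent physical error rate becomes roughly a 0.03 percent logical one. Real QEC codes must also handle phase errors and imperfect syndrome measurements, which is where the experimental difficulty described above lies.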
