
Steve Kirsch

As a junior high kid, he attended the birth of the Internet; in college, he invented the optical mouse; now he's launching a company to sell 'e-commerce in a box'

Computer centers were intimidating places in 1969. Machines were huge, locked in air-conditioned rooms, and fed with punched cards. Time on them did not come cheaply and was tightly rationed. And the computer room in Boelter Hall at the University of California, Los Angeles (UCLA), was no exception.

In one corner of the room sat a state-of-the-art Scientific Data Systems Sigma 7 computer. Off limits to the countless engineering students, it was reserved for a small group of researchers funded by the Defense Advanced Research Projects Agency (DARPA) and busy inventing the technology that would evolve into the Internet.

From WinZips to Cat GIFs, Jacob Ziv’s Algorithms Have Powered Decades of Compression

The lossless-compression pioneer received the 2021 IEEE Medal of Honor

Photo: Rami Shlush

Lossless data compression seems a bit like a magic trick. Its cousin, lossy compression, is easier to comprehend. Lossy algorithms are used to get music into the popular MP3 format and turn a digital image into a standard JPEG file. They do this by selectively removing bits, drawing on what scientists know about the way we see and hear to determine which bits we'd least miss. But no one would claim that the resulting file is a perfect replica of the original.

Not so with lossless data compression. Bits do disappear, making the data file dramatically smaller and thus easier to store and transmit. The important difference is that the bits reappear on command. It's as if the bits are rabbits in a magician's act, disappearing and then reappearing from inside a hat at the wave of a wand.
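That round trip can be sketched in a few lines of Python using the standard library's zlib module, whose DEFLATE format builds on the LZ77 scheme Ziv co-developed. The sample data here is illustrative; any repetitive byte string will show the same effect.

```python
import zlib

# Repetitive data compresses well under DEFLATE, which combines
# LZ77-style dictionary matching with Huffman coding.
original = b"abracadabra " * 1000

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

# The compressed form is far smaller, yet decompression
# recovers every bit of the original exactly.
print(len(original), len(compressed))
assert restored == original
```

The defining property of lossless compression is that final assertion: unlike an MP3 or JPEG, the decompressed output is bit-for-bit identical to the input.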
