
The Coral Dev Board Takes Google’s AI to the Edge

A tensor processing unit with a Raspberry Pi–style form factor brings machine learning to makers

Photo of Google's Coral Dev Board. Photo: Randi Klett

There’s a steady drumbeat about how deep learning is going to touch nearly every area of technology, not least from us here at IEEE Spectrum. So, like a lot of folks, I’ve been interested in getting my hands dirty and learning some of the tools and techniques. But, like a lot of folks, I’ve been stymied by the difficulty of getting up and running.

I’ve tried to use TensorFlow—Google’s open-source machine-learning library—and Keras, another library that acts as a high-level interface between Python programs and machine-learning back ends like TensorFlow. But it’s been a discouraging exercise in going down software-dependency rabbit holes and sifting through fragmented and often obsolete documentation. And then, just when everything finally seems to be working, something breaks thanks to a system upgrade. As the old hacker lament goes, “You are in a maze of twisty little passages, all alike.”
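To give a sense of the "high-level interface" being described, here is a minimal sketch of what a Keras model looks like once the tooling is actually working: a tiny classifier defined in a few lines, with TensorFlow doing the heavy lifting underneath. The layer sizes and the random toy data are illustrative assumptions, not anything from this article.

```python
# Minimal Keras sketch: define, compile, and train a small classifier.
# Toy data and layer sizes are assumptions for illustration only.
import numpy as np
from tensorflow import keras

# Toy dataset: 1,000 samples with 20 features each, two classes.
x = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

# Keras describes the network declaratively; TensorFlow executes it.
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(20,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=3, batch_size=32)
```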


From WinZips to Cat GIFs, Jacob Ziv’s Algorithms Have Powered Decades of Compression

The lossless-compression pioneer received the 2021 IEEE Medal of Honor

Photo of Jacob Ziv. Photo: Rami Shlush

Lossless data compression seems a bit like a magic trick. Its cousin, lossy compression, is easier to comprehend. Lossy algorithms are used to get music into the popular MP3 format and turn a digital image into a standard JPEG file. They do this by selectively removing bits, taking what scientists know about the way we see and hear to determine which bits we'd least miss. But no one can make the case that the resulting file is a perfect replica of the original.

Not so with lossless data compression. Bits do disappear, making the data file dramatically smaller and thus easier to store and transmit. The important difference is that the bits reappear on command. It's as if the bits are rabbits in a magician's act, disappearing and then reappearing from inside a hat at the wave of a wand.
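The "reappear on command" property is easy to demonstrate. Below is a minimal sketch using Python's built-in zlib module, whose DEFLATE format builds on the LZ77 algorithm Ziv co-invented: the compressed file is much smaller, and decompression restores every byte exactly. The sample text is an illustrative assumption.

```python
# Lossless round trip: compress, decompress, and verify an exact match.
import zlib

# Repetitive sample text compresses well (illustrative data only).
original = b"the quick brown fox jumps over the lazy dog " * 100

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(f"{len(original)} bytes -> {len(compressed)} bytes")
assert restored == original  # every bit comes back, exactly
```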
