Neural networks are artificial intelligence systems that excel at interpreting images, which makes them promising for helping drones and robots navigate and for analyzing surveillance footage. But they are typically power-hungry, which has limited their use so far. Vision-processor company Movidius of San Mateo, Calif., hopes to change that with a low-power chip designed to run neural networks. The neural net accelerator, called Fathom, comes on a USB stick, uses only 1 watt of power, and can run most visual neural nets.
Making a low-power system that can run computationally intensive neural networks is a challenge. Neural networks make sense of images in a way that’s loosely analogous to the human brain: during a training period, millions of labeled images teach an algorithm to recognize human faces, dogs, or trees, for example. These systems analyze images in several steps, or layers, first finding objects, then identifying them as, say, trees or people, then recognizing a known person, then judging whether that person is angry or distressed. This typically requires a lot of power-draining data transfers.
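The layered pipeline described above can be sketched in toy code. This is my illustration, not Movidius software: an early convolutional layer extracts local features from the image, and a later layer turns those features into class probabilities. All weights and shapes here are made-up values chosen only to show the structure.

```python
# Toy sketch of a layered visual neural net: feature extraction, then classification.
import math
import random

random.seed(0)

def conv2d(image, kernel):
    """Naive 'valid' 2-D convolution: an early layer that detects local features."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

def relu(grid):
    """Nonlinearity applied between layers."""
    return [[max(v, 0.0) for v in row] for row in grid]

def softmax(scores):
    """Turn raw class scores into probabilities."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# An 8x8 "image" of random pixels and a crude edge-detecting kernel.
image = [[random.random() for _ in range(8)] for _ in range(8)]
edge_kernel = [[1.0, -1.0], [1.0, -1.0]]

features = relu(conv2d(image, edge_kernel))          # layer 1: find features
flat = [v for row in features for v in row]
weights = [[random.gauss(0, 1) for _ in flat] for _ in range(3)]  # 3 toy classes
logits = [sum(w * x for w, x in zip(wrow, flat)) for wrow in weights]
probs = softmax(logits)                              # layer 2: classify
```

Real networks like GoogLeNet stack dozens of such layers, which is why every intermediate result that has to travel to and from memory costs power.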
The Fathom, which holds Movidius’s Myriad 2 chip, does things differently. The Myriad 2 uses twelve parallel processors, each with a dedicated memory bank. “We have a mindset of not pushing the clock speed,” says Cormac Brick, head of machine learning at Movidius. Instead of running one or a handful of processors harder to do calculations faster, Movidius chose twelve running in parallel. The chip also saves power by minimizing data transfers. “We keep the data really close to where it’s being processed—sometimes moving the data can use more power than processing the data,” says Brick. This strategy is similar to those used by research groups designing mobile processors of this type, but Fathom will be the first of its kind to market, says Brick.
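The two ideas Brick describes can be sketched in a few lines of toy code. This is my illustration, not the actual chip design: the data is partitioned once into twelve local "banks," each worker reduces only its own bank, and only a tiny result, rather than the whole image, has to move to be combined.

```python
# Toy sketch of parallel workers with dedicated local memory banks.
NUM_WORKERS = 12  # the Myriad 2 has twelve parallel processors

pixels = list(range(12 * 64))  # stand-in for image data

# Partition once: worker k "owns" bank k and never touches the others.
banks = [pixels[k::NUM_WORKERS] for k in range(NUM_WORKERS)]

def process_locally(bank):
    """Reduce a bank where it lives; only one scalar leaves the local memory."""
    return sum(bank)

partial = [process_locally(b) for b in banks]  # twelve independent workers

# Only 12 numbers cross the "chip" to be combined, not 768 pixels.
total = sum(partial)
```

The point of the sketch is the traffic pattern: twelve slower workers each touch only local data, so the expensive step, moving data around, happens as little as possible.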
The accelerator can run neural networks like GoogLeNet at 1 watt. By comparison, NVIDIA’s TX1 runs on a minimum of 4 watts and draws 1 watt even when idling; the Myriad 2 idles at 0.12 watt. What’s more, the Fathom doesn’t require a heat sink or any other cooling system, which is part of why it’s small enough to fit on a USB stick.
Brick says this mobile-friendly system should make it practical to run neural networks in more places. A Fathom stick loaded with the right neural net could help individual surveillance cameras flag problems that usually take a human watching the footage to notice: alerting a home user that his father has fallen and can’t get up, or alerting airport security staff that someone is acting aggressively.
Other companies, such as Nervana Systems, want to put deep learning in the cloud. For people concerned about privacy, running neural nets on a mobile system will be preferable to uploading video footage of their homes to the cloud for analysis, says Brick. The speed of on-board neural nets will also help robots and drones navigate: the Fathom could help them respond more quickly when an obstacle moves into their path, without the latency that comes from sending data to a remote server and back.
In a Movidius press release, Yann LeCun, director of AI research at Facebook, praised the device, saying, “every robot, big and small, can now have state-of-the-art vision capabilities.”
The Fathom is not aimed at consumers who want a less clumsy drone or a smarter home security system, but at the companies and researchers developing such products. Users will have to know something about building embedded systems. Brick says the stick is compatible with the Raspberry Pi, with drone kits that have open APIs, and with kits for security cameras and robots. It can run neural nets built with the open-source software libraries TensorFlow and Caffe. Users developing new neural nets can also plug the Fathom into their personal computers to test prototypes more quickly.
Movidius will send about 1,000 Fathom sticks to researchers at universities and companies, and it encourages university teams who want to try one out to get in touch through its website. After distributing the neural-net-on-a-stick on a case-by-case basis for a while, the company will launch the product more broadly. It says the USB stick will sell for under $100 at launch.
Katherine Bourzac is a freelance journalist based in San Francisco, Calif. She writes about materials science, nanotechnology, energy, computing, and medicine—and about how all these fields overlap. Bourzac is a contributing editor at Technology Review and a contributor at Chemical & Engineering News; her work can also be found in Nature and Scientific American. She serves on the board of the Northern California chapter of the Society of Professional Journalists.