
Syntiant AI Chip Means Alexa Wakes Up for Small-Battery Systems

Amazon certifies Syntiant's microwatt-powered deep learning chip

2 min read
Image of the chip on a penny for size comparison.
Photo: Syntiant

Syntiant’s custom AI chip has passed Amazon’s Alexa qualification, the Irvine, Calif., startup announced on Monday. The NDP100 series of chips can recognize up to 63 words or other sensor patterns while consuming just 150 microwatts, a 200-fold improvement over what a typical microcontroller could offer, the company says. Its passing Amazon’s stringent signal-to-noise tests means Bluetooth earphones and other small-battery-operated devices will be able to listen for wake words and other commands that bring them to full power.

“This allows the Amazon Echo ecosystem to expand into most battery-powered devices, which it couldn’t previously do,” says Syntiant CEO Kurt Busch. “You really couldn’t do always-on listening with small battery-powered devices, and now you can.”

Sanjay Voleti, senior manager of device enablement with Alexa Voice Service, was impressed with Syntiant’s solution. “We’re excited to see developers begin using this technology in their devices and deliver new Alexa experiences for customers,” he said in a press release.

The chip connects directly to a digital microphone or other sensor and triggers an “interrupt” line connecting to the larger—usually sleeping—system. Once that system’s yawned and stretched, it can then interrogate the NDP100 to determine what wake word or command it heard. The chip also keeps a three-second audio buffer in case the system needs to catch up on what was said during its wakeup routine.
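The interrupt-plus-buffer handshake described above can be sketched as a toy model. To be clear, the NDP100's actual host interface is not described in this article, so the class, method names, and sample rate below are hypothetical illustrations of the general pattern, not Syntiant's real API:

```python
from collections import deque

SAMPLE_RATE_HZ = 16_000   # a typical voice-capture rate (assumption)
BUFFER_SECONDS = 3        # the article's three-second audio buffer

class WakeWordFrontEnd:
    """Toy model of an always-listening chip: keeps a rolling audio
    buffer and raises an interrupt line when a pattern is matched."""

    def __init__(self):
        # A deque with maxlen acts as a ring buffer: old samples fall off
        self.audio_buffer = deque(maxlen=SAMPLE_RATE_HZ * BUFFER_SECONDS)
        self.interrupt = False    # models the interrupt line to the host
        self.match_index = None   # which of the (up to 63) patterns fired

    def feed(self, samples, detected_index=None):
        """Stream microphone samples; optionally flag a detection."""
        self.audio_buffer.extend(samples)
        if detected_index is not None:
            self.match_index = detected_index
            self.interrupt = True

    def host_read(self):
        """The host, woken by the interrupt, asks which word was heard
        and drains the buffer to catch up on speech it slept through."""
        assert self.interrupt, "host should only read after an interrupt"
        self.interrupt = False
        return self.match_index, list(self.audio_buffer)

# Usage: stream audio, trigger a match, host wakes and catches up.
chip = WakeWordFrontEnd()
chip.feed([0] * 1000)                    # silence while the host sleeps
chip.feed([1] * 500, detected_index=7)   # a wake word is detected
word, audio = chip.host_read()           # host reads match + buffered audio
```

The ring buffer is the key design point: because the larger system takes time to wake, the front end must retain the audio spoken during that window rather than simply streaming it and losing it.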

Like most AI ASICs, the chip only performs the inferencing step of deep learning. The training happens in the cloud using the common TensorFlow software library. One of the chip’s biggest advantages for developers, according to Busch, is that it doesn’t require any compiling or optimization step between the trained network and what goes on the chip. “It’s fair to say that the guts of our chip look like the TensorFlow graph,” says Busch.
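"Inference only" means the chip just evaluates a fixed network whose weights were learned elsewhere: a forward pass of multiply-accumulates through the trained graph, with no training or graph compilation on the device. A minimal, purely illustrative sketch of such a forward pass in plain Python (the tiny network and weights here are invented for illustration, not Syntiant's actual model or format):

```python
import math

def relu(x):
    return [max(0.0, v) for v in x]

def dense(x, weights, biases):
    # One fully connected layer: y[j] = sum_i x[i] * W[i][j] + b[j]
    return [sum(xi * w for xi, w in zip(x, col)) + b
            for col, b in zip(zip(*weights), biases)]

def softmax(x):
    exps = [math.exp(v - max(x)) for v in x]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical weights that a cloud-side training run might produce;
# on-device work is only the forward pass below.
W1 = [[0.5, -0.2], [0.1, 0.8]]   # 2 inputs  -> 2 hidden units
b1 = [0.0, 0.1]
W2 = [[1.0, -1.0], [-0.5, 0.5]]  # 2 hidden  -> 2 output classes
b2 = [0.0, 0.0]

def infer(features):
    hidden = relu(dense(features, W1, b1))
    return softmax(dense(hidden, W2, b2))

probs = infer([0.3, 0.7])   # class probabilities summing to 1
```

Busch's point about skipping the compile step would mean the weights and layer structure map directly onto the chip's hardware, rather than being translated by a vendor toolchain first.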

Busch says Syntiant is now working on a larger, more capable version of the chip to expand into more markets. It is also developing a version that uses analog computation and embedded flash memory to do AI computations with less power.


Will AI Steal Submarines’ Stealth?

Better detection will make the oceans transparent—and perhaps doom mutually assured destruction

11 min read
A photo of a submarine in the water under a partly cloudy sky.

The Virginia-class fast attack submarine USS Virginia cruises through the Mediterranean in 2010. Back then, it could effectively disappear just by diving.

U.S. Navy

Submarines are valued primarily for their ability to hide. The assurance that submarines would likely survive the first missile strike in a nuclear war and thus be able to respond by launching missiles in a second strike is key to the strategy of deterrence known as mutually assured destruction. Any new technology that might render the oceans effectively transparent, making it trivial to spot lurking submarines, could thus undermine the peace of the world. For nearly a century, naval engineers have striven to develop ever-faster, ever-quieter submarines. But they have worked just as hard at advancing a wide array of radar, sonar, and other technologies designed to detect, target, and eliminate enemy submarines.

The balance seemed to turn with the emergence of nuclear-powered submarines in the early 1960s. In a 2015 study for the Center for Strategic and Budgetary Assessments, Bryan Clark, a naval specialist now at the Hudson Institute, noted that the ability of these boats to remain submerged for long periods of time made them “nearly impossible to find with radar and active sonar.” But even these stealthy submarines produce subtle, very-low-frequency noises that can be picked up from far away by networks of acoustic hydrophone arrays mounted to the seafloor.
