Low-Power AI Startup Eta Compute Delivers First Commercial Chips

The firm pivoted away from riskier spiking neural networks using a new power management scheme


When Eta Compute began, it was among the few adherents of spiking neural networks (SNNs) as the low-power path to AI for small, battery-constrained sensors and gadgets. But even as the startup was showing off its successes with the approach, it was realizing SNNs were not ready for prime time.

Using an unrelated technology the company had in development, Eta Compute pivoted toward more traditional neural networks such as deep learning and is reaping the rewards. The Westlake Village, Calif.-based company revealed on Wednesday that its first production chips using that technology are now shipping.

“We've got a number of customers lined up working on a number of different projects,” says CEO Ted Tewksbury. “These customers have just been waiting for the production silicon and as soon as we get it in their hands, we are confident that we're going to start to sell.”

The chip, called the ECM 3532, consumes as little as 100 microwatts and is designed to perform AI-enabled tasks such as identifying objects, recognizing wake words or sounds, and analyzing data from a variety of sensors. Eta Compute’s technology “is orders of magnitude more power-efficient than any other technology I have seen to date, and it will certainly make AI at the edge a reality,” said Jim Feldhan, president and founder of Semico Research, in a press release.

The chip is emblematic of a drive to provide AI to “edge” devices, such as battery-powered cameras, microphones, and other IoT sensors. Part of the attraction is that battery power can be saved by not having to transmit a continuous stream of data to the cloud. For example, a smart building control system could use a low-power, low-resolution camera to determine if a room is occupied. Rather than send a raw stream of video to an off-site computer, AI embedded in the camera could simply report the number of people it detects.

The ECM 3532 is a system-on-chip built around an Arm Cortex-M3 processor core and an NXP CoolFlux DSP core. But the key is an in-house technology called continuous voltage and frequency scaling, or CVFS. CVFS allows the system to throttle the voltage and frequency of each core independently. Lowering a circuit's operating voltage and clock frequency saves power, but it also slows computation. Because sensor data tends to be episodic or "lumpy," however, the ECM 3532 can run at low voltage and low frequency most of the time. For example, an alarm system microphone waiting to hear the sound of breaking glass would have the ECM 3532 running at low voltage and low frequency most of the time.

CVFS might sound similar to a microprocessor technology called dynamic voltage and frequency scaling (DVFS), but there are key differences, explains Tewksbury. DVFS allows only a set of discrete voltages and frequencies. But in CVFS voltage and frequency can range over a continuum. An algorithm constantly examines the cores’ incoming workloads and determines the voltage and frequency they require to handle the work using a minimum of energy.
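The advantage of a continuum over discrete operating points can be sketched with the textbook dynamic-power relation for CMOS logic, which is proportional to capacitance times voltage squared times frequency. The sketch below is purely illustrative: the function names, the linear voltage-frequency model, and the discrete DVFS steps are assumptions for the example, not details of Eta Compute's actual scheme.

```python
# Illustrative comparison of continuous (CVFS-style) vs. discrete (DVFS-style)
# voltage/frequency scaling. All constants and models here are hypothetical.

def voltage_for_frequency(freq_mhz, v_min=0.4, v_max=1.2, f_max=100.0):
    """Assume supply voltage must rise roughly linearly with clock frequency."""
    return v_min + (v_max - v_min) * (freq_mhz / f_max)

def dynamic_power(freq_mhz, voltage, capacitance=1.0):
    """Dynamic CMOS power is proportional to C * V^2 * f (arbitrary units)."""
    return capacitance * voltage ** 2 * freq_mhz

def cvfs_power(workload_mhz):
    """Continuous scaling: run at exactly the frequency the workload needs."""
    return dynamic_power(workload_mhz, voltage_for_frequency(workload_mhz))

def dvfs_power(workload_mhz, steps=(25.0, 50.0, 100.0)):
    """Discrete scaling: round up to the nearest supported operating point."""
    freq = min(f for f in steps if f >= workload_mhz)
    return dynamic_power(freq, voltage_for_frequency(freq))

# A light, "lumpy" sensor workload needing only 30 MHz of compute:
# continuous scaling runs at 30 MHz, while the discrete scheme must jump
# to its 50 MHz step and burn the extra voltage-squared cost.
print(cvfs_power(30.0))  # smaller
print(dvfs_power(30.0))  # larger
```

Under this toy model, the continuous scheme always matches or beats the discrete one, and the gap is largest when the workload falls just above a discrete step, which is the kind of in-between demand an episodic sensor stream produces.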

Power efficiency was part of Eta Compute’s original attraction to spiking neural networks. In a demo of Eta Compute’s TENSAI chip in 2018, the network recognized a cheetah in a video using fewer than 1,000 pixels, while a convolutional neural network needed 100,000. But even as Eta Compute was demonstrating TENSAI’s capabilities there were warning signs that it wasn’t the way to go.

For one, the asynchronous logic—where computing proceeds without a clock—needed for the spiking network took up a lot of chip area, a problem if your target market is small, battery-driven systems. Another problem was that the network’s efficiency, while excellent at lower frequencies, dropped off above 5 megahertz. And finally, the state of spiking neural network algorithms “was nowhere near ready for prime time,” says Tewksbury.

Eta Compute made the announcement regarding the ECM 3532 at the TinyML Summit in San Jose.
