Cartesiam Hopes to Make Embedded AI Easier for Everyone

Startup’s tool should let embedded AI novices bring unsupervised learning to Arm microcontrollers

Sensors that monitor industrial machines for predictive maintenance are a key application of Cartesiam's unsupervised learning systems.
Photo: iStockphoto

French startup Cartesiam was founded in anticipation of the predicted flood of IoT sensors and products. Even a few years ago, the prevailing idea was that these tens of billions of smart sensors would deliver their data to the cloud, where AI and other software would work out what the data meant and trigger the appropriate action.

To Cartesiam's founders, as to many others in the embedded-systems space, this scheme looked a little ludicrous. “We were thinking: it doesn’t make sense,” says general manager and cofounder Marc Dupaquier. Transporting all that data was expensive in energy and money, it wasn’t secure, it added latency between an event and the needed reaction, and it put privacy at risk. So Cartesiam set about building a system that lets ordinary Arm microcontrollers run a kind of AI called unsupervised learning.

Putting AI “at the edge” is a goal of startups like Cartesiam and big companies alike, but in terms of tools, data, and expertise, the people who actually build embedded systems and program microcontrollers aren’t in a position to take advantage of it, says Dupaquier. So Cartesiam is launching a software system that securely generates AI algorithms to run on Arm microcontrollers from only two minutes of the embedded sensor’s data, occupying just 4 to 16 kilobytes of RAM.

“It allows any embedded designer to develop application-specific machine learning libraries quickly and run the program inside the microcontroller right where the signal becomes data,” says Dupaquier.

The type of machine learning involved, unsupervised learning, is key to the company’s success so far, says Dupaquier. Much of today’s machine learning, the kind that recognizes faces and reads road signs, relies on convolutional neural networks, a form of deep learning. Those networks are usually trained in data centers on a diet of thousands of examples of each of the things they are supposed to recognize. The trained network can then be ported to less-powerful computers.

This scheme presents a number of challenges to embedded systems makers, says Dupaquier. Deep learning needs lots of data: huge numbers of examples of all the things it’s supposed to discover in the real world. In the world of sensors controlled by microcontrollers, those data sets are very hard to generate, if they exist at all, he says. Data scientists who could help are both rare and expensive. And even if the data were available, less than one percent of embedded developers have AI skills, according to IDC. “Most of our clients don’t know about AI,” he says.

Unsupervised learning instead offers the chance for sensors to build “digital twins” of themselves as they operate. Using a two-minute sample of normal and aberrant operation, Cartesiam’s NanoEdge AI Studio picks the best combination of AI algorithms to build the network from. It then ports those algorithms to the embedded controller’s memory. As the sensor operates in the environment, it simultaneously learns what’s normal and watches the data for meaningful deviations from that. Eventually, it can predict problems before they arise. (Eolane’s Bob Assistant, a temperature and vibration sensor for predictive maintenance, was among Cartesiam’s first wins.)
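
In practice, the generated library boils down to a learn-then-detect loop running in firmware. The C sketch below is a hypothetical illustration of how such a library might be driven on a microcontroller; the function names, buffer sizes, and threshold are invented for this example and are not Cartesiam's actual NanoEdge AI interface.

    /* Hypothetical firmware loop driving an on-device anomaly-detection
       library of the kind Cartesiam describes. All names are illustrative. */
    #include <stdint.h>

    #define SIGNAL_LEN    256   /* samples per vibration buffer             */
    #define LEARN_BUFFERS 100   /* buffers used to model "normal" behavior  */
    #define ALERT_LIMIT    90   /* similarity score below this is anomalous */

    /* Assumed to be provided by the generated static library. */
    extern void    anomaly_init(void);
    extern void    anomaly_learn(const float signal[SIGNAL_LEN]);
    extern uint8_t anomaly_detect(const float signal[SIGNAL_LEN]); /* 0-100 */

    /* Provided elsewhere by the firmware. */
    extern void read_vibration(float out[SIGNAL_LEN]);
    extern void raise_maintenance_alert(void);

    void monitor_task(void)
    {
        static float buf[SIGNAL_LEN];
        anomaly_init();

        /* Phase 1: learn what "normal" looks like on this machine. */
        for (int i = 0; i < LEARN_BUFFERS; i++) {
            read_vibration(buf);
            anomaly_learn(buf);
        }

        /* Phase 2: watch for meaningful deviations from that baseline. */
        for (;;) {
            read_vibration(buf);
            if (anomaly_detect(buf) < ALERT_LIMIT) {
                raise_maintenance_alert();
            }
        }
    }

Having the detect call return a graded similarity score rather than a hard yes/no, with the alert threshold tuned per application, is one plausible design; the key point is that both the learning and the detection run entirely on the microcontroller.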

The neural network on the microcontroller is likely to be different for each sensor because of the peculiarities of the environment around it, explains Dupaquier. For example, the vibrations that are normal in one pipe in a water-treatment plant might be a sign of impending doom in another. “Because learning is made on device it will learn the pattern of this machine,” he says. The AI is “building a digital twin of the machine into the microcontroller.”

The desire to stuff machine learning into low-power, low-resource processors is the driving force behind a number of startups and quite a few developments by processor companies, too. Startups are using specialized computer architectures, compute-in-memory schemes, and other hardware tricks to produce chips that run deep learning and other networks at low power. Earlier this month, processor giant Arm unveiled machine-learning acceleration offerings: the Cortex-M55 core boosts ML performance up to 15-fold over earlier Cortex-M processors, and adding a separate accelerator, the Ethos-U55, boosts ML performance as much as 480-fold, according to Arm.

Cartesiam has already had the chance to test the new hardware. “This is good news for us,” says Dupaquier.


The First Million-Transistor Chip: the Engineers’ Story

Intel’s i860 RISC chip was a graphics powerhouse

Intel's million-transistor chip development team

In San Francisco on Feb. 27, 1989, Intel Corp., Santa Clara, Calif., startled the world of high technology by presenting the first ever 1-million-transistor microprocessor, which was also the company’s first such chip to use a reduced instruction set.

The number of transistors alone marks a huge leap upward: Intel’s previous microprocessor, the 80386, has only 275,000 of them. But this long-deferred move into the booming market in reduced-instruction-set computing (RISC) was more of a shock, in part because it broke with Intel’s tradition of compatibility with earlier processors, and not least because, after three well-guarded years in development, the chip came as a complete surprise. Now designated the i860, it entered development in 1986, about the same time as the 80486, the yet-to-be-introduced successor to Intel’s highly regarded 80286 and 80386. The two chips have about the same area and use the same 1-micrometer CMOS technology then under development at the company’s systems production and manufacturing plant in Hillsboro, Ore. But with the i860, then code-named the N10, the company planned a revolution.
