IBM Reveals 8-Bit Analog Chip With Phase-Change Memory

Researchers used the chip to test a simple neural net and identify numerals with 100 percent accuracy

At the 2018 IEEE International Electron Devices Meeting (IEDM), IBM scientists presented research on a new type of in-memory computing device that can compute at energy levels 100 to 1,000 times lower than today's technology.
Photo: IBM

Today at the IEEE International Electron Devices Meeting in San Francisco, IBM reported a new 8-bit analog chip. But the real development is less about analog chips catching up to their digital peers and more about a radical rethink of chip architecture: this chip is the first to perform 8-bit calculations right where information is stored.

In traditional von Neumann chip architecture, data constantly shuttles between memory and processing, which consumes valuable energy and time, says Abu Sebastian, the lead researcher on this work, from IBM Zurich. In-memory calculations are the logical next step for reducing power consumption while increasing performance. These gains are necessary for hardware to keep up with advancements in artificial intelligence.

IBM’s new analog chip is based on phase-change memory. The key ingredient is a material that can undergo phase changes in response to electrical current. Typically, these are alloys of germanium, tellurium, and antimony. In one phase, which is conductive, the atoms are lined up nicely. In the other phase, which doesn’t conduct electricity, the atoms move around, heated locally by current, and become jumbled.

A phase-change material held between two electrodes doesn’t switch completely between ordered and jumbled arrangements like ones and zeros. Instead, at any point in time, there is a mix of both: The overall resistance of the material is determined by the size of the regions where atoms are jumbled.

“We’re coding information in terms of atomic arrangements,” says Sebastian. The weights of a neural network, for example, can be stored and accessed as the resistance in a phase-change memory device.
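The arithmetic this enables can be sketched with a toy model (illustrative values only, not IBM's device parameters): when each weight is stored as a conductance and an input is applied as a voltage, every device passes a current per Ohm's law, and the currents summing on a shared wire perform a dot product in place.

```python
# Sketch: analog multiply-accumulate in a resistive memory array.
# Weights are stored as conductances G (siemens, hypothetical values);
# inputs are applied as voltages V. Each device passes I = G * V, and
# currents summing on a shared wire implement the weighted sum.
inputs_v = [0.2, 0.5, 0.1]       # input voltages, one per device
weights_g = [1e-6, 3e-6, 2e-6]   # device conductances encoding weights

column_current = sum(g * v for g, v in zip(weights_g, inputs_v))
print(column_current)            # total current = analog dot product
```

No data moves between a separate memory and processor here; the physics of the array does the multiply-accumulate.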

But these resistances suffer from drift and fluctuation. Because current passes through the phase-change material when information is read, the jumbled regions change a little bit every time—which has limited the precision and practicality of such devices.
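The drift problem follows a well-known empirical power law for amorphous phase-change material (a textbook model, not IBM's measurements): the programmed resistance creeps upward over time, so a value read later no longer matches the value written.

```python
# Sketch of resistance drift in phase-change memory (generic power-law
# model; the exponent nu here is a hypothetical, typical value).
def drifted_resistance(r0, t, t0=1.0, nu=0.05):
    """Resistance at time t, given resistance r0 measured at time t0."""
    return r0 * (t / t0) ** nu

r0 = 1.0e5                          # ohms, hypothetical programmed value
print(drifted_resistance(r0, 1.0))  # right after programming
print(drifted_resistance(r0, 1e6))  # substantially higher after drift
```

Even a small exponent compounds over the long times a stored weight must survive, which is what erodes precision.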

To circumvent this problem, the IBM researchers introduced a so-called projection segment to the phase-change memory device. First proposed in 2015 by the same team, the projection segment is a conducting layer of metal nitride that wraps around a phase-change material core and runs parallel to it between electrodes. The projection segment separates the information writing and reading processes.

This projection segment does nothing when information is being written; all the current runs through the phase-change material to tweak the jumbled regions. But when information is being retrieved, the current flows through the projection segment and around the jumbled regions, leaving them untouched and preserving the information that’s stored. “That is the key innovation here,” says Sebastian.

The researchers tested a single-layer neural net on an 8-bit chip composed of 30 phase-change memory devices, identifying pictures of the digits 1, 0, and 4 with 100 percent classification accuracy. While it is still early days, Sebastian estimates the advance could bring 100- to 1,000-fold power savings to future devices, compared with traditional computing.
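A single-layer net of this kind can be sketched in a few lines (a toy stand-in with made-up 5-pixel "images," not IBM's actual test set or weights): one weight vector per digit, a weighted sum per class, and the highest score wins.

```python
# Toy single-layer classifier for the digits 1, 0, and 4.
# Prototype patterns and weights are hypothetical illustrations.
prototypes = {
    "1": [0, 1, 0, 1, 0],
    "0": [1, 1, 1, 1, 1],
    "4": [1, 0, 1, 1, 0],
}

def bipolar(bits):
    """Map {0,1} pixels to {-1,+1} so absent pixels also carry signal."""
    return [2 * b - 1 for b in bits]

weights = {digit: bipolar(p) for digit, p in prototypes.items()}

def classify(image):
    x = bipolar(image)
    scores = {digit: sum(wi * xi for wi, xi in zip(w, x))
              for digit, w in weights.items()}
    return max(scores, key=scores.get)

print(classify([1, 0, 1, 1, 0]))  # -> 4
```

On hardware like IBM's, each weighted sum would be computed in the analog domain by the memory devices themselves rather than by a loop like this.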

Traditional computing prized precision, but artificial intelligence is driving an opposite trend. IBM is also reporting a digital chip today that trains neural nets at 8-bit precision without losing accuracy. That more closely models the human brain, which can often draw correct conclusions from little information.
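What 8-bit precision means in practice can be shown with generic uniform quantization (a standard technique, not necessarily IBM's scheme): each real-valued weight is rounded to the nearest of 256 levels, and the rounding error is bounded by half a quantization step.

```python
# Sketch: uniform 8-bit quantization of a weight in [-w_max, w_max].
def quantize_8bit(w, w_max=1.0):
    """Round w to the nearest of 256 evenly spaced levels."""
    step = 2 * w_max / 255
    level = round((w + w_max) / step)
    return level * step - w_max

w = 0.37219                # hypothetical full-precision weight
wq = quantize_8bit(w)
print(wq)                  # nearest 8-bit level; error under half a step
```

The bet behind low-precision AI hardware is that neural nets, like the brain, tolerate this rounding error while the chip saves energy on every operation.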

IBM’s vice president for research, Jeff Welser, likens this to looking out a foggy window and seeing a blurry person walking toward your house. “As soon as you recognize your mom, it doesn’t matter how low-precision the image is,” says Welser. “You’ve got the right information you need.”
