Chip Design Thwarts Sneak Attack on Data

Cache architecture harnesses the power of randomization

Old Way:

(left) Data from main memory is rigidly organized in the cache. When one section of the cache is full, new data sent to it kicks out the old data.  

Safe Way:

(right) Data from main memory is randomly organized in the cache, so there is less of a chance that incoming data will need to kick out old data.

Data encrypted on your computer should be safe, as long as you’re the only one with the key to the encryption. But one variant of newer, sneakier attacks can deduce the key by striking a vulnerable spot: the CPU’s on-chip memory, called the cache. Software-based attempts to bolster cache security, however, can severely degrade the computer’s performance. Now researchers have developed a new kind of cache architecture that neutralizes these attacks. They found that combining the best qualities of the two main types of cache yielded a striking result: a secure cache that also handles data faster and consumes less power.

The new technology, called Newcache, developed at Princeton University by electrical engineering professor Ruby Lee and her graduate student, Zhenghong Wang, foils these so-called cache side-channel attacks by randomizing where data is stored in the cache.

A CPU’s cache copies data that the processor uses frequently from a computer’s main memory and temporarily stores it. This greatly reduces the time needed to retrieve that data, which speeds up processing.

A cache organizes the incoming data logically to make it easy to find. “If you throw data just anywhere into your cache,” says IEEE Fellow Mark Hill, a computer science professor at the University of Wisconsin–Madison, “when you need to find it, you have to look for it everywhere.” So caches use an addressing system that can be conceptually compared to a rudimentary address book. “Your friend Alice will be stored in the ABC section or nowhere,” he explains. The good news: Finding Alice is easy. The bad news: Because the cache is comparatively small, there is room for only one entry in the ABC section. Adding your friend Bob would kick Alice out of the cache. This is called a conflict miss, a term Hill coined in his 1987 Ph.D. thesis.
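The address-book analogy can be sketched in a few lines of code. This is a toy direct-mapped cache, not any real CPU design: each address maps to exactly one slot, so lookups are fast, but two addresses that share a slot evict each other, producing a conflict miss. All names and sizes here are illustrative.

```python
NUM_SLOTS = 8  # a tiny cache for illustration

class DirectMappedCache:
    def __init__(self):
        self.slots = [None] * NUM_SLOTS
        self.conflict_misses = 0

    def access(self, address):
        # The address alone determines the one slot it may occupy --
        # like "Alice goes in the ABC section or nowhere."
        slot = address % NUM_SLOTS
        if self.slots[slot] == address:
            return "hit"
        if self.slots[slot] is not None:
            self.conflict_misses += 1  # old data gets kicked out
        self.slots[slot] = address
        return "miss"

cache = DirectMappedCache()
cache.access(0x10)  # "Alice": a miss that fills slot 0x10 % 8 = 0
cache.access(0x10)  # a hit -- finding Alice is easy
cache.access(0x18)  # "Bob" also maps to slot 0, so Alice is evicted
print(cache.conflict_misses)  # 1
```

The speed comes from never having to search: every lookup checks a single slot. The fragility comes from the same property, as the next section shows.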

Here’s how a side-channel attack works to exploit that organization. An attacker fills the address book with his own entries, including, say, Bob. When you add a new ABC entry, Alice for instance, Bob is booted out. The attacker doesn’t know that the new entry is Alice, because of the encryption. But he can deduce that the new entry belongs in the ABC group, because it was Bob that was booted. By observing millions of conflict misses and using some algorithmic magic, an attacker can figure out the key that encrypts your data.
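The attack described above can be sketched as a toy "prime and probe" on the same kind of hypothetical 8-slot direct-mapped cache. The attacker never reads the victim’s encrypted data; he only observes which of his own entries got evicted, and that alone reveals which slot, and hence which address bits, the victim touched. The addresses and slot count here are made up for illustration.

```python
NUM_SLOTS = 8
slots = [None] * NUM_SLOTS

def access(owner, address):
    """Load an address into the cache, recording who owns each slot."""
    slot = address % NUM_SLOTS
    evicted = slots[slot]
    slots[slot] = (owner, address)
    return evicted

# 1. Prime: the attacker fills every slot with his own entries ("Bob" et al.).
for a in range(NUM_SLOTS):
    access("attacker", a)

# 2. The victim makes one access whose address depends on the secret key.
secret_address = 0x2B  # hypothetical key-dependent address
access("victim", secret_address)

# 3. Probe: the attacker checks his entries; the one that was booted
#    tells him which slot -- i.e., which low address bits -- the victim used.
leaked_slot = next(s for s in range(NUM_SLOTS) if slots[s][0] != "attacker")
print(leaked_slot)  # 3, which equals 0x2B % 8
```

Repeat this over millions of key-dependent accesses, and the leaked slot numbers give the attacker the statistical raw material for recovering the key.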

The way Lee’s new hardware design solves this issue is subtle, Hill says. From the outside, the new cache system appears to map entries as a standard cache would—Alice in ABC, Doug in DEF, and so forth. But what it’s really doing is putting new data into available slots in any part of the cache. That means Alice, Bob, and Carol can be stored simultaneously, because now Bob might populate the MNO slot and Alice the STU slot. Conflict misses will still happen when the cache is full, but they no longer carry the information a hacker could use to crack the code.
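The principle behind that fix can be sketched as well. In this toy version, assumed for illustration only (Newcache’s actual remapping hardware is more sophisticated), data may land in any free slot, with a side table so lookups still work. Evictions happen only when the cache is genuinely full, and the evicted slot is random, so it no longer encodes the victim’s address bits.

```python
import random

NUM_SLOTS = 8

class RandomizedCache:
    def __init__(self):
        self.slots = [None] * NUM_SLOTS
        self.where = {}  # address -> slot, so finding data stays fast

    def access(self, address):
        if address in self.where:  # hit: look up the remembered slot
            return None
        free = [s for s, v in enumerate(self.slots) if v is None]
        # Place new data in any free slot; evict at random only when full.
        slot = random.choice(free) if free else random.randrange(NUM_SLOTS)
        evicted = self.slots[slot]
        if evicted is not None:
            del self.where[evicted]
        self.slots[slot] = address
        self.where[address] = slot
        return evicted

cache = RandomizedCache()
# Alice, Bob, and Carol coexist even though their addresses would all
# collide in one slot of a conventional cache (all congruent mod 8).
for addr in (0x00, 0x08, 0x10):
    cache.access(addr)
print(len(cache.where))  # 3 -- no conflict misses
```

Because placement is decoupled from the address, an attacker who observes an eviction learns nothing about which entry the victim loaded, while the reduced conflict misses are exactly where the performance gain comes from.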

Before Lee’s work, no cache architectures had been designed specifically to fend off attacks. The only previous hardware security solutions revolved around what Lee calls “unacceptable brute-force solutions” that slowed computers down. Because it reduces the number of conflict misses, Newcache improves performance even for applications that don’t need security. It also improves power consumption and fault tolerance and keeps parts of the chip cooler, she says. “The key point is that Newcache improves performance as it improves security,” Lee says.

Lee, who is also an IEEE Fellow, says several processor firms have already expressed interest in the technology. But its adoption will depend on more than the fact that it works better and keeps data more secure, says David Kanter, a consultant at Real World Technologies. “That means evaluating the overall costs and benefits of a secure cache, in terms of design complexity, performance, verification, and validation—not just whether it makes your cache more secure.”
