16 April 2009—Last week, a spot check of electric grid systems revealed that hackers had infiltrated the U.S. electric grid. The government inspections, motivated by a 2007 Idaho National Laboratory demonstration of the vulnerabilities of the U.S. grid, revealed more than the inspectors had bargained for: The invaders had left behind potentially disruptive malware. A former U.S. official told the Associated Press that the culprit was “almost without a doubt” state sponsored, and a follow-up listed Russia and China as suspects (although the Chinese government emphatically denied the charge this week). The increasing threat from better-financed hackers, the growing need to build security into a chip at the start of the chip-design process (rather than as an afterthought), and the blurring line between military and civilian targets have been at the center of many U.S. Department of Defense concerns. The mounting hysteria has led the government to pour millions into the problem, and on 1 April, Congress introduced a bill that would let the president declare a “cybersecurity emergency,” shutting down Web traffic to compromised infrastructure such as the power grid.
But the answer could be found in something decidedly less grandiose: Last month, Pleasanton, Calif.–based CPU Tech introduced into the commercial market a secure processor that had previously been available only for military systems. The Acalis CPU872 is the first microprocessor born of new methods the Pentagon learned from its hunt for secret kill switches in the commercial chips the agency buys. But beyond just defense contractors, CPU Tech is targeting commercial users of PowerPC processors at big firms and agencies including those responsible for securing public infrastructure, such as electric power generators and subway systems.
CPU Tech vice president Pat Hays says the Acalis CPU872 has three features that other secure processors don’t: First, instead of being integrated on the same board as the processor or functioning as a coprocessor like a graphics-processing unit, the Acalis puts the security muscle on the same chip as the processor. Second, each chip is manufactured at a trusted foundry certified by the U.S. Department of Defense. Finally, the security key—the “secret handshake” that secure chips create to make sure trusted sources can gain access and other sources can’t—is generated on the chip in a way that thwarts two of the most common attacks on secure systems.
“We made sure a chip would be protected from people like us,” says Hays, referring to CPU Tech’s other business—reverse-engineering chips for military contractors. As a subcontractor on Trust in Integrated Circuits, a program of the Defense Advanced Research Projects Agency (DARPA), CPU Tech gleaned knowledge that it applied to the design of the CPU872 processor. “Our work, in fact, laid the groundwork for TIIC,” says CPU Tech founder Ed King.
By necessity, the U.S. military uses many commercial off-the-shelf chips manufactured outside the United States, because much of the integrated-circuit industry has shifted offshore. Security experts have written extensively about the risk this creates for the country’s national security, as malicious hardware and software modifications can theoretically be introduced during the manufacturing process. But the line between military and civilian targets has blurred significantly in the United States as well as in other Western countries. Hacking into civilian infrastructure could lay the groundwork for organized, government-coordinated warfare. This goes beyond conspiracy theory: A September 2008 Idaho National Laboratory report found a link between the Russian attack on Georgia and a preceding wave of malicious cyberactivity, primarily distributed denial-of-service (DDoS) attacks. Even off-line structures are vulnerable to online sabotage. The same study established the alarming feasibility of gaining remote access to electricity generators that are commonly thought to be immune to online threats.
Hays says the chip’s onshore pedigree reduces the risk of Trojan horses being built into the hardware during the manufacturing process. The 90-nanometer processor, with security features built directly into the hardware, is fabricated at IBM’s Fishkill, N.Y., foundry—a trusted foundry that has gone through an exhaustive vetting process to be deemed secure enough to manufacture the U.S. military’s specialty chips.
But despite the built-in security, Hays, who was in Bell Labs’ digital signal processing division in the 1980s, says the Acalis chip won’t divulge its secrets, even if it’s reverse-engineered down to the mask layer. The goal is to ensure that physical scrutiny yields nothing if a chip falls into the wrong hands. In 2001, a U.S. Navy signals reconnaissance plane crashed on an island in the South China Sea. The plane was not returned to the United States until three months later, and when it was, it was in pieces. Military experts have speculated that billions of dollars in U.S. military R&D were compromised. CPU Tech anticipates a similar fate for the secure processors it releases into the commercial world. Buying or even stealing a server is much easier than bringing down a spy plane. “We understand the chip itself will find its way to an untrusted source through distribution channels,” Hays says, so the chip was designed to be useless if it were physically reverse-engineered. Nothing can be gleaned from the hardware, which can be securely configured to work only in concert with CPU Tech’s proprietary security software. “We’ll be a lot more restrictive about handing out the software that makes it work,” Hays says.
But reverse-engineering a physical chip is not the only way to break in or create countermeasures. To eavesdrop on communications or impersonate a genuine user, a malefactor need only decipher the secret encryption key.
“Getting the key is the key, so to speak, to being able to generate encrypted data,” says David Blaauw, an electrical engineering professor at the University of Michigan, in Ann Arbor.
The key is generated using the National Institute of Standards and Technology’s Advanced Encryption Standard (AES) algorithm, certified by the National Security Agency. The AES algorithm itself is mathematically very secure. “But when the key comes out of its cave to do its job once in a while,” Hays says, “that’s when the key is most vulnerable.” Another way of getting the key is a so-called DRAM attack, in which a malefactor freezes the dynamic RAM to ferret out where the key has been temporarily stored. Cooling the DRAM after shutting off a computer can prolong a well-known property of DRAM called remanence—during which the key temporarily remains in memory—for several minutes, giving attackers an ample window to get into the DRAM and extract the key. “Basic disk encryption is very vulnerable,” Hays says. “Wherever you put the key, on a chip or in DRAM, the smart bad actor will find a way to get it.”
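The cold-boot behavior Hays describes can be illustrated with a toy model: this sketch has nothing to do with the Acalis design, and the decay rates are invented for illustration, but it shows why chilling DRAM gives an attacker a usable window. Bits in unpowered DRAM decay toward their ground state; cooling slows the decay, so a fast memory dump recovers most of a key.

```python
import random

# Toy model of DRAM remanence (the "cold-boot" attack). The decay
# probabilities below are illustrative assumptions, not measured values.

def decay(memory, seconds, decay_rate_per_sec):
    """Return a copy of memory with each set bit cleared with some probability."""
    out = bytearray(memory)
    p = 1 - (1 - decay_rate_per_sec) ** seconds  # chance a bit decays overall
    for i in range(len(out)):
        for bit in range(8):
            if out[i] & (1 << bit) and random.random() < p:
                out[i] &= ~(1 << bit)  # bit decays to ground state (0)
    return bytes(out)

random.seed(1)
key = bytes(random.randrange(256) for _ in range(16))  # 128-bit key in DRAM

warm = decay(key, seconds=60, decay_rate_per_sec=0.05)    # room temperature
cold = decay(key, seconds=60, decay_rate_per_sec=0.0005)  # chilled DRAM

def bits_lost(a, b):
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

print("bits lost warm:", bits_lost(key, warm))
print("bits lost cold:", bits_lost(key, cold))
```

After a minute, the warm key is mostly gone, while the chilled copy loses only a handful of bits, few enough that an attacker could brute-force the remainder.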
The hardware and software of the Acalis chip, Hays says, make both of these attacks impossible, for two reasons. For one, the key can be changed at will: The system allows the owner to change the key every day or during every session to guard against side-channel attacks. “If you change the key often enough, by the time you figure out the key, it’s already irrelevant,” says Hays. “There would be no chance to find it in time.”
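CPU Tech has not said how its key rotation works, but the general technique is standard: derive each session key from a long-term secret and a counter, so that a key recovered through side-channel analysis is stale as soon as the counter advances. A minimal sketch, assuming an HMAC-based derivation (the function and variable names here are hypothetical):

```python
import hashlib
import hmac

# Sketch of per-session key rotation. This is a generic construction,
# not CPU Tech's disclosed scheme: each session key is derived from a
# device secret plus a monotonically increasing counter.

def session_key(device_secret: bytes, counter: int) -> bytes:
    mac = hmac.new(device_secret, counter.to_bytes(8, "big"), hashlib.sha256)
    return mac.digest()[:16]  # truncate to a 128-bit AES key

device_secret = b"\x01" * 32  # placeholder long-term secret

k1 = session_key(device_secret, 1)
k2 = session_key(device_secret, 2)
assert k1 != k2  # every session gets a fresh key
```

Because the derivation is deterministic, both endpoints that share the secret and the counter compute the same session key without ever transmitting it.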
Further, instead of depending on secrecy, the system merely ensures that there is no key to be found. “Through a combination of hardware and software, we never actually store a key,” Hays explains. “You obviously have to have something stored or some kind of index or value, but our system completely eliminates key storage.”
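One published way to use a key without storing it, and this is speculation about the general approach, not CPU Tech’s undisclosed method, is to regenerate the key on demand from measurements unique to the individual chip, keeping only a public index in nonvolatile memory. A hedged sketch, with the fingerprint read simulated:

```python
import hashlib

# Hypothetical sketch of key-derivation-on-demand. Only a public index
# is ever stored; the key exists transiently while in use and is
# recomputed from chip-unique physical measurements each time.

def read_chip_fingerprint() -> bytes:
    # Stand-in for on-chip sensors reading unique silicon variation;
    # on real hardware this value cannot be copied off the chip.
    return b"simulated-silicon-fingerprint"

def derive_key(index: bytes) -> bytes:
    return hashlib.sha256(read_chip_fingerprint() + index).digest()[:16]

stored_index = b"slot-0"        # the only value kept in nonvolatile memory
key = derive_key(stored_index)  # regenerated at use time, never written out
```

An attacker who dumps the chip’s memory finds only the index, which is useless without the physical fingerprint of that particular die.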
How the chip uses a key without storing it is something CPU Tech executives are not willing to divulge. “Many of the details of our reverse-engineering prevention features are proprietary,” says King, “but they include multiple methods of unique chip identification, on-chip memory, and on-chip sensors.” The Acalis has two CPUs on one chip that operate independently. One processor can run nontrusted code while raising a hardware firewall to isolate it from the second CPU. In this mode, the only communication between the two is through the equivalent of a mailbox built into the chip.
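The mailbox model can be sketched in software terms, as an assumed analogy rather than CPU Tech’s implementation: the untrusted partition has exactly one channel, a bounded message queue, and the trusted partition validates every message before acting on it.

```python
import queue
import threading

# Conceptual analogy for the two-CPU isolation model: the untrusted
# side can only post fixed messages into a one-slot mailbox; it has no
# other path to trusted state. Names and messages are invented.

MAILBOX = queue.Queue(maxsize=1)  # the sole channel between partitions

def untrusted_partition():
    MAILBOX.put(b"request:status")  # may only post messages, never read state

def trusted_partition(results):
    msg = MAILBOX.get(timeout=1)
    if msg == b"request:status":    # trusted side validates every request
        results.append(b"ok")

results = []
t = threading.Thread(target=untrusted_partition)
t.start()
trusted_partition(results)
t.join()
print(results)
```

The point of the narrow interface is that even if the untrusted CPU is fully compromised, the attacker can only send mailbox messages, each of which the trusted side is free to reject.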
Though the company has withheld the details of its process, some security experts are skeptical. “They are addressing the right questions, but the things they are saying are not completely clear to me,” says Blaauw.
Chris Tarnovsky thinks the idea is smoke and mirrors. The self-taught American engineer was catapulted into notoriety during a major piracy case last year, in which he was accused of reverse-engineering pay-TV security cards and posting the results online, thereby opening the door for hundreds of thousands of nonpaying individuals to get free pay TV. He was (mostly) exonerated and now runs Flylogic Engineering, which performs hardware and software security analysis. He says he has reverse-engineered enough government chips to be unimpressed by Acalis’s military pedigree. “I doubt this chip is really that secure and is most likely obscurity mixed with security,” he says.
Blaauw also has his doubts. “The idea of changing keys is good,” he says. But the key needs to be changed quickly, before it is cracked, and that becomes expensive. “It’s much more expensive to change a key in terms of resources than to use a key to decrypt or encrypt data, so you may not want to do it every session.” Beyond that, it may be possible for a hacker with a good machine to decipher the key even with daily key changes. “We made an encryption engine that was hardware based,” Blaauw says. “We cracked it in minutes.”
As for the key storage trick, neither Blaauw nor Tarnovsky is convinced. “I don’t see how,” says Blaauw. “At some point in the encryption, you have to apply the bits of the key in the algorithm. If you didn’t store it, how do you retrieve it? You could distribute the bits or scramble them, but they need to be stored somewhere in some form—I don’t see an easy way around that.”
But for now, CPU Tech isn’t revealing more. “We’re still trying to figure out the line between advertising what the chip can do and giving away the company store,” Hays says.
If everything they have described is accurate, Blaauw says, it would be hard to crack the CPU Tech chip. “It doesn’t sound like the other approaches people have proposed,” he concedes. “They may have something that we don’t know about.”