Why Hardware Engineers Have to Think Like Cybercriminals, and Why Engineers Are Easy to Fool

Photo: Tekla Perry
Scott Borg, director of the U.S. Cyber Consequences Unit

The future of cybersecurity is in the hands of hardware engineers. That’s what Scott Borg, director of the U.S. Cyber Consequences Unit, told 130 chief technical officers, engineering directors, and key researchers from MEMS and sensors companies and laboratories Thursday morning.

Borg, speaking at the MEMS and Sensors Technical Congress, held on the campus of Stanford University, warned that “the people in this room are now moving into the crosshairs of cyberhackers in a way that has never happened before.”

And Borg should know. He and his colleagues at the Cyber Consequences Unit (a nonprofit research institute) predicted the Stuxnet attack and other major developments in cybercrime over the last 15 years.

Increasingly, hackers are focusing on hardware rather than software, particularly industrial equipment, he indicated.

“Initially,” he said, “they focused on operations control, monitoring different locations from a central site. Then they moved to process control, including programmable logic controllers and local networks. Then they migrated to embedded devices and the ability to control individual pieces of equipment. Now they are migrating to the actual sensors, the MEMS devices.”

“You can imagine countless attacks manipulating physical things,” Borg said. And imagining those things definitely keeps him up at night—it’s not easy being a cybersecurity guru.

“Yesterday,” he said, while on a tour of a nanofab facility, “I saw tanks full of dangerous chemicals, controlled by computers moving things in and out. I immediately thought about which would be the prevailing direction of wind and how you could rupture the tanks with a cyberattack. Whenever I look at an appliance, I think about what could be done to it that causes maximum damage and embarrassment.”

The move to attacking hardware, just like any cyberattack, comes because hackers are thinking about the economics, Borg says. Hackers always profit in some way from their attacks, though the gain is not always monetary.

One way hardware hackers can profit from hurting a company is by taking advantage of the resulting drop in its stock price; stock manipulation is a growth area for cybercrime in general, says Borg.

“There is a limit to how much you can steal from credit card fraud; there is no limit to how much you can make by taking a position in a market and making something happen,” Borg says. “You can short a company’s stock in a highly leveraged way, then attack the company in a way that makes the stock fall, reinvest on the way down, and multiply your investment hundreds of times. This is a big growth area for cybercrime; it has been done multiple times already, but it is really just starting to get under way. This is going to be a huge area for cybercriminals.”

It is going to be up to engineers to stop this coming hardware cybercrime wave. And it’s not going to be easy because “engineers aren’t as easy to fool as scientists, but they are still really easy to fool.

“Engineers believe in data, in gauges, in measurements. They are a little less easy to fool than scientists in that they build physical systems that operate, and when those systems fail, they do have to figure out why and what the real-world effects are. But engineers aren’t used to dealing with unkind adversaries. They believe in statistics, where statistical distributions are normal and where probabilities deal with independent variables. And statistics doesn’t work in a cyberworld. If you are up against a cunning adversary, who will behave in ways outside of normal, it is hard to use any of the techniques we use in the natural world. A cyberadversary will take advantage of unlikely circumstances.”
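Borg’s point about statistical assumptions can be made concrete with a toy sketch (the scenario, function names, and numbers here are my own illustration, not anything from the talk): a sensor check that flags readings more than a few standard deviations from a trusted baseline catches random faults, but an adversary who knows the threshold can keep every manipulated reading just inside it.

```python
import statistics

def zscore_alarm(readings, baseline, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from a
    trusted baseline -- a classic statistical sanity check that assumes
    faults are random, not crafted."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return [r for r in readings if abs(r - mu) / sigma > threshold]

# Hypothetical trusted baseline: sensor normally reads ~100 units.
baseline = [99.0, 100.0, 101.0, 100.5, 99.5, 100.0, 100.2, 99.8]

# A random fault produces a wild value and is caught...
print(zscore_alarm([150.0], baseline))        # flagged

# ...but an attacker who knows the threshold drifts the reading
# upward in small steps that never trip the alarm.
print(zscore_alarm([100.5, 101.0, 101.3, 101.5], baseline))  # nothing flagged
```

The failure is not in the arithmetic but in the model: the check treats deviations as independent random noise, while the adversary deliberately operates inside the “normal” envelope, which is exactly the kind of unlikely-but-engineered behavior Borg describes.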

But, he said, if engineers, particularly design engineers, learn to understand the cybercriminal and think proactively about cyberattacks, they can often improve cybersecurity and do it for free.

“Increasing security isn’t always about layering on security [to a completed system], but about how you implement a certain function in the first place, and that choice often doesn’t cost more,” Borg says. “Decisions that are made in engineering at really fine-grained levels affect the costs of carrying out a cyberattack. Even a small sensor will have consequences for cybersecurity, not always in the immediate device, but as it develops into a product line.”

Engineers, therefore, need to look at their products from the standpoint of the attacker, and consider how an attacker would benefit from a cyberattack and how to make undertaking that attack more expensive. It’s all about working to increase an attacker’s costs, he says.

“As we move into embedded controllers and microdevices, we move into a realm that cybersecurity specialists like me haven’t explored that much yet,” he says. “The hackers haven’t explored it yet either,” but, Borg warns, they will.

“You people are now in the crosshairs; [design] decisions you are making will have powerful security implications. They will in some cases wipe out your competitive advantage, or give you a huge one. Nobody can tell you what to do beyond what I’ve told you—that it’s all about the economics,” he says. “All I can do is make you aware of the world we have moved into, to make you aware that you are now in the crosshairs.”

A condensed version of this article appears in the July 2017 print issue as “To Design Better Hardware, Think Like a Cybercriminal.”  


View From the Valley

IEEE Spectrum’s blog featuring the people, places, and passions of the world of technologists in Silicon Valley and its environs.
Contact us:  t.perry@ieee.org

Editor
Tekla Perry
Palo Alto, Calif.