Is It Time For a Computer Industry Do-Over?

Peter Neumann, with DARPA backing, reinvents the personal computer with security in mind

“If the computer industry got a do-over, what should it do differently?”

That is the subject of a feature article in today’s New York Times profiling my long-time friend and mentor, Dr. Peter Neumann, Principal Scientist at SRI International’s Computer Science Lab. Peter may be 80 years old, but he still manages to combine his 60 years of computing experience with the stamina of a 25-year-old.

One—and I emphasize one—of Peter’s latest endeavors is working with DARPA’s Information Innovation Office on “CRASH,” or the Clean-slate design of Resilient, Adaptive, Secure Hosts. The DARPA web site describes CRASH as pursuing “innovative research into the design of new computer systems that are highly resistant to cyber-attack, can adapt after a successful attack to continue rendering useful services, learn from previous attacks how to guard against and cope with future attacks, and can repair themselves after attacks have succeeded.”

“Because the industry is now in a fundamental transition from desktop to mobile systems,” says DARPA program manager Howard Shrobe, “it is a good time to completely rethink computing,” especially in regard to improving computer security.

While the research may seem a bit quixotic, Peter points out in the article that “We have not fundamentally redesigned our networks for 45 years. Sure, it would cost an enormous amount to re-architect, but let’s start it and see if it works better and let the marketplace decide.”

Tackling big computer problems that few others want to take on has been a hallmark of Peter's career, and improving computer security has been a special interest of his; for many in the computing industry, it has also been a highly irritating Hyde Park soapbox. Peter has been warning, mostly in vain until recently, that the industry was not paying enough attention to the security threats posed by poor computer system and software design.

And, Peter has long said, the longer the industry ignores those threats, the larger they will grow. Nor can security be designed in after the fact, a warning the industry may finally be willing to heed now that IT systems are being broken into on a regular basis. Just this week it was disclosed that someone had broken into South Carolina’s Department of Revenue and accessed as many as 3.6 million residents’ Social Security numbers and 387,000 credit and debit card numbers.

Another of Peter’s long-time interests has been what I call the risk ecology of computing: the business, technological, social, political, and personal risks that computing has created alongside its tremendous benefits in each of those spheres. Since 1985, Peter has moderated the ACM RISKS Digest (formally, the Forum on Risks to the Public in Computers and Related Systems), which has been a home for open discussion among academics, practitioners, and kibitzers alike of the who, what, where, when, why, and how of computer-related risks that have turned into sometimes pernicious problems. One of the more frustrating things, he told the Times, is that many of these risks, such as the security risk posed by buffer overflows (pdf), have been known for decades but are still routinely ignored.
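
To make that buffer-overflow risk concrete, here is a minimal C sketch of my own (the function names are invented for illustration; this is not from the Times article or the RISKS archive) showing the decades-old bug pattern and the equally old bounded alternative:

    #include <stdio.h>
    #include <string.h>

    /* Classic overflow: strcpy() keeps copying until it reaches the
       input's terminating NUL, so any input longer than 15 characters
       writes past the end of 'name', clobbering adjacent stack memory
       (historically, saved return addresses). */
    void greet_unsafe(const char *input)
    {
        char name[16];
        strcpy(name, input);   /* no bounds check */
        printf("Hello, %s\n", name);
    }

    /* The equally old fix: pass the destination size and let the
       library truncate, guaranteeing NUL termination. */
    void greet_safe(const char *input)
    {
        char name[16];
        snprintf(name, sizeof name, "%s", input);
        printf("Hello, %s\n", name);
    }

    int main(void)
    {
        greet_safe("a deliberately over-long string of user input");
        return 0;
    }

An attacker who controls the input to greet_unsafe can overwrite whatever sits next to the buffer, which is precisely the class of flaw that has been exploited since before the 1988 Morris worm yet keeps reappearing.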

You can listen to Peter, Steve Bellovin, Matt Blaze, and me discuss some of the recurring risks that turned into a plague of problems for the FBI’s Virtual Case File system development in an IEEE Spectrum Radio roundtable from a few years back.

Finally, the Times ran a concurrent piece on the beginnings of computer hacking, back when hacking wasn’t a pejorative word, and on Peter’s role in it. I think you’ll find both Times articles interesting; they offer a bit more insight into one of the great thinkers of the computer field.

Why Functional Programming Should Be the Future of Software Development

It’s hard to learn, but your code will produce fewer nasty surprises

[Illustration: A plate of spaghetti made from code. Credit: Shira Inbar]

You’d expect the longest and most costly phase in the lifecycle of a software product to be the initial development of the system, when all those great features are first imagined and then created. In fact, the hardest part comes later, during the maintenance phase. That’s when programmers pay the price for the shortcuts they took during development.

So why did they take shortcuts? Maybe they didn’t realize that they were cutting any corners. Only when their code was deployed and exercised by a lot of users did its hidden flaws come to light. And maybe the developers were rushed. Time-to-market pressures almost guarantee that their software will contain more bugs than it otherwise would.
