When Good Engineers Write Bad Software

In a new book, veteran programmer Adam Barr explains why it happens and how to fix it


Our lives are plagued by software failures. Usually it happens on a personal scale, but sometimes the impact can be international in scope, as when an airline booking system goes down. With software such a critical part of modern life, the question remains: Why is so much of it so bad? That’s the question veteran programmer Adam Barr set out to explore in The Problem With Software: Why Smart Engineers Write Bad Code (MIT Press, 2018). IEEE Spectrum senior editor Stephen Cass asked Barr for the answers.

Stephen Cass: So what is the problem with software?

Adam Barr: Fifty years ago there was a NATO Software Engineering Conference, where the term “software engineering” was first advanced. Everyone decided that software engineering is not really engineering and we had to fix that. They had a conference the next year to try to solve the problem. But the industry people thought the academics were off in the clouds, and the academics thought the industry people were ignoring the real problems and just focused on cranking out software. That split has continued since then. So you don’t have an industry which is upheld by research coming out of academia, as you would normally expect for something labeled engineering. The goal of the book is to raise awareness, get industry people interested in talking more with academia, and vice versa.

S.C.: How does that split manifest in terms of the actual code that gets written?

A.B.: There’s a difference between small pieces of software and large pieces of software. Small being what you do in school, working with one or two people on some project. Large software is what industry makes, which is worked on by multiple people, and most importantly not necessarily by the same people over its lifetime.... They’re really very different in what you have to do. So people get to industry, and all these things like maintainability, readability, securability, manageability—they haven’t learned any of that and have to reinvent it.... Companies like IBM had been studying this in the ’70s, and had made some progress on turning software into an engineering discipline. That essentially all got thrown away. The invasion of people [during the personal computer revolution], from Bill Gates on down, basically ignored everything that came before them.

S.C.: There seems to be a constant search for a silver bullet to fix software, whether it be getting rid of the GOTO statement or Agile programming.

A.B.: As I was writing the book I could see that DevOps was acquiring that sense of “Oh, this is the one thing that will cure all software ills.” They’re all useful. Getting rid of GOTOs was good. Agile has some good things. DevOps is good in some ways. But they get hyped as the be-all, end-all solution. I think that’s because they’re not backed by academia.... What’s missing is the ability to discriminate and say, “In these cases, Perl is a good language to use. In those cases, Perl is a terrible language.” Instead, people say, “Oh, wow. I taught myself Perl and I wrote this 20-line script. I will now go use Perl for every programming problem that I encounter.”

S.C.: How can things get better? Has the shift to software as a service helped?

A.B.: Running software as a service clears away some of the myths. Because you can actually observe whether your software’s maintainable and secure and all those things. And you feel much more of the pain if it’s not, so it will hopefully knock a little sense into people.... In the life of a software developer, my concern is that by the time you get out of college you’ve succeeded without having to really learn a body of knowledge the way a lot of other engineering disciplines do. So there’s a period of time before you kind of clue in, and that could be avoided. [The academic side could teach] knowledge that was more relevant in the industry, so you’d be productive sooner when you started.

This article appears in the January 2019 print issue as “Why Do Good Engineers Write Bad Software?”


From WinZips to Cat GIFs, Jacob Ziv’s Algorithms Have Powered Decades of Compression

The lossless-compression pioneer received the 2021 IEEE Medal of Honor


Lossless data compression seems a bit like a magic trick. Its cousin, lossy compression, is easier to comprehend. Lossy algorithms are used to get music into the popular MP3 format and turn a digital image into a standard JPEG file. They do this by selectively removing bits, taking what scientists know about the way we see and hear to determine which bits we'd least miss. But no one would argue that the resulting file is a perfect replica of the original.

Not so with lossless data compression. Bits do disappear, making the data file dramatically smaller and thus easier to store and transmit. The important difference is that the bits reappear on command. It's as if the bits are rabbits in a magician's act, disappearing and then reappearing from inside a hat at the wave of a wand.
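The round trip described above is easy to see in practice. As a minimal sketch, the snippet below uses Python's standard zlib module, whose DEFLATE format builds on the LZ77 algorithm that Ziv developed with Abraham Lempel: the compressed form is much smaller, yet decompression restores every bit of the original.

```python
import zlib

# Repetitive data compresses well, since LZ77-style compression
# replaces repeated substrings with back-references.
original = b"abracadabra " * 100

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(original), len(compressed))  # compressed form is far smaller
assert restored == original            # ...yet nothing was lost
```

The assertion at the end is the "rabbit reappearing from the hat": unlike MP3 or JPEG, the decompressed output is byte-for-byte identical to the input.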
