An Optimistic Take on Cybersecurity: Trust the Natural Evolution of Tech

The increasing complexity of gadgets baffles even their designers, but we are not all doomed


Qualcomm's Alex Gantman addresses the IEEE Vision, Innovation, and Challenges Summit
Photo: IEEE Awards

The bad news: Engineers today are doomed to never fully understand their complex creations.

The good news: Cybersecurity efforts, thanks to the natural evolution of technical things, are not doomed.

That’s what Alex Gantman, Qualcomm’s vice president of engineering and head of product security, thinks. The assessment comes decades into a career spent coming up with movie-plot-quality scenarios of cybercrimes that he thought were about to happen but never came to pass.

“I picked up the nickname Dr. Evil,” he told an audience of IEEE leaders and honorees attending the IEEE Vision, Innovation, and Challenges Summit in San Francisco last week. Then, at some point, he turned into Mr. Optimism.

The reason for such a sunny view of the cyber world? He realized, he says, that just like organisms in the natural world, technology naturally evolves to become more secure against the threats it faces.

Engineers, he suggested, need to trust a little more in this process of evolution, and stop thinking of themselves as gods.

“We are all doomed to never fully understand our creations,” said Gantman. “As engineers, we develop this God complex, a belief that we can build a thing to do what we want and just what we want, and that because we build something we can understand how it really works. But we don’t really understand the whole. The complexity of our technological creations is approaching that of living organisms.”

Believing in the evolution of technology, however, doesn’t mean sitting back and doing nothing, Gantman said. “We have a role to play in shaping and accelerating this evolutionary process.”

"As engineers, we develop this God complex, a belief that we can build a thing to do what we want and just what we want…. But we really don't understand the whole."

Part of that role, Gantman indicated, is advancing security science: the ability to test and quantify security. Right now, he said, security science is lacking.

“The auto industry has data to assess safety, the health industry has data from the CDC,” he said. “But the security industry is based on cult traditions and untested best practices. And that has to change.”

A big part of improving the science of security, Gantman suggested, is investing in reverse engineering. And he sees the point of reverse engineering not just as figuring out how to duplicate a design, but as the key to greater understanding of what an engineer has created.

“Reverse engineering is the field of science dedicated to artificial creations,” he said. It allows us to “try to understand how complex systems work. In computer security, in particular, reverse engineering studies behaviors not known to the designers of the system.” That doesn’t mean, he clarified, that product designers don’t have to build security in from the beginning. They most certainly do. What it means is that there will always be vulnerabilities they couldn’t predict.

Consumer expectations are evolving too, Gantman pointed out, and those changing expectations will push the trend toward better security.

“Security becomes a significant factor in purchasing decisions only once a product is commoditized,” he said. “Before that, it’s all about new features.”

Consider the evolution of automotive safety features, Gantman said. “Convenience features (like automatic transmissions, air conditioning, and dashboard radios) predated almost all of the safety features, like seat belts. Activism had a role to play (in the adoption of safety features), but it is important to understand why these advocacy efforts succeeded when they did.”

Or, he said, consider the personal computer: 

When you bought a PC in the ’90s, you would bring it home, start to set it up, install floppy disk 13 of 72, and so on, and by the time you finished the setup it was an out-of-date piece of junk and you wanted an upgrade. In that environment, every crash and every virus infection brought you one excuse closer to doing what you really wanted to do, which was to throw it all out and get a new one.

But by 2000, with hardware stable and Office offering all the features people wanted, the aim was no longer to get a new machine. Instead, users wanted the ones they had already purchased to keep working, because upgrading was a chore and wasn’t worth it.

"Security is not free. When the rate of innovation is high, consumers don't want to pay or wait for greater security."

“It was no coincidence that Bill Gates’ Trustworthy Computing Memo came out in 2002, because that was when the PC market became commoditized.” (The memo, sent to every Microsoft employee, informed them that going forward, security, not features, was to be their priority.)

“Security,” Gantman said, “is not free. When the rate of innovation is high, consumers don’t want to pay or wait for greater security. When people are lining up on a product release date, delaying the release to improve security is suicidal for a vendor.”

But when innovation slows, security grows. And that is where we are right now with many of our devices, Gantman said. “Consumer devices are continuously evolving to become more secure, regardless of what platform you are on,” he told the Summit attendees. As a result, today, “the smartphone is the most secure digital device you have.”
