The Nuclear Weapons Complexities

We can simulate many system components, except the most unpredictable of all


In a sign that nuclear weapons remain vital technologically and militarily, the U.S. government has launched a multidecade “modernization” program for the nation’s 4,571 nuclear weapons, at an estimated cost of up to US $1 trillion, including $348 billion over the next 10 years alone.

“An aging nuclear force…has forced the need for a modernization program,” the Defense Science Board declared last December. The program, launched under President Barack Obama, has gained rare bipartisan support.

The aim is ambitious: to refurbish or replace every aspect of America’s land-, sea-, and air-based nuclear “triad,” and to deter any moves by Russia, China, and rogue states like North Korea to threaten the American homeland.

The project is expensive. For instance, the U.S. Navy’s 14 nuclear submarines—each armed with up to 240 nukes—are slated for replacement. The planned Columbia-class sub is expected to consume about one-sixth of the Navy’s entire shipbuilding budget for the next 30 years (about $100 billion in total).

Simulation of nuclear wars and warheads also means more money for the Lawrence Livermore and Los Alamos national laboratories, where sophisticated computer models are built. Simulations are vital to predicting real-world performance, yet they are no panacea.

Consider the guided missiles of the Minuteman series, which sit in silos in North Dakota, Montana, and Wyoming. This year, the U.S. Air Force performed just three “live” tests, and only of the missiles themselves, not their nuclear warheads.

If the United States fields new and improved nuclear weapons, will the government be more likely to use them? Also of concern: because “these modernized forces are more capable than the originals,” the Arms Control Association asserts, the renovation could ignite a dangerous new arms race.

Another risk of modernization is that a fully digital nuclear network might be more vulnerable to hackers. Communications systems for nuclear warriors were created in the 1950s and ’60s: In some missile silos, soldiers still pass around floppy disks. But while communicating with individual silos and submarines is cumbersome today, speedier, potentially hackable links between political decision-makers and weapons in the field could make unintended launches more likely.

In a world of “improving” nuclear weapons, there are wider lessons. Because these systems can never be fully tested in advance (thanks to a multilateral test-ban treaty), how can we know when our foresight is good enough? The human factor is also critical. Soldiers on subs or in missile silos, for instance, might choose to defy launch orders from political leaders they mistrust.

Or nuclear warriors, grown accustomed to drills and rehearsals, may simply disbelieve that a genuine crisis is under way. Consider how on 11 September 2001, the North American Aerospace Defense Command (NORAD), charged with protecting the United States from nuclear missile attacks from a command center inside Colorado’s Cheyenne Mountain, failed to respond even to the hijacked airplane that struck the Pentagon. Soldiers (captured on tape recordings since made public) asked over and over, “Is this real-world or exercise?” They repeatedly assumed they were in a training simulation. Only after all four hijacked aircraft had crashed did NORAD receive authorization to shoot down threatening aircraft in order to save lives on the ground.

Uncertainty clouds the nuclear-weapons complex, and human performance represents perhaps the biggest unknown. Improving hardware may do no good unless it is accompanied by a similar emphasis on enhancing human software. Ultimately, humans will decide whether the human species has the capacity to manage powerful tools that can either protect or grievously harm us.

About the Author

G. Pascal Zachary is a professor of practice at Arizona State University’s School for the Future of Innovation in Society. The views expressed here are the author’s own and do not represent positions of IEEE Spectrum or IEEE.