Preparing for the Worst

Charles Perrow, known for his study of industrial accidents, turns his attention to terrorism

The Next Catastrophe: Reducing our Vulnerabilities to Natural, Industrial, and Terrorist Disasters

By Charles Perrow; Princeton University Press, 2007; 388 pp.; US $29.95

“We are not safe. Nor can we ever be fully safe, for nature, organizations, and terrorists promise that we will have disasters evermore.” So concludes this important and chilling book by Charles Perrow, professor emeritus of sociology at Yale University.

Perrow is famous for his book Normal Accidents: Living With High-Risk Technologies, originally published in 1984. In it he argued that most major industrial disasters could be traced not to simple operator error but to the vulnerabilities of what he called complex, tightly coupled systems, where each part depended on many others. He showed how small and apparently disconnected failures could cause such a system to fail catastrophically and unpredictably. So unpredictable are these systems that an effort to prevent one mode of failure may inadvertently create another.

In The Next Catastrophe, Perrow argues that the United States abounds in complex systems teetering on the edge of disaster, and he wonders what might happen if terrorists were to put their thumbs on the scale. Chlorine and other toxic chemicals are stored near big cities and transported through them on poorly guarded trains; if vandals can spray graffiti on a railroad tank car filled with chlorine, what might a terrorist do? Raw milk, too, is stored and trucked about in large, poorly guarded tanks to which a terrorist could add a few grams of botulinum toxin, sickening or killing thousands of people. More than half the output of a major coalfield in Wyoming crosses over a single railway bridge, whose loss would be economically catastrophic.

He argues that the problem is being aggravated by the concentration of economic and political power, which tends to create targets and increase the magnitude of the disasters should something go wrong. Perrow cites as an example the world’s standardization on Microsoft Windows, which he compares to the dependence on a single crop, a blight on which could threaten the livelihoods of millions of people.

He also points to chemical plants, which because of economies of scale have grown to gargantuan size, storing ever larger quantities of hazardous chemicals on-site, and to the deregulation and restructuring of the electric utility industry, which has forced managers to worry more about short-term earnings than long-term maintenance. At the same time, Perrow says, government is unable or unwilling to force industry to reduce the potential for catastrophic accidents.

Perrow thinks that major terrorist attacks are rare and difficult to forestall, whereas natural disasters are common and much easier to plan for. We know for sure that hurricanes will frequently hit the Gulf Coast, that trains will sometimes derail, that nuclear power plants are vulnerable to catastrophic failure, that floods and earthquakes will sometimes occur. (In the book, he calls his gripping account of a near-catastrophe in a nuclear plant “We Almost Lost Toledo.”) For these reasons, he concludes, we should worry more about such disasters and less about terrorism.

And, Perrow says, the focus on fighting terrorism has increased the threats from other potential disasters. Grants that once went to train and equip first responders to disasters are now funding antiterrorism efforts of dubious efficacy, leaving a government that is at all levels less competent to plan for and respond to disasters.

One sorry example is the U.S. Federal Emergency Management Agency (FEMA), a competent agency for disaster management under the Clinton administration. However, when FEMA was subsumed under the Department of Homeland Security (DHS) in a massive reorganization after the 2001 terrorist attacks, its focus shifted to terrorism, and it was caught unawares by Hurricane Katrina.

The federal government postponed the air evacuation of Katrina victims from New Orleans, Perrow reports, until enough air marshals could be rounded up to prevent the evacuees from hijacking the planes, and then confiscated their cans of emergency rations as a security measure. Perrow views the DHS as “designed to fail,” a dumping ground for political appointees, hostage to congressional pork barrel politics, staffed by dispirited employees.

While more competent administrations might improve the effectiveness of disaster-relief organizations, Perrow argues, we should not count on it. He calls instead for sweeping changes in our infrastructure to reduce the impact of future disasters. Companies should redesign industrial systems to have increased redundancy and diversification. They should reduce the size of storehouses for hazardous chemicals and shift to less toxic chemicals. He advocates closing nuclear power plants near major urban areas and moving people out of areas like New Orleans that are at high risk of flooding.

He also calls for more assessment of our vulnerabilities. “The comical effort of the DHS to do this is scandalous—allowing states to declare petting zoos and flea markets as terrorist targets,” he writes. Engineers are well placed to play a constructive role in uncovering weak points in our industrial fabric. But, he continues, “technical people are unintentionally complicit in this by providing overly optimistic analyses, blaming the user, and avoiding taking responsibility for dealing with the real limitations of all systems today.”

One would think that the sight of railroad cars filled with chlorine passing near the U.S. Capitol building, in Washington, would motivate Congress to take effective action. Apparently not (see “Nine Cautionary Tales,” IEEE Spectrum, September 2006).

Perrow does not address the problem of building the political resolve to put things right. Perhaps there is no way to do it. A well-known psychological effect, called the availability heuristic, leads people to predict the frequency of an event by the ease with which an example can be brought to mind. This effect may explain why huge resources are devoted to preventing terrorism on airlines, whereas many other potential calamities are ignored. How many will have to die before government takes action?

About the Author

An IEEE Fellow and a professor of bioengineering at the University of Pennsylvania, Kenneth R. Foster has been a frequent reviewer of books and software for IEEE Spectrum. In this issue, he considers the latest work of Charles Perrow [p. 28], best known for his study of industrial accidents. Most recently for Spectrum, Foster cowrote “RFID Inside: The Murky Ethics of Implanted Chips” [March 2007].
