Reviewed by Paul Wallich
By Bruce Schneier; Wiley, 2012; 384 pp.; US $25; ISBN: 978-1-118-14330-8
Bruce Schneier is a security icon, the cryptological equivalent of action-movie superstar Chuck Norris, able to straighten elliptic curves with his bare hands. Liars & Outliers isn’t the book you’d expect from someone whose portrait adorns posters—nor from the coauthor of several important encryption algorithms (one of them a finalist for the next generation of national encryption standards).
On his blog, Schneier reminds us almost daily that protecting our secrets with a 4096-bit key doesn’t do much good if we have to tape the new pass phrase to our monitors, and that an unforgeable ID card can be a very bad idea if someone can get one by slipping 20 bucks to a file clerk. In Liars & Outliers, however, he takes an almost Aristotelian step back from those frontline concerns to discuss the first causes of security: the kinds of trust that security measures help to enable; why we secure things in the first place, even when—indeed, especially when—we know that security will never be perfect; and why we probably shouldn’t even want security to be perfect.
Schneier points out how remarkable it is that we do generally trust one another. We do it each and every day, and almost invariably that trust is repaid. Cable-TV technicians don’t pillage the houses they’re given free access to, certified public accountants don’t generally loot their clients’ bank accounts, and waiters don’t poison patrons, no matter how rude they are. Though the odds of getting caught are small, such people and their institutions almost uniformly behave honestly. Even if many of us have larceny in our hearts, we don’t exercise it with our hands.
By and large, locks, alarms, and the other measures we typically think of as “security” only come into play, Schneier argues, when social pressures such as altruism, ethical codes, and concern for reputation or self-image fall short. As the scale of society grows from local to metropolitan to global, however, these social bonds diminish and the need for security grows.
This sounds like an argument for maximum lockdown—at least in cities—limited only by the potential cost of overzealous enforcement. Yet Schneier shows that such an extreme would not only be maximally unpleasant but would also fail to deliver the greatest security. Looked at from the perspective of game theory, reducing the number of social predators just makes the pickings richer for the "hawks" who remain. And from an anthropological viewpoint, it is the attackers’ very intelligence and resourcefulness that spurs defenders to progress. So we’re left with an unending arms race.
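The "richer pickings" point is the classic hawk–dove game from evolutionary game theory: a hawk's expected payoff rises as hawks become rarer, because it meets more doves and fewer costly fights. A minimal sketch of that arithmetic (the payoff values here are illustrative assumptions, not figures from the book):

```python
# Hawk-dove game: V = value of the contested resource, C = cost of a fight.
# A hawk meeting another hawk fights, for an expected payoff of (V - C) / 2;
# a hawk meeting a dove takes the resource uncontested, for a payoff of V.

def hawk_payoff(p_hawk, V=10.0, C=16.0):
    """Expected payoff to one hawk when a fraction p_hawk of the
    population plays hawk (illustrative V and C, with C > V)."""
    return p_hawk * (V - C) / 2 + (1 - p_hawk) * V

# Thinning out the predators makes life better for those who remain:
for p in (0.8, 0.4, 0.1):
    print(f"hawk fraction {p:.1f}: expected hawk payoff {hawk_payoff(p):+.1f}")
```

With these numbers, the hawk's expected payoff climbs steadily as the hawk fraction falls, which is why driving predators toward zero never quite finishes the job: the fewer there are, the more each one stands to gain.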
Then Schneier goes a step further: Because we are all simultaneously members of many different subsets of society with competing interests—family, friends, colleagues, party, nation—we need security failures for progress. If Lockheed Corp. executives in the 1940s had had the clout and the personnel to make sure that every engineer filled out all the forms and followed all the approval procedures that corporate rules normally require, the company’s fabled Skunk Works would have been impossible, and the P-80 fighter and a dozen other planes would never have been designed. If Muammar Gaddafi’s security forces had been able to perfectly triangulate the source of every mobile Twitter message, opposition members would have been arrested after their first tweet.
In other words, sometimes rule breakers are a threat to society, and sometimes they are its best hope. If that sounds like Thomas Jefferson’s "I would rather be exposed to the inconveniences attending too much liberty than to those attending too small a degree of it," we can still thank Schneier for updating this eternal truth for a digital age.
About the Author
PAUL WALLICH is a regular contributor to IEEE Spectrum who lives in Montpelier, Vt. In the past year, he has also written about the economics of lightbulbs, using a manual typewriter as a computer keyboard, and beehacking.