Guarding Without Guardians

Bruce Schneier is concerned that without trust, society itself may be impossible


Steven Cherry: Hi, this is Steven Cherry for IEEE Spectrum’s “Techwise Conversations.”

Socrates famously asked if a person could lead a just life in an unjust society. A new book, Liars & Outliers, by Bruce Schneier doesn’t in so many words raise the question, Can a person lead a secure life in an insecure society? but it does answer it. There’s only so much we can do without there being a framework of trust: There have to be moral codes; peer pressure is needed; institutions have to have their own codes of conduct; and so on.

It’s hard to imagine such a book being written by anyone but Bruce Schneier, one of the world’s foremost authorities on security. He started out in cryptography and published some world-class algorithms, but he quickly came to realize that the mathematics was rarely the weak link in the security chain. His books, starting with the best-selling Applied Cryptography, then Secrets & Lies and Beyond Fear, have so widened the scope of his interests that in the new book he finds himself borrowing the viewpoints of the psychologist, the sociologist, and the anthropologist as he searches for the foundations of security. He’s my guest today by phone from Minneapolis, and we’re very happy to have him. Bruce, welcome to the podcast.

Bruce Schneier: Thanks for having me.

Steven Cherry: Bruce, I started out by couching your book in terms of a question you don’t explicitly ask: Can a person lead a secure life in an insecure world? Was that a fair way of characterizing it?

Bruce Schneier: You know, I think so. Really in the book I’m setting out to understand why security exists and why it helps to keep society running. So it’s less, Can you live a secure life in an insecure world? and more, Can we as a society survive even though some of us are going to try to take advantage of the rest of us?

Steven Cherry: Yeah, you know, that kind of takes us back to the title, which I wanted to ask you about—Liars & Outliers.

Bruce Schneier: Well, I always look for titles that are evocative, so that’s less descriptive and more, I’m trying to set a tone that within our group of people—and whether it’s a family or a company or a neighborhood or a country—there are going to be some of us who are dishonest, who are social parasites, who are liars. And there are also going to be some of us who don’t believe in what the group is doing, and those are the outliers. So maybe that’s an abolitionist in the antebellum South in the United States; it’s someone who doesn’t believe in whatever the group morals, the group mores, are. So I’m trying to look at those sorts of people and how society deals with them. Because in the pre–Civil War American South, someone who freed slaves was breaking the law, was doing something illegal. I mean, we know now he was doing something moral, but back then many people thought he wasn’t.

Steven Cherry: So tie this back with security for us.

Bruce Schneier: So if we postulate that society has these group behaviors, right? We’re going to share food at our table, we’re all going to use paper money and we’re not going to steal it from each other, we presume that there are these social rules that we all have to follow for it to work. But there are going to be some of us some of the time who are going to break those rules, right? We’re going to be gluttons at dinner, we’re going to become burglars and steal things. That’s a universal truism that’s never going to go away, right? But if there are too many of these people, society collapses. If you put dinner down on the table and one person takes all the food and eats it every night, you’re going to stop putting the food down on the table because nobody else is eating. So we use security measures to keep society in line. To keep the number of people who break those rules—and in the book I call them “defectors”—from taking over, right? We can never keep the burglary rate down to zero, but we have to keep it low enough so the rest of us can feel secure in our possessions.
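
Schneier’s defector dynamic has the flavor of a public-goods game from game theory. The short Python sketch below is illustrative only, not something from the book: the payoffs, the 0.5 cooperation threshold, and the dropout rule are all invented for the example. It shows a shared pot staying stable when defectors are rare and unraveling once there are too many of them:

    # A toy public-goods simulation of the "defectors" idea. Each round,
    # every cooperator puts 1 unit into a shared pot; defectors contribute
    # nothing and grab extra. When a cooperator's share falls below half of
    # what they put in, some cooperators give up. Every parameter here is
    # invented for illustration; none of it comes from the book.

    def simulate(n_people=100, defector_fraction=0.10, greed=3.0, rounds=50):
        cooperators = int(n_people * (1 - defector_fraction))
        defectors = n_people - cooperators
        for r in range(rounds):
            if cooperators == 0:
                return r                      # collapse: nobody sets the table anymore
            pot = cooperators * 1.0           # each cooperator contributes 1 unit
            defector_take = min(pot, defectors * greed)
            fair_share = (pot - defector_take) / cooperators
            if fair_share < 0.5:              # cooperating no longer pays
                cooperators -= max(1, cooperators // 10)
        return None                           # cooperation held for the whole run

    for f in (0.05, 0.15, 0.30):
        r = simulate(defector_fraction=f)
        print(f"defector fraction {f:.0%}:",
              f"collapsed at round {r}" if r is not None else "survived")

With these made-up parameters, the 5 percent population survives while the 15 and 30 percent populations collapse, which is the point: societal pressures don’t have to eliminate defectors, only keep them below the tipping point.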

Steven Cherry: It sounds like there’s kind of—I don’t want to call it a hierarchy of security, but a sort of progression between people’s internal moral codes on the one hand and then their sort of external codes at the peer pressure level, the institutional level, and then finally the security that we mostly think about. Can you just sort of take us through that? And it occurs to me that there’s sort of a way in which one substitutes for the other when it has to.

Bruce Schneier: You know, I think it’s very complicated, and there is sort of a progression. We are a social species, and as a social species we’ve developed a bunch of ways to deal with this problem. And it’s not just us: Other primates, other animals, exhibit the same behaviors. So there’s something that we call morals, and we can argue about what that really is, but it’s sort of that internal feeling that says, “Stealing is wrong.” You know, a lot of us don’t steal, not because we can’t get away with it but because we kind of like the other person and we don’t want to take something that isn’t ours. And we believe that’s an immoral thing, by whatever way we get there, right? Most of us are honest and cooperative because we like doing that, and that’s internal.

For other things, we don’t do this bad behavior because of what other people will say. If I’m a glutton at dinner, what’s likely to happen before the group decides we’re no longer going to have dinner for everybody is, they’re going to kick me away from the table. There’s going to be some kind of social ramification, and I call this reputation, right? If a merchant cheats customers, he’s going to get a reputation for cheating customers, and customers won’t visit his shop, right? And that works, as far as it goes. There are places where it doesn’t, of course, and then we have institutional measures: laws, right? So it will be illegal to do certain things: illegal to steal, illegal to assault, illegal to cheat customers, and that will give us even more security. And on top of that there will be technical systems. In extreme cases, you can wear a bulletproof vest, you can have locks on your doors, all of these measures.

And yeah, there’s kind of a hierarchy, right? The first two, morals and reputation, you have in primitive society. To have laws you need a more organized society; for technological measures you need technology; but really they’re all working hand in hand. I mean, why does commerce work in our country? It’s a combination of those things: of morals, of laws, of reputation, of technology. And times when commerce fails, when we have serious problems, like the banking crisis in 2008, you can look at the failure in one or more of those dimensions and sort of see it happen.

Steven Cherry: Andrew Odlyzko, a University of Minnesota mathematician who’s written for Spectrum, said that the book “provides valuable new insights into security and economics.” I’m not actually sure what he meant by that, but it does seem to me that there is a kind of economic trade-off here as well: Personal trustworthiness is the cheapest way for society to gain a lot of security; when that fails, corporate or other institutional trustworthiness is next best; and if I can’t just trust word of mouth and the yellow pages and have to do a formal background check every time I need a plumber or a pizza delivery, security becomes much, much more expensive and complicated.

Bruce Schneier: And it does. But in a sense, we do a little more than the yellow pages sometimes—maybe we ask a friend, right? That’s a reputational measure: We get a personal recommendation. Or maybe we go to a site like Angie’s List, which is a much more formal reputational system, or maybe we go to the Better Business Bureau or look for a license. I mean, think of all of these things we might do before we choose a plumber. And yes, it’s more work. I mean, ideally you go to the yellow pages, you pick someone, and likely that’s going to work, but for more intimate things we might add some more security. So there’s definitely a lot of economics in this. I mean, it’s no coincidence that we’ve had a workshop on the economics of security running for the past decade or so, looking at how economics and security affect each other, and there’s a lot the security community has learned from economists and vice versa.

Steven Cherry: At one point in the book, you talk about the Lockheed Skunk Works group. I wonder if that’s an example of these kinds of trade-offs within a company. I mean, people didn’t have to sort of sign off for every paper clip they used, and that made them a lot quicker and more efficient as a group.

Bruce Schneier: And that’s because you’re trusting them more, and you can decide to do this even though you don’t fully trust them. Let’s take an easy example: A company has an expense-reporting procedure by which employees have to allocate and explain how they’re spending company money—let’s say for travel. Now, implementing that actually costs money: The employees have to fill out the forms, you need people to verify the data, and there’s an expense. You can as a company just decide to trust the employees: They tell you how much they spend, you cut them a check. Now, there’s going to be some fraud. Most people are going to be honest, some people will cheat a lot, some people will cheat a little, some people are going to be sloppy, but you could as a company save money, because the cost of the fraud might be less than the cost of verifying that there is no fraud. And you see the same kind of trade-off in government benefits. You know, we can ensure that nobody who doesn’t deserve a benefit gets it, but the cost of ensuring that might be greater than the cost of just giving them the benefit.
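
That trade-off reduces to an expected-cost comparison. Here is a back-of-the-envelope version in Python; every dollar figure is invented for illustration, and it idealizes verification as catching all fraud:

    # Back-of-the-envelope expense-report trade-off. All numbers are
    # hypothetical: 10,000 reports a year, $500 average, $25 of staff and
    # employee time to verify each one, and, if nobody checks, 5 percent of
    # reports padded by 30 percent on average.
    n_reports = 10_000
    avg_amount = 500.00
    verify_cost = 25.00  # per-report cost of checking (idealized: checking stops all fraud)
    fraud_rate = 0.05    # share of reports that would be padded if unchecked
    overclaim = 0.30     # average padding, as a fraction of the report amount

    cost_verify = n_reports * verify_cost
    cost_trust = n_reports * fraud_rate * avg_amount * overclaim

    print(f"verify every report:        ${cost_verify:,.0f}")
    print(f"trust and absorb the fraud: ${cost_trust:,.0f}")
    print("cheaper policy:", "trust" if cost_trust < cost_verify else "verify")

With these made-up numbers, absorbing the fraud ($75,000) is cheaper than auditing every report ($250,000); change the assumptions and the answer flips, which is exactly the calculation a company, or a benefits agency, is implicitly making.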

Steven Cherry: Stepping back a bunch, some people think that societies collapse when they get so mature that they’re burdened more than they’re benefited by their infrastructures and institutions, and I wonder if there’s any truth to that, because it has a lot to do with security. I mean, we’re burdening ourselves with metal detectors at airports and schools, with triple-packaged bottles of aspirin, and with standing armies.

Bruce Schneier: You know, it’s an interesting question, and I’m not a poli-sci person and I don’t know the answer to that. But certainly society does impose a lot of costs on itself the bigger it gets, and it’s not just society as a whole; you can see it in corporations: As they get larger, the internal costs of maintaining their structure become an ever-increasing percentage of their overall costs. And there certainly seems to be a limit in size after which an organizational structure will collapse. There are very human limitations; Robin Dunbar has studied groups and group dynamics and has identified several levels of natural societies, and these seem to be technologically invariant. They seem to depend on how our brains organize information about other people. A hundred and fifty is the Dunbar number, and you see it again and again in societies, and there are other numbers. So certainly security has an effect here; exactly what, I don’t know.

Steven Cherry: Bruce, you and I have a small connection that goes back quite a ways, and this is something of a personal disclosure to our listeners here. I was a cofounder of an online activist group in the 1990s called Voters Telecommunication Watch, and you were on its board. It was part of a cluster of organizations—the Electronic Frontier Foundation was another one—that were formed at the time because it wasn’t clear that speech and privacy rights were going to carry over to the Internet. I’m just curious: How do you think we’ve done? And I have privacy in mind more than free speech.

Bruce Schneier: You know, I think we’ve done mixed. We had a belief back then that geography didn’t matter, that somehow the Internet transcended geography. It turns out that’s wrong; geography matters more than ever. That being said, the Internet has been an amazing platform for free speech and open communication and cross-cultural fertilization, and it’s done some amazing things. At the same time, the Internet has more censorship than ever, it has more control than ever by media companies, and with some of the new laws we’re seeing and some of the new platforms we’re seeing, we have the potential of losing the freedoms we worked hard for and have gained. So I think the jury’s still out. I think there’s some really good, there’s some bad, and a lot of fog in the future.

Steven Cherry: Yeah, relating this back to the issue of trust and security: You know, Facebook and Google both want your real names now, we have electronic tolls and credit cards instead of cash; there are a lot of ways in which we’re using technology to replace trust with more external security measures.

Bruce Schneier: We are, and there are a couple of things going on. I mean, technology is allowing trust to scale, so an easy example is, maybe a few generations ago you would walk into a bank and you’d apply for a loan, and you’d get it because the bank officer somehow knew you; you were both in the same community, right? That system of personal trust is replaced by a computer database and a credit score. Now, that allows banking to scale. When you get a home mortgage, you probably get it from a bank halfway across the country, from an officer you’ve never met, and this all works because of this database, which collects information that can be used to make a trust decision, right? And that works pretty well, although sometimes it fails pretty spectacularly, and everybody knows stories of credit scores gone wrong.

So yes, institutions like Facebook, like Google, like credit ratings, all allow trust to scale nationally and globally, and they work okay, and they fail badly. But it’s going to be interesting to see in the coming years not just Facebook and Google but Amazon and Apple trying to own that trust relationship. You’re going to watch companies come out with electronic wallets where they’re going to sort of try to become banks. They’re going to try to own you and your friends and that communication and that interest, because there’s a lot of value in that, and unfortunately now we have to trust those companies, that they will use that information for good and not for evil. And the jury’s still out on that question as well.

Steven Cherry: Very good. Well, Bruce, thanks for joining us. We’ll be waiting for your next book, which I guess will be about ethics and epistemology at the rate you’re going.

Bruce Schneier: You know, it’s very funny. My career has been an endless series of generalizations, and writing this book I felt like I was rummaging through a university, kicking down doors and asking questions, right? I would visit the psychologists, the anthropologists, the archaeologists, the theologians, and the sociologists, and I’m not sure what I’m going to do next. I’m always looking for the meta-meta-meta picture, and right now this is as meta as I can conceive.

Steven Cherry: We’ve been speaking with security guru Bruce Schneier, author of a new book, Liars & Outliers, about the systems of trust that make society possible. For IEEE Spectrum’s “Techwise Conversations,” I’m Steven Cherry.

Announcer: “Techwise Conversations” is sponsored by National Instruments.

This interview was recorded 6 February 2012.
Audio engineer: Francesco Ferorelli


NOTE: Transcripts are created for the convenience of our readers and listeners and may not perfectly match their associated interviews and narratives. The authoritative record of IEEE Spectrum’s audio programming is the audio version.
