Terror: What’s Next

Five years after 9/11, technology's role against terrorism is still murky

Illustration: Brian Stauffer

Special Report: Technology and Terrorism

The defining conflict of the late 20th century, the Cold War, was all about technology. It revolved around nuclear weapons—specifically, the technology needed to make, store, test, and deploy them. And, of course, to spy on what the other side had.

The defining conflict of the early 21st century, against extremist terror, may or may not have much to do with technology. All of the major recent attacks—New York and Washington on 9/11/01; Bali in 2002; Beslan, Russia, and Madrid in 2004; London in 2005; and Mumbai in 2006—required no significant technological sophistication to pull off. At the same time, however, some terrorist groups have proven extraordinarily adept at using the Internet, networks, and digital video to recruit, plot, and communicate.

Meanwhile, in the developed countries that the extremists are doing their best to terrorize, officials have launched research programs that will help determine, in the coming years, how much of a role advanced technology can play in this struggle. In this three-part report, we consider that issue from several angles.

First, Senior Editor Harry Goldstein reports on one of the most intriguing of the tech-based antiterror initiatives, in which programmers are writing computer simulations that attempt to model the minds, behavior, and networks of militiamen and terrorists. The half-dozen projects have different goals. One functions like an elaborate video game to give soldiers an idea of how combatants from another culture will react to, say, an attempt to capture one of their leaders. Another project, based at Carnegie Mellon University, in Pittsburgh, is modeling specific insurgent networks in Iraq to help intelligence officials determine at any instant whom they should kill or capture to do maximum harm to the network. As Goldstein discovered, circumstantial evidence, at least, suggests that the software helped guide the rapid series of raids on Abu Musab al-Zarqawi’s network in Iraq immediately after he was killed in a bombing this past June. [See “Modeling Terrorists.”]

In the second part of the report, Senior Editor Jean Kumagai has assembled a series of nine terrorist-attack scenarios. These sketches are all based on ideas that came out of dozens of interviews with experts, as well as from roundtable discussions that Kumagai and Goldstein convened this past May at Stanford University and at Global Business Network, a San Francisco consultancy that coaches corporate and government clients in matters related to strategy and future scenarios. (You can hear excerpts from these roundtables by going to /radio.) The discussions included some of the top U.S. risk-analysis and counterterrorism specialists, including Michael May, director emeritus of the Lawrence Livermore National Laboratory and professor emeritus at Stanford, and Gary Ackerman of the Center for Terrorism and Intelligence Studies, in San Jose, who directs a research program there on terrorism and weapons of mass destruction. [See “Nine Cautionary Tales.”]

Could the hypothetical terrorist scenarios that we develop in this section do harm by giving terrorists ideas they may not have thought of? We considered that possibility but ultimately decided to let the concern shape the section rather than scuttle it. We chose our scenarios based on several criteria—for example, all have been suggested, in one form or another, somewhere in the open literature. We were also swayed, in the end, by the belief that exposing a danger is, in the long run, better than ignoring it while hoping that terrorists won’t notice it. Consider the unlocked and flimsy cockpit doors on commercial airliners before 9/11: it is painfully clear now that they deserved far more attention than they got.

The report closes with an essay by Charles Perrow, a professor emeritus at Yale University, who argues that spending large sums on counterterrorism is illogical. He considers the frequency and damage of terrorist attacks, concluding that they are by any measure far less costly to developed countries than are natural disasters and industrial mishaps.

You may have heard of Perrow or his theories; his 1984 book, Normal Accidents, is a landmark on the subject of organizational complexity and the likelihood of technology-related disasters. Whether you agree with him or not, his essay is a bracing reality check. As he argues persuasively, terrorism is by no means paramount in the array of challenges and problems that beset developed societies. [See “Shrink the Targets.”]

So what does it all mean for the role of technology in this global clash? Certainly, of the scenarios described in the second part of our report, few have obvious technical countermeasures, as we note. Nevertheless, superior technology is one of the many advantages developed societies have over terrorists, insurgents, and militiamen, and it will undoubtedly be used to track them, spy on them, and fight them.

As Goldstein’s article suggests, the places where technology winds up being used successfully may be surprising. Just as few people before World War II could have predicted that machines would help break codes and let bombardiers consistently hit their targets, not many can say now where technology will be deployed to maximum effect in the struggle against extremism. All we can do is hope that, as with World War II’s technology, it will serve its intended purposes well and have unforeseen benefits long after the conflicts that inspired it have faded away.
