Schneier: Science fiction writers can help, or hurt, homeland security

Published 19 June 2009

If you begin with the assumption that 9/11 was a failure of imagination, then who better than science fiction writers to inject a little imagination into counterterrorism planning? Bruce Schneier says science fiction writers may contribute to fresh thinking about security, but warns that an over-reliance on risk analysis and scenario brainstorming can be detrimental.

We wrote a few weeks ago about a group of science fiction writers who organized to help DHS with ideas about future threats to the United States — and the best responses to such threats. It all started a couple of years ago, when DHS hired a few science fiction writers to come in for a day and think of ways terrorists could attack America. If the U.S. inability to prevent 9/11 marked a failure of imagination, as some said at the time, then who better than science fiction writers to inject a little imagination into counterterrorism planning?

Security maven Bruce Schneier discounted the exercise at the time, calling it “embarrassing.” “I never thought that 9/11 was a failure of imagination. I thought, and still think, that 9/11 was primarily a confluence of three things: the dual failure of centralized coordination and local control within the FBI, and some lucky breaks on the part of the attackers. More imagination leads to more movie-plot threats — which contributes to overall fear and overestimation of the risks,” he writes. “And that doesn’t help keep us safe at all.”

Schneier writes that he recently read a paper by Magne Jørgensen, titled “More Risk Analysis Can Lead to Increased Over-Optimism and Over-Confidence,” which provides some insight into why this is so. The paper is not about terrorism at all. It’s about software projects.

Most software development project plans are overly optimistic, and most planners are overconfident about their overoptimistic plans. Jørgensen studied how risk analysis affected this. He conducted four separate experiments on software engineers, and concluded (though there are lots of caveats in the paper, and more research needs to be done) that performing more risk analysis can make engineers more overoptimistic instead of more realistic.

Potential explanations all come from behavioral economics: cognitive biases that affect how we think and make decisions (Schneier has written about some of these biases and how they affect security decisions, and there is a great book on the topic as well).

First, there is a control bias. We tend to underestimate risks in situations where we are in control, and to overestimate risks in situations where we are not in control. Driving versus flying is a common example. This bias becomes stronger with familiarity, involvement, and a desire to experience control, all of which increase with increased risk analysis. So the more risk analysis, the greater the control bias, and the greater the underestimation of risk.