“Black swans” and “perfect storms” are often lame excuses for bad risk management

Published 16 November 2012

The terms “black swan” and “perfect storm” have become part of public vocabulary for describing disasters ranging from the 2008 meltdown in the financial sector to the terrorist attacks of 9/11, but according to Elisabeth Paté-Cornell, a Stanford professor of management science and engineering, people in government and industry are using these terms too liberally in the aftermath of a disaster as an excuse for poor planning.

Her research, published in the November issue of the journal Risk Analysis, suggests that other fields could borrow risk analysis strategies from engineering to make better management decisions, even in the case of once-in-a-blue-moon events where statistics are scant, unreliable or non-existent.

A Stanford University release reports that Paté-Cornell argues that a true “black swan” — an event that is impossible to imagine because we’ve known nothing like it in the past — is extremely rare. The AIDS virus is one of very few examples. Usually, however, there are important clues and warning signs of emerging hazards (for example, a new flu virus) that can be monitored to guide quick risk management responses.

Similarly, she argues that the risk of a “perfect storm,” in which multiple forces join to create a disaster greater than the sum of its parts, can be assessed systematically before the event: even though such conjunctions are rare, the events that compose them, and the dependencies among them, have been observed in the past.
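To make that idea concrete, here is a minimal sketch, in Python, of how the probability of such a conjunction can be assembled from the conditional probabilities of its component events. The event names and numbers are invented for illustration and are not taken from Paté-Cornell’s paper.

```python
# Illustrative sketch: probability of a "perfect storm" as a conjunction of
# dependent events. All event names and probabilities are invented placeholders.

p_surge = 0.02                    # P(A): unusually severe storm surge in a given year
p_levee_fail_given_surge = 0.30   # P(B | A): levee failure, given that surge
p_pump_fail_given_both = 0.25     # P(C | A, B): pump-station failure, given both

# Chain rule for dependent events: P(A and B and C) = P(A) * P(B | A) * P(C | A, B)
p_perfect_storm = p_surge * p_levee_fail_given_surge * p_pump_fail_given_both
print(f"P(conjunction) = {p_perfect_storm:.5f}")   # 0.00150

# Multiplying unconditional failure rates, as if the events were independent,
# would typically understate the risk, because one storm drives all three events.
```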

“Risk analysis is not about predicting anything before it happens, it’s just giving the probability of various scenarios,” she said. Systematically exploring those scenarios, she argues, can help companies and regulators make smarter decisions in the face of uncertainty, before an event occurs.

Think like an engineer
An engineering risk analyst thinks in terms of systems, their functional components and their dependencies, Paté-Cornell said. For instance, in a power plant that requires cooling, components such as generators, turbines, water pumps, and safety valves all contribute to making the system work. The analyst must therefore first understand how the system works as a whole in order to identify how it could fail. The same method applies to medical, financial, or ecological systems.
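As a hedged illustration of that systems view, the sketch below models a simplified cooling function built from series and redundant components. The component names, failure logic, and probabilities are invented for the example, not drawn from the article.

```python
# Illustrative component-level model of a plant's cooling function
# (all component names and probabilities are invented for the example).

p_generator_fail = 0.01
p_pump_fail = 0.05       # per pump; two redundant pumps assumed
p_valve_fail = 0.002

# Redundant (parallel) pumps: pumping is lost only if both pumps fail.
# This line assumes the two pump failures are independent.
p_pumping_fail = p_pump_fail ** 2

# Series logic: the cooling function works only if every required part works.
p_system_works = (1 - p_generator_fail) * (1 - p_pumping_fail) * (1 - p_valve_fail)
print(f"P(cooling failure) = {1 - p_system_works:.4f}")
```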

Paté-Cornell stresses the importance of accounting for dependent events, whose probabilities are intertwined, so as to build a complete list of scenarios, dependencies included, for the risk analysis. It is therefore essential that an engineering risk analysis also include external factors.
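One way to read that requirement, sketched here with hypothetical events and numbers, is to enumerate the scenarios explicitly and let conditional probabilities carry the dependence between an external factor and the component failures it influences.

```python
# Illustrative scenario list with a shared external factor (all numbers invented).
p_quake = 0.001                    # P(earthquake in a given year)
p_pump_fail_given_quake = 0.40     # the quake raises the pump's failure probability
p_pump_fail_no_quake = 0.02

scenarios = {
    ("quake", "pump fails"):    p_quake * p_pump_fail_given_quake,
    ("quake", "pump works"):    p_quake * (1 - p_pump_fail_given_quake),
    ("no quake", "pump fails"): (1 - p_quake) * p_pump_fail_no_quake,
    ("no quake", "pump works"): (1 - p_quake) * (1 - p_pump_fail_no_quake),
}

# A complete scenario list is exhaustive and mutually exclusive, so it sums to 1.
assert abs(sum(scenarios.values()) - 1.0) < 1e-9

p_pump_fails = scenarios[("quake", "pump fails")] + scenarios[("no quake", "pump fails")]
print(f"P(pump failure, dependence included) = {p_pump_fails:.5f}")
```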