“Black swans” and “perfect storms” are often lame excuses for bad risk management that can affect the whole system, Paté-Cornell said.

In the case of a nuclear plant, the seismic activity or the potential for tsunamis in the area must be part of the equation, particularly if local earthquakes have historically led to tidal waves and destructive flooding. Paté-Cornell explained that the designers of the Fukushima Daiichi Nuclear Power Plant ignored important historical precedents, including two earthquakes in 869 and 1611 that generated waves similar to those witnessed in March of 2011.

What some described as a “perfect storm” of compounding mishaps, Paté-Cornell sees as a failure to assess basic failure probabilities based on experience and elementary logic.

A versatile framework
Engineering risk analyses can get complex, but their components are concrete objects whose mechanisms are usually well understood. Paté-Cornell says that this systematic approach is relevant to human aspects of risk analysis.

“Some argue that in engineering you have hard data about hard systems and hard architectures, but as soon as you involve human beings, you cannot apply the same methods due to the uncertainties of human error. I do not believe this is true,” she said.

In fact, Paté-Cornell and her colleagues have long incorporated “soft” elements into their systems analysis to calculate the probability of human error. They look at all the people with access to the system, and factor in any available information about past behaviors, training and skills.

Paté-Cornell has found that human errors, far from being unpredictable, are often rooted in the way an organization is managed.

“We look at how the management has trained, informed, and given incentives to people to do what they do, and assign risk based on those assessments,” she said.
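The idea of folding a “soft” human-error term into a hardware reliability model can be sketched in a few lines. This is an illustrative toy only, not Paté-Cornell's actual model: the function names, the training multiplier, and every probability below are invented for illustration, and independence between hardware and human failures is assumed purely for simplicity.

```python
# Illustrative sketch: folding a human-error term into a hardware failure model.
# All numbers and factor names are hypothetical; independence is assumed for simplicity.

def operator_error_prob(base_error: float, training_factor: float) -> float:
    """Scale a baseline human-error probability by a management/training factor
    (a factor below 1.0 stands in for better training and incentives)."""
    return min(1.0, base_error * training_factor)

def system_failure_prob(p_hardware: float, p_human: float) -> float:
    """The system fails if either the hardware fails or the operator errs,
    assuming the two events are independent."""
    return 1.0 - (1.0 - p_hardware) * (1.0 - p_human)

p_hw = 1e-4  # hypothetical hardware failure probability
p_err_well_trained = operator_error_prob(1e-3, 0.2)
p_err_poorly_trained = operator_error_prob(1e-3, 5.0)

print(system_failure_prob(p_hw, p_err_well_trained))    # lower overall risk
print(system_failure_prob(p_hw, p_err_poorly_trained))  # higher overall risk
```

The point of the sketch is that the management-dependent term enters the arithmetic exactly like a component failure rate, which is why the same systems methods apply.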

Paté-Cornell has successfully applied this approach to the field of finance, estimating the probability that an insurance company would fail given its age and its size. She said the companies contacted her and funded the research because they needed forward-looking models that their financial analysts generally did not provide.

Traditional financial analysis, she said, is based on evaluating existing statistical data about past events. In her view, analysts can better anticipate market failures, like the financial crisis that began in 2008, by recognizing precursors and warning signs, and factoring them into a systemic probabilistic analysis.
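Treating precursors as evidence rather than noise is, at its core, a Bayesian update. The sketch below shows the mechanism in its simplest form; it is not drawn from Paté-Cornell's work, and the prior and likelihoods are invented for illustration.

```python
# Illustrative sketch: Bayes' rule applied to a warning sign (precursor),
# raising the assessed probability of systemic failure before any failure
# statistics exist. All probabilities here are hypothetical.

def posterior_failure_prob(prior: float,
                           p_precursor_given_failure_path: float,
                           p_precursor_given_sound: float) -> float:
    """P(system is on a failure path | precursor observed), by Bayes' rule."""
    p_precursor = (p_precursor_given_failure_path * prior +
                   p_precursor_given_sound * (1.0 - prior))
    return p_precursor_given_failure_path * prior / p_precursor

prior = 0.01  # hypothetical prior that the system is on a failure path
updated = posterior_failure_prob(prior, 0.6, 0.05)
print(round(updated, 3))  # the observed precursor sharply raises the assessed risk
```

This is the forward-looking step that purely retrospective statistics cannot provide: the estimate moves as soon as the warning sign appears, not after the failure is in the data.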

Medical specialists must also make decisions in the face of limited statistical data, and Paté-Cornell says the same approach is useful for calculating patient risk. She used systems analysis to assess data about anesthesia accidents, a case in which human mistakes can create an accident chain that, if not recognized quickly, puts the patient’s life in danger. Based on her results, she suggested retraining and recertification procedures for anesthesiologists to make their system safer.
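An accident chain of this kind can be quantified as a product of conditional link probabilities: the patient is endangered only if every link occurs. The stage names and numbers below are hypothetical, chosen only to show the structure of such a calculation, not taken from the anesthesia study.

```python
# Illustrative sketch: a minimal accident-chain calculation in the spirit of
# the anesthesia example. Stage names and probabilities are invented.

# The patient is endangered only if every link in the chain occurs:
# an initiating error, failure to detect it in time, and failed recovery.
chain = {
    "initiating_error": 1e-3,
    "not_detected_in_time": 0.1,
    "recovery_fails": 0.2,
}

p_patient_harm = 1.0
for stage, p in chain.items():
    p_patient_harm *= p

print(p_patient_harm)  # tiny per-case probability: the product of the links
```

The structure also shows why retraining pays off: improving detection or recovery shrinks one factor in the product, and the overall risk falls proportionally.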

Professor Paté-Cornell believes that the financial and medical sectors are just two of many fields that might benefit from systems analysis in uncertain, dynamic situations. “Lots of people don’t like probability because they don’t understand it,” she said, “and they think if they don’t have hard statistics, they cannot do a risk analysis. In fact, we generally do a system-based risk analysis because we do not have reliable statistics about the performance of the whole system.”

She hopes that her probabilistic approach can replace the notions of black swans and perfect storms, making the public safer and better informed about risks. Apparently, others have this same hope.

“It must have struck a chord,” she said, “because I already get lots of comments, responses and ideas on the subject from people around the world.”