Saturday, November 24, 2012

'Black swans' and 'perfect storms' become lame excuses for bad risk management

Kelly Servick in Stanford University News: The terms "black swan" and "perfect storm" have become part of the public vocabulary for describing disasters ranging from the 2008 meltdown in the financial sector to the terrorist attacks of Sept. 11, 2001. But according to Elisabeth Paté-Cornell, a Stanford professor of management science and engineering, people in government and industry are using these terms too liberally in the aftermath of a disaster as an excuse for poor planning.

Her research, published in the November issue of the journal Risk Analysis, suggests that other fields could borrow risk analysis strategies from engineering to make better management decisions, even in the case of once-in-a-blue-moon events where statistics are scant, unreliable or non-existent.

Paté-Cornell argues that a true "black swan" – an event that is impossible to imagine because we've known nothing like it in the past – is extremely rare. The AIDS virus is one of very few examples. Usually, there are important clues and warning signs of emerging hazards (e.g., a new flu virus) that can be monitored to guide quick risk management responses.

..."Risk analysis is not about predicting anything before it happens, it's just giving the probability of various scenarios," she said. She argues that systematically exploring those scenarios can help companies and regulators make smarter decisions before an event in the face of uncertainty.

An engineering risk analyst thinks in terms of systems, their functional components and their dependencies, Paté-Cornell said. ... She added that the same systematic approach is also relevant to the human aspects of risk analysis.
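A toy example of that systems view (my own illustration, with assumed components and failure probabilities): a system that fails if its pump fails, or if both of its redundant sensors fail. Once the functional dependencies are written down, the failure probability composes directly from the component probabilities.

```python
# A toy functional decomposition (illustrative, not from the article):
# the system fails if the pump fails, or if both redundant sensors fail.
# Components are assumed independent.

p_pump = 0.01      # assumed annual failure probability of the pump
p_sensor = 0.05    # assumed failure probability of each redundant sensor

p_both_sensors = p_sensor * p_sensor                  # redundancy: both must fail
p_system = 1 - (1 - p_pump) * (1 - p_both_sensors)    # either branch brings it down

print(f"P(system failure) = {p_system:.4f}")
```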

...Paté-Cornell has found that human errors, far from being unpredictable, are often rooted in the way an organization is managed. "We look at how the management has trained, informed and given incentives to people to do what they do and assign risk based on those assessments," she said.
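One way to read that, sketched below with assumed numbers: treat the organization's management state (how people are trained and incentivized) as a random variable, condition the error rate on it, and apply the law of total probability.

```python
# Illustrative only: the organization's state shifts the operator error rate.
# P(error) = sum over states of P(error | state) * P(state).

states = {
    # state: (P(state), P(operator error | state))  -- assumed numbers
    "well trained, good incentives": (0.7, 0.001),
    "well trained, poor incentives": (0.2, 0.01),
    "poorly trained":                (0.1, 0.05),
}

p_error = sum(p_state * p_err for p_state, p_err in states.values())
print(f"P(operator error) = {p_error:.4f}")
```

On these assumed numbers, the rare "poorly trained" state dominates the overall error rate, which is exactly the sense in which human error becomes a management quantity rather than an act of fate.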

...."Lots of people don't like probability because they don't understand it," she said, "and they think if they don't have hard statistics, they cannot do a risk analysis." In fact, we generally do a system-based risk analysis because we do not have reliable statistics about the performance of the whole system.

Two black swans, photo by fir0002 | flagstaffotos.com.au, via Wikimedia Commons, under a Creative Commons Attribution-NonCommercial 3.0 Unported license.
