Monday, March 21, 2011

Fukushima: Too soon for hindsight?

According to this tweet, it's too soon to use the crisis in Japan in a risk management article:
This is a fallacy. The opposite is true: it's probably already too late to write a good risk management article.

Risk management is a lot like buying stocks. Hindsight is 20-20: it's easy to see which stocks you should've bought, but a lot harder to figure out which stocks will go up in the future. A lot of friends tell me "I knew Apple's stock was going to go up," but of course, they didn't buy Apple stock, nor will they bet on which stocks will go up in the future.

In much the same way, it's easy to look at risk in hindsight. Obviously, putting backup generators at ground level (and electrical connections in the basement) is a bad idea in a tsunami zone. But we'll fix that and move on -- the next major nuclear catastrophe will be caused by something else entirely. What's important to risk management is dealing with the next obvious-in-hindsight problem.

There were a lot of good risk management articles written by nuclear experts at the start of the crisis that accurately predicted what we are seeing now, a week later. They didn't predict the precise outcome -- they instead predicted a range of outcomes and their likelihoods. Even if a less likely outcome happens, that doesn't make them wrong. It's like predicting the odds in Vegas. Even if your outcome is different from what the odds predict, it doesn't make the odds wrong.
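To make that concrete, here's a minimal sketch in Python. The outcome labels and probabilities are made up for illustration, not taken from any actual forecast; the point is only that stated odds are judged by long-run frequencies across many events, not by the single outcome that happens to occur.

import random

random.seed(0)

# Hypothetical probabilities a forecaster might assign to one event's outcomes.
forecast = {"contained": 0.70, "partial meltdown": 0.25, "large release": 0.05}
outcomes = list(forecast)
weights = [forecast[o] for o in outcomes]

# Simulate many comparable events governed by those same odds.
trials = 100000
counts = {o: 0 for o in outcomes}
for _ in range(trials):
    counts[random.choices(outcomes, weights)[0]] += 1

for o in outcomes:
    print(f"{o:17s} forecast {forecast[o]:.2f}   observed {counts[o] / trials:.2f}")

# Any single trial can land on the 5% "large release" case without the 5%
# figure being wrong; the odds are validated over the long run, not one event.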

Catastrophic events are horrible for risk management because of the way they give know-nothings more credit than they deserve. The Three Mile Island incident stopped the building of nuclear power plants for more than 20 years. Anybody who predicted nuclear power was unsafe suddenly became the "expert", no matter how little they really knew. Anybody saying the opposite, that nuclear power was safe, wasn't taken as a credible source, no matter how much expertise they had. The sad thing about risk management is that those with hindsight are respected more than those with foresight.

An expert in risk management isn't the person with the best record of predicting (the future, or the past through hindsight), but the person with the most comprehensive knowledge of all the uncertainties. Take the Three Mile Island incident as an example. Nobody died. No significant radiation leaked into the environment. Yet, most people believe that many died from leaked radiation. Fewer still know basic facts about radiation, such as the fact that a coal-fired power plant puts more radiation into the atmosphere than a nuclear power plant. Risk managers know these facts.

The best risk management articles wouldn't be about what went wrong, but what went right. The containment vessels were designed 40 years ago to deal with the unthinkable, and when the unthinkable happened, they held. That's pretty impressive. Moreover, even if the Fukushima meltdown gets much worse, the amount of radiation released into the environment would still leave nuclear power much safer and cleaner than the oil industry. The point of risk management isn't stopping things that are risky, but evaluating the costs to see if they are worth the benefits.

This hindsight fallacy is best explained by this South Park clip. Skip ahead to 1:10 (one minute, ten seconds) into the clip, to the spot where Captain Hindsight solves the BP oil spill:

http://www.southparkstudios.com/clips/360434/god-bless-you-captain-hindsight

I'm sure once Fukushima is finally under control in a few weeks there will be a rash of risk management articles telling us what we should've known two weeks ago.

1 comment:

Unknown said...

Actually Rob, it's way too early to do any sort of hindsight analysis.

http://newschoolsecurity.com/2011/03/actually-it-is-too-early-for-fukushima-hindsight/