## Sunday, March 20, 2011

### Calculating calamity: Japan's nuclear accident and the "antifragile" alternative

Nassim Nicholas Taleb, famed student of risk and probability and author of The Black Swan, tells us that in 2003 Japan's nuclear safety agency set as a goal that fatalities resulting from radiation exposure among civilians living near any nuclear installation in Japan should be no more than one every million years. Eight years after that goal was adopted, it looks likely to be exceeded, perhaps by quite a bit, especially now that radiation is showing up in food and water near the stricken Fukushima Dai-ichi plant. (Keep in mind that "fatalities" refers not just to immediate deaths but also to excess cancer deaths due to radiation exposure, which can take years and even decades to show up.)

Taleb writes that it is irresponsible to ask people to rely on the calculation of small probabilities for man-made systems since these probabilities are almost impossible to calculate with any accuracy. (To read his reasoning, see entry 142 on the notebook section of his website entitled "Time to understand a few facts about small probabilities [criminal stupidity of statistical science].") Natural systems that have operated for eons may more easily lend themselves to the calculation of such probabilities. But man-made systems have a relatively short history to draw from, especially the nuclear infrastructure which is no more than 60 years old. Calculations for man-made systems that result in incidents occurring every million years should be dismissed on their face as useless.

Furthermore, he notes, models used to calculate such risk tend to underestimate small probabilities. What's worse, the consequences are almost always wildly underestimated as well. Beyond this, if people are told that a harmful event has a small chance of happening, say, 1 in 1,000, they tend to dismiss it, even if that event might have severe consequences. This is because they don't understand that risk is the product of probability times severity.

If the worst that walking across your room could do is cause a bruise from falling, you wouldn't think much about it. Even if the chance of getting a bruise were significant, you'd probably be careful and figure it's worth the risk. But if walking across your room subjected you to the possibility of losing your arm, you might contemplate your next move a bit more.
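The room-crossing comparison can be sketched numerically. The following is a toy illustration of risk as probability times severity; the probabilities and the harm scale are invented for the example, not drawn from Taleb:

```python
def expected_loss(probability, severity):
    """Risk as the product of probability and severity (expected harm)."""
    return probability * severity

# A likely but minor harm: a bruise from falling while crossing the room.
# Harm is measured in arbitrary units; a bruise counts as 1 unit.
bruise_risk = expected_loss(probability=0.10, severity=1)

# A rare but severe harm: losing an arm (10,000 units on the same scale).
lost_arm_risk = expected_loss(probability=0.001, severity=10_000)

# Despite being 100 times less likely, the severe event carries
# 100 times the risk of the minor one.
print(bruise_risk, lost_arm_risk)
```

The point of the sketch is that probability alone tells you almost nothing; a tiny probability multiplied by a catastrophic severity can dwarf a large probability of a trivial harm.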

But the point Taleb makes is that the people of Japan did not know they were subjecting themselves to so severe a risk. If they had, they might have prepared for it, or they might even have rejected nuclear power altogether in favor of other energy sources. But both the probability and the severity of this event were outside the models the regulatory agencies used. This is one of the major reasons we so often underestimate risk and severity. And even if such an event had been included in the models, its consequences would most likely have been considerably underestimated.

It is the nature of complex societies to continually underestimate risks. What we tend to do is to assign a probability to a possible harmful event and think that by assigning that probability we have understood the event and its consequences. It is a kind of statistical incantation that is no more useful than shouting at the rain. But because it comes wrapped inside a pseudo-scientific package, we are induced to believe it. If important men and women with PhDs have calculated the numbers, they must be reliable, right?

When it comes to calculating the extremes of physical attributes such as the height or weight of human beings, we have a large number of cases and we have the limits of biology and physics to guide us. No human can be 100 feet tall or weigh 10,000 pounds. But when it comes to social phenomena, we are often lost. Human-built systems produce unpredictable outcomes precisely because humans are so unpredictable. They have behavior patterns, but those patterns can't be described in equations. In our world, millions and even billions of people are making decisions which affect markets, technology and society every day, and no one is capable of observing and calculating the effects of such decisions. This makes any resulting patterns difficult if not impossible to ascertain. And, when we try to gauge the effect of actual and possible natural phenomena on human-built systems and vice versa with the precision of several decimal places, we are only fooling ourselves.

So what should we do? Normally, we say we should try to make our systems more robust, that is, harder to destroy or cripple under extreme conditions. This seems altogether reasonable. But what if there is another choice? What if it is possible to build systems that thrive when subjected to large variations? Taleb points to such a possibility in an article entitled "Antifragility or The Property Of Disorder-Loving Systems." The text is difficult unless you've read his other work extensively. But look at the chart, and you will begin to get an idea of what he means by antifragility.
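The chart in Taleb's article contrasts concave (fragile) responses to disorder with convex (antifragile) ones. A minimal numerical sketch of that idea, using simple quadratic payoffs as illustrative stand-ins (these functions are my assumption, not taken from his article): by Jensen's inequality, a convex system's average outcome under random shocks is better than its outcome with no variation at all, while a concave system's is worse.

```python
import random

random.seed(0)

def fragile(x):
    """Concave response: variation can only hurt."""
    return -x ** 2

def antifragile(x):
    """Convex response: variation helps on average."""
    return x ** 2

def mean(values):
    return sum(values) / len(values)

# Random shocks with mean zero and standard deviation 1.
shocks = [random.gauss(0, 1) for _ in range(100_000)]

# With no variation (x = 0), both systems score 0.
# Under variation, the fragile system averages about -1,
# while the antifragile system averages about +1.
print(mean([fragile(x) for x in shocks]))
print(mean([antifragile(x) for x in shocks]))
```

The asymmetry, not the average shock, does the work: the same zero-mean disorder degrades the concave system and feeds the convex one, which is the property Taleb's chart is meant to convey.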

The relocalization movement should take note that as serious a thinker as Taleb has characterized a decentralized, artisan-based culture as one that is antifragile. It might be useful to figure out how to explain this advantage to interested audiences who are watching the complex systems of modern society crumble around them.

#### 3 comments:

Anonymous said...

As an anesthesiologist, I have the concepts of risk and benefit always on my mind. There are certain techniques that have a low probability of adverse events, but the associated adverse event is probable death. There are other techniques that may be safer but take more time. It has always amazed me that some of my colleagues, even when the adverse event happens, cannot see that risk is probability x severity and thus continue to use said technique.

Anonymous said...

Believe me, professional statisticians (if they do their job conscientiously) are very good at calculating the probabilities and risks of certain events occurring. The problem is the range of possible events that could occur and the resulting interactions - obviously only a limited subset is evaluated, so any risk analysis is inherently incomplete.

Take your walking across the room for example - perhaps the worst that could happen is that you trip and fall and strike your head on the corner of your bed. If you're a heavy person this could actually cause quite a bit of damage.

Mark Goldes said...

Alternatives to new nuclear plants, and later to existing ones, are being born.

An inexpensive, green, Low Energy Nuclear Reactor (LENR) is now in production.

It is inherently much safer than existing nukes and uses non-radioactive Nickel, not radioactive Uranium, as fuel.

Power cost is projected at one penny per kilowatt hour.

No nuclear waste is produced.

See Cold Fusion at www.aesopinstitute.org to learn more.

A one Megawatt heating plant is scheduled to open in Greece, in October.

A nuclear scientist has said that when these compact modular units, which can be linked like solar panels to produce any desired power level, begin producing inexpensive electricity, it will start a "stampede".

Competitive designs are being developed. Early regulatory approval may prove possible.

These developments could cost-competitively undercut any need for new Uranium fueled nuclear plant production.

And LENR designs have no possible chance of a meltdown!

They can be a building block for decentralized energy generation 24/7.
To paraphrase Taleb, big is fragile and ugly. Small is still beautiful.