I first heard about the Black Swan in the context of a paper mentioned by Chetan Parikh (of CapitalIdeasOnline) in one of the Book Club meetings. The author of the paper, Nassim Taleb, is a Lebanese-born financial trader and the author of a book, Fooled by Randomness. He is currently founder and chairman of Empirica LLC, a research laboratory and financial products trading house in New York, and Fellow in Mathematics in Finance and Adjunct Professor at the Courant Institute of New York University. The title of the paper is The Black Swan: Why Don't We Learn that We Don't Learn? [Malcolm Gladwell, the author of The Tipping Point, has profiled Nassim in The New Yorker.]
Nassim explains a black swan event in an op-ed page article in the New York Times (April 8, 2004):
A black swan is an outlier, an event that lies beyond the realm of normal expectations. Most people expect all swans to be white because that’s what their experience tells them; a black swan is by definition a surprise. Nevertheless, people tend to concoct explanations for them after the fact, which makes them appear more predictable, and less random, than they are. Our minds are designed to retain, for efficient storage, past information that fits into a compressed narrative. This distortion, called the hindsight bias, prevents us from adequately learning from the past.
Black swans can have extreme effects: just a few explain almost everything, from the success of some ideas and religions to events in our personal lives. Moreover, their influence seems to have grown in the 20th century, while ordinary events, the ones we study and discuss and learn about in history or from the news, are becoming increasingly inconsequential.
Consider: How would an understanding of the world on June 27, 1914, have helped anyone guess what was to happen next? The rise of Hitler, the demise of the Soviet bloc, the spread of Islamic fundamentalism, the Internet bubble: not only were these events unpredictable, but anyone who correctly forecast any of them would have been deemed a lunatic (indeed, some were). This accusation of lunacy would have also applied to a correct prediction of the events of 9/11, a black swan of the vicious variety.
A vicious black swan has an additional elusive property: its very unexpectedness helps create the conditions for it to occur. Had a terrorist attack been a conceivable risk on Sept. 10, 2001, it would likely not have happened. Jet fighters would have been on alert to intercept hijacked planes, airplanes would have had locks on their cockpit doors, airports would have carefully checked all passenger luggage. None of that happened, of course, until after 9/11.
Nassim provides the wider context in a talk on Edge:
Consider two types of randomness. The first type is physical randomness: in other words, the probability of running into a giant taller than seven, eight, or nine feet, which in the physical world is very low. The probability of running into someone 200 miles tall is definitely zero; because you have to have a mother of some size, there are physical limitations. The probability that a heat particle will go from here to China, or from here to the moon, is extremely small since it needs energy for that. These distributions tend to be "bell-shaped", Gaussian, with tractable properties.
But in the random variables we observe today, like prices, what I call Type-2 randomness, anything that's informational, the sky is the limit. It's "wild" uncertainty. As the Germans saw during the hyperinflation episode, a currency can go from one to a billion, instantly. You can name a number; nothing physical can stop it from getting there. What is worrisome is that nothing in the past statistics could have helped you guess the possibility of such a hyperinflation effect. People can become very powerful overnight on a very small idea.
Take the Google phenomenon or the Microsoft effect: "all-or-nothing" dynamics. The equivalent of Google, where someone just takes over everything, would have been impossible to witness in the Pleistocene. These are more and more prevalent in a world where the bulk of the random variables are socio-informational with low physical limitations. That type of randomness is close to impossible to model, since a single observation of large impact, what I called a Black Swan, can destroy the entire inference.
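Taleb's distinction between the two types of randomness can be illustrated with a small simulation. The sketch below (my illustration, not Taleb's; the choice of a Pareto distribution with tail index 1.1 as a stand-in for "wild" randomness is an assumption) compares how much of a total a single largest observation can contribute under bell-shaped versus heavy-tailed randomness:

```python
import random

random.seed(42)
N = 100_000

# Type-1 "physical" randomness: magnitudes drawn from a bell curve
# (think heights). No single draw can dominate the total.
bell = [abs(random.gauss(0, 1)) for _ in range(N)]

# Type-2 "wild" randomness: a heavy-tailed Pareto draw, standing in for
# socio-informational quantities like wealth or market share.
# (tail index 1.1 is an illustrative assumption, not Taleb's parameter.)
wild = [random.paretovariate(1.1) for _ in range(N)]

def max_share(xs):
    """Fraction of the whole sum contributed by the single largest draw."""
    return max(xs) / sum(xs)

print(f"bell-shaped: largest draw is {max_share(bell):.4%} of the total")
print(f"wild:        largest draw is {max_share(wild):.4%} of the total")
```

In the bell-shaped case the largest of 100,000 draws is a vanishing fraction of the sum, so averages and past statistics are informative; in the heavy-tailed case a single observation can account for a large share of the total, which is exactly why one Black Swan can "destroy the entire inference."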
Tomorrow: Nassim Taleb (continued)
TECH TALK: Black Swans