Over the weekend, I went for a walk and listened to Nassim Nicholas Taleb’s talk at the Long Now (viewable at the Whole Earth site, and summarized here). I’ve been doing this for a few weekends now – I can never pay enough attention to listen to a talk like that if I’m at home because I get distracted, but going for a nice long hour-and-a-half walk is a good way to burn off some energy and get educated at the same time. I recommend the Long Now podcast if you’re looking for good talks from interesting intellectuals.

Taleb recently published The Black Swan, a follow-up to his original book, Fooled by Randomness, and he uses the talk to discuss some of the ideas from that book. I won’t try to summarize the whole talk, but he made two key points that I want to record for future reference. Both derive from how our intuitions and our mental tools are not equipped to handle nonlinear models. This may seem like an abstruse topic, but it had very real consequences in the subprime meltdown, when investors’ theories did not take into account the nonlinear ways their models could fail.

Taleb posits two worlds: Mediocristan and Extremistan. He describes Mediocristan by having the audience imagine a group of 100 people and the distribution of their weights. Then he asks how the average weight of the group would change if we added the heaviest person in the world to that group. It turns out not to affect the average that much – even if we add a 1,000 pound person, it shifts the average by only 0.5% or so. This is the world of the normal Gaussian distribution, which we understand very well with standard deviations and the like.

Now do the same thought experiment, but use people’s wealth instead. Imagine a group of 100 typical people, and their average wealth. Now add Bill Gates to the group. At this point, 100 of the 101 people in the group are below average in wealth, and Bill Gates has approximately 100% of the wealth of the group. This is the world of Extremistan, where outliers can blow up the normal distribution. This is the world of the Black Swan.
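The two thought experiments can be checked with a bit of arithmetic. The figures below (a 180 lb average weight, a $100k average net worth, roughly $100B for Gates) are my own assumptions for illustration, not numbers from the talk:

```python
# Back-of-the-envelope check of Taleb's two thought experiments.
# The 180 lb and $100k figures are illustrative assumptions, not from the talk.

def average_shift(group, newcomer):
    """Return (old_avg, new_avg) when the newcomer joins the group."""
    old_avg = sum(group) / len(group)
    new_avg = (sum(group) + newcomer) / (len(group) + 1)
    return old_avg, new_avg

# Mediocristan: 100 people at an assumed 180 lb average, plus a 1,000 lb outlier.
old_w, new_w = average_shift([180] * 100, 1000)
print(f"weight: {old_w:.0f} -> {new_w:.1f} lb ({(new_w - old_w) / old_w:.1%} shift)")
# weight: 180 -> 188.1 lb (4.5% shift)

# Extremistan: 100 people at an assumed $100k net worth, plus roughly $100B.
gates = 100_000_000_000
old_m, new_m = average_shift([100_000] * 100, gates)
print(f"wealth: mean jumps by a factor of {new_m / old_m:,.0f}")
print(f"newcomer's share of total wealth: {gates / (new_m * 101):.2%}")
```

With these assumptions the weight shift comes out closer to 5% than the 0.5% quoted above (a commenter below makes the same correction), but the qualitative contrast stands: the outlier barely moves the weight average, while Gates holds essentially 100% of the group’s wealth.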

And what’s interesting is how bad we are at dealing with Extremistan. We just don’t intuitively get it, even though we are surrounded by examples of it. Finance and wealth. Book publishing (a significant portion of all book sales are Harry Potter books). The music industry. eBay. We live in an Extremistan world, but our intuition (evolved in a simpler time without network effects) is still stuck in Mediocristan. So we have to beware of our instincts, because they will get the wrong answers. And we have to beware of charlatans using Mediocristan theories just because those theories are calculable – it’s like physicists treating everything as a simple harmonic oscillator because that’s the only equation they can solve.

Another example of Extremistan comes from a completely different source. I’m currently reading Poor Charlie’s Almanack, a book of the wisdom of Charlie Munger, Warren Buffett’s investment partner. Munger notes: “If you look at Berkshire Hathaway and all of its accumulated billions, the top ten insights account for most of it.” He also quotes Buffett as saying:

“I could improve your ultimate financial welfare by giving you a ticket with only twenty slots in it so that you had twenty punches – representing all the investments that you got to make in a lifetime. And once you’d punched through the card, you couldn’t make any more investments at all. Under those rules, you’d really think carefully about what you did, and you’d be forced to load up on what you’d really thought about.”

The typical investment strategy is diversification – invest in lots of things and trust in the average, which would work in Mediocristan. Buffett and Munger have internalized the idea of Extremistan in investing and exploited it to their advantage: they realized that a few successes will be wildly out of proportion to the norm, and they target only those investments.

The other illustration of nonlinearity that Taleb used was to imagine an ice cube melting into a small puddle. Now imagine starting with the puddle and trying to reconstruct what the ice cube looked like. You can get the volume of the ice cube, but you cannot derive its shape, because an infinite number of shapes could have melted and left that puddle. In other words, there is not sufficient information in the final state to determine the initial state; information is lost in the process. He uses this observation to illustrate why he doesn’t trust theories: because the observable world does not constrain theories enough, many theories can fit the existing data without providing predictive power.

This multiplicity could also be illustrated with the sequence 1, 2, 3. What’s the next number? Most of us would answer 4, but the answer could be anything from 0.1 to 100,000. I can construct an equation that gives any answer you choose as the fourth entry in that sequence. There are an infinite number of possibilities that fit the available data. Taleb reminds us of this multiplicity and displays extreme skepticism when decisions are made on the basis of believing just one possible theory.
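The claim about the sequence is easy to demonstrate. A cubic polynomial can be made to pass through (1, 1), (2, 2), (3, 3) and then hit any fourth value you like – the sketch below uses Lagrange interpolation, and the target value of 100,000 is an arbitrary choice:

```python
# Sketch: construct a cubic f(n) that matches the sequence 1, 2, 3 at
# n = 1, 2, 3 but produces ANY chosen fourth value at n = 4.

def fit_sequence(target):
    """Cubic through (1,1), (2,2), (3,3), (4,target) via Lagrange interpolation."""
    points = [(1, 1), (2, 2), (3, 3), (4, target)]

    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            # Basis polynomial: 1 at xi, 0 at every other sample point.
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total

    return f

f = fit_sequence(100_000)
print([round(f(n), 6) for n in (1, 2, 3, 4)])  # [1.0, 2.0, 3.0, 100000.0]
```

Swap in any other target (0.1, 42, a million) and the first three values stay exactly 1, 2, 3 – which is the point: three data points cannot pin down the rule that generated them.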

Taleb’s ice cube reminds me of a discussion from Zen and the Art of Motorcycle Maintenance. Pirsig quotes Poincaré as saying, “If a phenomenon admits of a complete mechanical explanation it will admit of an infinity of others which will account equally well for all the peculiarities disclosed by experiment.” This is the dirty secret of science – theories are worth nothing, because an infinite number of theories can explain any experimental result, including such outlandish ones as the Flying Spaghetti Monster. Popper’s claim that theories must be falsifiable to be scientific is a consequence of this – every theory is always one experimental observation away from being disproven. Scientists live in a world where they are sifting through an infinity of possible theories, trying to choose the one that best fits their observations, while knowing that their theories can never be proven true, only proven false.

I don’t really have any deep analysis here. I liked the visual imagery Taleb used to illustrate his points, and wanted to record that in this post. After listening to his talk, I may have to get The Black Swan from the library this summer to see if the rest of the book is of similar quality.

The percentage in that first example is off by an order of magnitude – if you add a 1,000 pound person to a hundred normally-distributed people, the average shifts by roughly eight pounds, which is about 5 percent of the original average.

But yeah, long-tailed power-law / Pareto / log-normal distributions are not as common as Gaussian distributions, but they are still quite common in nature. You get normal distributions whenever you average together a bunch of independent random processes. You get long-tailed distributions whenever you have a positive feedback loop that causes bigger things to grow faster.
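The commenter’s two mechanisms are easy to simulate. The sketch below (parameters like 100 steps and ±10% shocks are arbitrary choices for illustration) adds up independent shocks in one case and compounds them multiplicatively in the other; the additive process stays symmetric with mean and median close together, while the multiplicative one grows a long right tail:

```python
# Toy simulation: additive randomness vs. multiplicative (feedback) randomness.
import random
import statistics

random.seed(0)

def additive(n_steps=100):
    # Sum of independent shocks: the central limit theorem pushes this
    # toward a Gaussian, so mean and median stay close together.
    return sum(random.random() for _ in range(n_steps))

def multiplicative(n_steps=100):
    # Size compounds by a random factor each step (bigger things grow by
    # more in absolute terms), producing a long right tail.
    size = 1.0
    for _ in range(n_steps):
        size *= 1 + random.uniform(-0.1, 0.1)
    return size

adds = [additive() for _ in range(10_000)]
mults = [multiplicative() for _ in range(10_000)]

for name, xs in [("additive", adds), ("multiplicative", mults)]:
    print(f"{name:>14}: median={statistics.median(xs):7.3f}  "
          f"mean={statistics.fmean(xs):7.3f}  max={max(xs):8.3f}")
```

In the additive run the maximum sits only a little above the median; in the multiplicative run the mean exceeds the median and the maximum is many times larger – the Extremistan signature, where a single outlier dominates.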

In many ways, the internet is all about the long tail (hello Amazon! hello Netflix!), which makes me wonder if we’ll start to get a better feel for these kinds of distributions as a populace now that we’re exposed to them more commonly. (Of course, that’s probably limited by the degree to which people take notice of random distributions in the first place, which is not very much, because the vast majority of people are depressingly innumerate and we have to fight a bunch of innate biases…)

One reason for the popularity of the Normal distribution is, as I’m sure Taleb knows, that the resulting mathematics is tractable – i.e., not impossible to work through. That is not true of the common distributions with fatter tails. I think this, rather than any pseudo-evolutionary explanation about our fitness for Mediocristan, is the reason for the popularity of Normal distributions in science and in finance.

For the last 350 years, mainstream western culture has been entranced by the formal over the informal, by text over image, by the written over the spoken, by the universal over the particular, by the permanent over the ephemeral, by theory over practice. Perhaps the worst example of this systematic bias is mathematical economics, to which our modern world owes so many of its ills.