“We can be blind to the obvious, and we are also blind to our blindness” — Daniel Kahneman

Let’s say I have a coin, and you watch me flip it and get heads 99 times in a row. I tell you that the coin is fair. What is the probability that this coin flips heads on the 100th try?

Think about this: a fair coin has a 1 in 2^99 chance of landing heads 99 times in a row, which is less than 1 in an octillion. If you flipped 99 coins every second for all of eternity, you would expect, on average, one run of 99 simultaneous heads roughly every 1.5 trillion times the current age of the universe.
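If you want to check that arithmetic yourself, here is a quick back-of-the-envelope sketch in Python. It treats each second as one attempt of 99 simultaneous flips and takes the universe to be roughly 13.8 billion years old; both numbers are assumptions used only to reproduce the rough figure above.

```python
# Back-of-the-envelope check: how rare is a run of 99 heads from a fair coin?
p_run = 0.5 ** 99                                    # probability of 99 heads in a row
seconds_per_success = 1 / p_run                      # expected seconds until one success,
                                                     # at one 99-flip attempt per second
universe_age_seconds = 13.8e9 * 365.25 * 24 * 3600   # ~4.4e17 seconds

print(f"P(99 heads in a row): {p_run:.2e}")          # ~1.6e-30
print(f"Expected wait: {seconds_per_success / universe_age_seconds:.2e} universe ages")  # ~1.5e12
```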

So, the question really requires you to evaluate whether it’s more likely for a fair coin to actually flip heads 99 times in a row or for the assumption of fairness to be flawed.

It’s easy to remain within the frame you’re given and answer with the first thing that comes to mind: 50%. But if you slow down and think about it, you’ll probably realize the true probability is closer to 100%.
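One way to make that reframing concrete is a small Bayesian sketch. The one-in-a-million prior that the coin is secretly two-headed is an assumption chosen purely for illustration; almost any non-zero prior gives the same answer, because the evidence from 99 straight heads is overwhelming.

```python
# Bayesian sketch: fair coin vs. a two-headed coin, after observing 99 heads.
p_trick = 1e-6                    # assumed prior that the coin always lands heads
p_fair = 1 - p_trick

like_fair = 0.5 ** 99             # P(99 heads | fair coin)
like_trick = 1.0                  # P(99 heads | two-headed coin)

post_fair = p_fair * like_fair / (p_fair * like_fair + p_trick * like_trick)
p_heads_next = post_fair * 0.5 + (1 - post_fair) * 1.0

print(f"P(coin is fair | 99 heads): {post_fair:.2e}")      # ~1.6e-24
print(f"P(heads on flip 100): {p_heads_next:.12f}")        # indistinguishable from 1
```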


The psychologist Daniel Kahneman explains the operation of our minds using a ‘dual system’ metaphor. One system is fast and frugal; this default mode, which he calls System 1, is the intuitive operation of the mind that lets you catch a baseball without having to compute differential equations. The other, System 2, is slow and deliberate; it’s what we would (usually) use when deciding which baseball team to bet on.

So when confronted with the coin flip question, the fast System 1 provides the easily retrievable answer of “50%”. You think within the presented frame.

Biologically, it makes sense that System 1 is the default: the ancient man on the plateau who took a moment to think deeply about whether he should run from the approaching lion likely took his leave from the gene pool before passing along his contemplative disposition.

This is all well and good for maximizing survival in the ancient world. The only problem is that most of the important things we make decisions about today — careers, finances, life plans — belong in the domain of System 2. Defaulting to System 1 in these domains creates a tendency to overlook important sources of complexity, which can be dangerous since complexity often contains the sources of uncertainty that matter most.

For instance, the bell curve is called the “normal” distribution because it is frequently observed in nature. It fails, however, in complex systems. Consider that two randomly selected people whose combined height is 14 feet will most likely both be around 7 feet tall. Not only is it incredibly unlikely for the pair to be something like 2 and 12 feet tall, but even if they were, it would not meaningfully disrupt the average human height, so predictions based on the bell curve make sense here. However, if you randomly selected two books whose combined readership is 1 million, the most probable split is something like 997,000 and 3,000.
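To see the contrast concretely, here is a small sketch that fixes the sum and asks which split of that sum is most likely under each distribution. All of the parameters (the mean and spread of human height, the Pareto tail exponent, the 1,000-reader floor) are assumptions chosen purely for illustration.

```python
import numpy as np
from scipy import stats

# Heights are roughly bell-curved. If two randomly chosen people's heights
# sum to 14 ft, what is the most likely split? (Assume mean 5.75 ft, sd 0.3 ft.)
h = np.linspace(2.0, 12.0, 100_001)                  # candidate heights for person A (ft)
h_pdf = stats.norm(loc=5.75, scale=0.3).pdf
h_best = h[np.argmax(h_pdf(h) * h_pdf(14.0 - h))]    # maximize joint density along A + B = 14
print(f"Most likely height split: {h_best:.1f} ft and {14.0 - h_best:.1f} ft")        # 7.0 and 7.0

# Readership follows a power law (Pareto). Same question for two books whose
# readerships sum to 1,000,000. (Assume tail exponent 1.2, minimum 1,000 readers.)
r = np.linspace(1_000, 999_000, 100_001)             # candidate readership for book A
r_pdf = stats.pareto(b=1.2, scale=1_000).pdf
r_best = r[np.argmax(r_pdf(r) * r_pdf(1_000_000 - r))]
print(f"Most likely readership split: {r_best:,.0f} and {1_000_000 - r_best:,.0f}")   # 1,000 and 999,000
```

Under the bell curve, the probability mass concentrates on an even split; under the power law, it piles up at the lopsided extremes, where one book has almost all of the readers.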

Misunderstanding uncertainty in cases like the latter can be dangerous because large deviations are not only more common than the bell curve would predict but also determine the properties of the whole. If Bill Gates walks into a bar with 10 people in it, the average per-person net worth in that bar is suddenly somewhere around $6-$10 billion, even if the other 10 people don’t own a cent. Averages clearly don’t work here because wealth is not normally distributed; like many things in modern life, wealth is power-law distributed (see below).
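A two-line comparison of mean and median makes the point; the round $100 billion figure for Gates is an assumption for illustration only.

```python
# Mean vs. median net worth in the bar.
net_worths = [0] * 10 + [100e9]                 # ten patrons with nothing, plus Bill Gates

mean = sum(net_worths) / len(net_worths)
median = sorted(net_worths)[len(net_worths) // 2]

print(f"Mean net worth:   ${mean:,.0f}")        # ~$9,090,909,091
print(f"Median net worth: ${median:,.0f}")      # $0
```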

Power laws (green) govern much of modern uncertainty — such as wealth distribution, book readership, war casualties, and network connectivity — and imply a frequency of high-impact rare events which the Normal distribution (blue) deems impossible. Note the positive difference between the green and blue lines after 18. Image: Power Laws in Venture

It is therefore hard to generalize from observations in these environments because, almost by definition, any single sample is statistically unlikely to contain the most important elements. This type of randomness, termed “scalable” by Nassim Taleb and the mathematician Benoit Mandelbrot, underpins most of what matters in our lives.

“The world we live in is vastly different than the world we think we live in.” — Nassim Taleb

Our mental machinery, including all its heuristics and biases, usually serves us well. However, it evolved to maximize survival in a very different world, so it is critical to understand and account for its sometimes troublesome functioning.

Originally published at medium.com