The following is adapted from Surfing Rogue Waves.

Did you know that your brain is an extraordinarily complex system of billions upon billions of neurons that communicate with the rest of your body within fractions of a second? This system forms patterns that organize your emotions, behaviors, and thoughts, both consciously and unconsciously. 

Our brains enable higher sensory perception, motor commands, spatial reasoning, and language, capacities that together set humans apart. Do we fully understand, structurally, how billions of loosely coupled neurons, firing across synapses, give rise to these incredible functions? In some cases, yes. In others, not so much. 

Our brains enable our intelligence, and this intelligence created our scientific mind. Scientific claims are established through reliable, repeatable experiments. Through science, we now know how modern-day humans came to be. Evolution and our brains’ development help us understand how humans became the dominant species on Earth. 

Why should you care about any of this? Is anything about this important? Or are these just some random facts—interesting, perhaps, but not particularly relevant to your life? 

Turns out, there is relevance: when we understand these concepts, and understand how and why our brains are more than just intelligent machines, we can look inward to find and correct the things that don’t work, while enhancing what does. In other words, understanding how our brains work, and what their shortcomings are, lets us be more rational and better understand the truth. And that, I think you’ll agree, is something our world desperately needs right now. 

More Information Isn’t Always Better

Humans make mistakes. One-off mistakes usually come down to having insufficient data or knowledge at the moment of decision. It happens, but these aren’t the kinds of mistakes we need to worry over. After all, these are the mistakes we can fix: gather new information, make better decisions, and commit fewer errors in the future. 

For example, imagine you are washing freshly picked apples from a barrel. You reach in and pull out an apple, and since that is all you have seen, you could reasonably believe that there are only apples in the barrel. 

But as you pull out more items, you find that there are also potatoes in the barrel. With this new information, it would be rational for you to no longer conclude that there were only apples in the barrel. 

But you must be careful. There are traps! In some situations, the information you gather comes from biased methods. In those instances, more information could make your prediction worse. 

Beware of Bias

Let’s look at how bias might lead to inaccuracies, returning to our apple and potato barrel. This time, instead of a small barrel, we have a large, deep one. 

Now imagine that this huge barrel is filled with water to soak a mix of 100 apples and potatoes. Picture being asked to reach in over the rim, so that you cannot see inside, pull the items out one at a time, and guess the makeup of apples versus potatoes left in the barrel. 

At first, you pull out an apple, and believing that more information means a more accurate guess, you proceed to pull out another 40 apples in a row. After so many apples and not a single potato, you would likely conclude that the barrel contains nothing but apples. 

However, what you fail to account for is that apples float and potatoes sink. There could very well be many more potatoes than apples in the barrel. 
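To make the flaw concrete, here is a minimal sketch in Python. The counts, and the simplifying assumption that every floating apple gets drawn before any potato, are my illustrations, not numbers from the book:

```python
def biased_draws(n_apples=50, n_potatoes=50, n_draws=40):
    """Simulate drawing from a water-filled barrel where apples float.

    Because apples sit on top, the draw order is imposed by physics:
    every apple comes out before any potato is ever touched.
    """
    ordered = ["apple"] * n_apples + ["potato"] * n_potatoes
    return ordered[:n_draws]

sample = biased_draws()
naive_estimate = sample.count("apple") / len(sample)  # fraction of apples seen
true_proportion = 50 / (50 + 50)                      # actual fraction in barrel

print(naive_estimate)   # the sample says the barrel is all apples
print(true_proportion)  # the barrel is really half potatoes
```

Forty draws look like overwhelming evidence, yet the naive estimate lands nowhere near the true proportion, because the extra draws all came through the same biased channel.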

This is a systematic error: the sampling method misrepresents the population in a specific direction, and that bias led you to draw the wrong conclusion. The key is to keep the potential for bias in mind so you can avoid these mistakes when assessing new information. 

We All Have Cognitive Biases

Humans are vulnerable to far more than data-based traps; other errors come from the very architecture of our brains. Human minds are famous for their cognitive biases. Unlike random, one-off errors caused by limited information or misused data, cognitive biases are systematic errors in how we think, and they draw us further from the truth. Worse yet, cognitive biases are not something we learn; they are the result of our evolution. 

Throughout our evolution, understanding and knowing the truth was never essential to our survival. Historically, as individuals, it was more important to believe what others believed so we could get along in our communities, regardless of what may or may not have been true. 

However, things have changed. At the speed our world is moving, understanding cognitive biases is essential to determining where we might end up. Biases may have mattered less in the snail-paced, linear world our ancestors lived in, but accurately mapping our actions and beliefs to today’s real world is now more important than ever. 

Luckily for us—because they are far more than just intelligent machines—our brains have the ability to understand and become aware of our own cognitive biases. How? Because we have the ability to think rationally.

Rationality is Key

Rationality is about forming and using our beliefs to make decisions that most accurately reflect what will happen in the real world. It’s a way to increase our chances of being correct, without necessarily always being correct.

It is best to look at rationality as two concepts: epistemic rationality and instrumental rationality. Epistemic rationality is the work of systematically improving the accuracy of our beliefs: the art of updating what we believe in response to new evidence. Think of it as an attempt to map our beliefs ever more closely onto the realities of the world. 
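The idea of updating beliefs on new evidence can be sketched with Bayes’ rule. This toy example is my illustration, not the book’s, and it reuses the barrel: each apple drawn nudges up your belief that the barrel holds only apples, while a single potato would refute it. The likelihood figures are assumptions chosen for illustration:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1.0 - prior) * p_evidence_if_false
    return numerator / denominator

# Hypothesis: "the barrel contains only apples."
# If true, every draw is an apple (likelihood 1.0).
# If false, an apple still comes up often, say 60% of the time (assumed).
belief = 0.5                       # start undecided
for _ in range(5):                 # five apples drawn in a row
    belief = bayes_update(belief, 1.0, 0.6)

print(round(belief, 3))            # belief has risen, but is not certainty
```

Notice that even five confirmations in a row leave the belief well short of 1.0; this is the mathematical face of holding beliefs proportional to the evidence rather than leaping to certainty.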

Instrumental rationality is the work of systematically achieving our values: the art of choosing actions that lead to outcomes ranked higher in our preferences. 

Rationality Must Define Our Decisions 

Here’s where we bring it out of the realm of the theoretical and back to how this affects how we all live our lives. Our rationality defines the decisions we make and the actions we take. Living in a world of constant flux, we no longer have the luxury of a fixed, controlled environment to analyze our options before acting. 

The way we address our challenges today is vastly different from how previous generations tackled them. Today we are in a constant state of improvisation as we attempt to keep up with the onslaught of disruptions and changes that persistently emerge in life. 

The increasingly interconnected and interdependent world is driving such rapid change that we stand little chance of surviving without considering how complexity impacts us. We must understand that our brains are far more than intelligent machines, that they are capable of identifying and overcoming our cognitive biases. We must learn to think rationally if we want to make optimal, everyday decisions. It truly is the only way to survive and, more importantly, thrive.

For more advice on how to implement a framework to overcome cognitive biases and make the best decisions in any situation, you can find Surfing Rogue Waves on Amazon.

Eric Pilon-Bignell is a pragmatic futurist focused on addressing disruption by increasing the creative capacity of individuals, teams, and organizations to ignite change, drive innovation, and foster continuous growth. Eric has an undergraduate degree in engineering, an MBA in Information Systems, and a Ph.D. in Global Leadership. His doctoral work primarily explored complexity sciences centered on executive cognition and executives’ use of intuitive improvisation, decision-making, artificial intelligence, and data-based decision models. When he is not working with clients, researching, or writing, he can be found in the mountains or on the water. He founded PROJECT7 to raise awareness and money for research on brain-related illnesses. Eric is currently working and living with his wife in Chicago, Illinois. To connect or learn more about this book, Eric, or PROJECT7, please visit