Book Review: Thinking, Fast and Slow by Daniel Kahneman


Since childhood, we are taught to manage our emotions, since they are often wrong and can take control of us. We are instead encouraged to use our rational mind. In school, we learn about philosophers who emphasized the power of the rational mind over the emotional brain. Indeed, the rational mind is what makes us human. It is our rational mind that allowed us to build cities, found empires, and ultimately become aware of the infinite beauty of the universe. But is this rational mind so infallible?

Modern Psychology

Daniel Kahneman, one of the fathers of contemporary psychology, presents in his book Thinking, Fast and Slow the research he conducted over decades with his colleague Amos Tversky on how our mind deals with probability and statistics. Their studies unraveled the way our so-called “rational mind” works. According to Kahneman, we have two systems of thinking: System 1 and System 2. The former is effortless, automatic, and intuitive, while the latter is slower, more deliberate, and effortful. The problem lies in the fact that System 2 is lazy and does not intervene often.

Heuristics and Biases

Our cognitive machinery often relies on shortcuts. In other words, our brain has evolved to find solutions swiftly rather than accurately. When our mind performs these shortcuts (so-called heuristics), it often leads us into biases (systematic errors). Those biases make us take wrong decisions, ignoring basic principles of probability and statistics. In short, it seems our brain is not naturally wired to think in statistical terms.

How we make sense of the world

A good example of how we ignore basic statistical principles altogether is when you are asked whether you would be willing to pay more to insure yourself against anything whatsoever or only against terrorism. Put this way, the answer may seem predictable. But suppose that, before being asked, you watched the news and learned that a terrorist attack had killed hundreds of people (on the other side of the globe). Chances are you would then be willing to pay more for the insurance against terrorism than for the general insurance, even though the general insurance also covers terrorist attacks.

Knowledge does not make us more knowledgeable

Many paradoxes emerge from Thinking, Fast and Slow, though a few are particularly striking. For instance, Kahneman admits that even though he is aware of all these biases, that awareness has not prevented him from falling into them. In other words, knowledge sometimes doesn’t help. Often, information even prevents us from understanding the world as it is. For instance, a historian who has studied his entire life may be convinced that he knows why certain things happened in the past, and therefore that he can also predict the future. But the opposite scenario often occurs. In short, the historian ends up falling into many (expert) traps, where his knowledge convinces him that he understands what he does not. In fact, we are pattern-seeking machines which, in continuous need of our daily dose of patterns, tend to see connections where none exist. The other paradox is that the more of an expert you are in certain fields (history and finance are good examples), the more you fall into those traps.

Do we stand a chance?

I am not sure how Kahneman would answer this question (although odds are he would be pessimistic about it). Nonetheless, I do believe that by knowing our biases we can at least try to set up as controlled an environment as possible. In short, we should work on building a system rather than trying to change our nature altogether. The system will work for us and steer us toward the desired outcome, instead of our relying solely on brains that try their best to keep up with the modern environment we created.

Published by

Gennaro Cuofano

