The concept of cognitive biases was introduced and popularized by the work of Amos Tversky and Daniel Kahneman, starting in 1972. Biases are seen as systematic errors and flaws that make humans deviate from the standards of rationality, rendering us inept at making good decisions under uncertainty.
That is the conventional definition; let’s see what’s wrong with it and why we want to start from an alternative definition.
Why we got biases all wrong
In my previous article about heuristics, we saw why heuristics can be powerful thinking tools for business people dealing with uncertainty on a daily basis.
When I define heuristics, though, I’m not using the conventional definition (to be clear, not the definition given by Kahneman), and we’ll see why that same definition is biased in the first place.
That forces us to reconsider the whole thinking model that business people, entrepreneurs, managers, and all the practitioners out there have been influenced by (count me in).
That is why I decided to analyze the main flaws of the conventional way of looking at biases and cognitive errors.
Let’s recap here some of the key points about what heuristics are really about and why they make sense for business people.
Then we’ll go through the core mistakes of the conventional view of biases and cognitive errors, which sits at the foundation of behavioral economics and much more.
Finally, we’ll ask the fundamental question: what’s next?
Context matters
When dealing with real-life scenarios, we relate to them based on the context we live in. A Halloween costume worn on a casual Friday won’t look as odd as the same costume worn on a regular day.
Humans think in narrow contexts not because they are narrow-minded but primarily because a successful decision often comes down to surviving a specific situation.
At the same time, our minds are capable of understanding, at a deep level (neither logical nor explainable), the subtleties of the real world, made of hidden costs, risks, and high uncertainty.
In this scenario, things that might seem irrational are not so when looked at from a different perspective.
A classic, often-cited example is how humans are “loss averse,” giving much more weight to a $10 loss, say, than to an equivalent $10 gain.
To the modern psychologist, marketer, or businessman, that might seem irrational, a signal of the human mind’s limitations and stupidity.
However, in real-world scenarios, things are never so clean and clear. Often the problem is hidden, so hidden that being loss averse is just a natural, time-tested defense mechanism against possible screw-ups.
Warren Buffett’s whole playbook can be summarized as: “Rule No. 1: Never lose money. Rule No. 2: Don’t forget rule No. 1.”
Even the most amateur stock trader knows that losing money is way worse than gaining it. If you start from a $100 investment and you lose 50%, you end up with $50.
However, to get back to where you were, at $100, you will need to gain 100%. In short, a 50% loss calls for a 100% gain just to return to the starting point.
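To make the math concrete, here is a minimal Python sketch of this asymmetry (the function name and the 75% figure are mine, added purely for illustration beyond the article’s 50% example):

```python
def required_recovery_gain(loss_fraction: float) -> float:
    """Fractional gain needed to break even after a fractional loss.

    After losing `loss_fraction` of your capital you hold (1 - loss_fraction)
    of it, so you must multiply what's left by 1 / (1 - loss_fraction),
    i.e. gain loss_fraction / (1 - loss_fraction), to get back where you were.
    """
    if not 0 <= loss_fraction < 1:
        raise ValueError("loss_fraction must be in [0, 1)")
    return loss_fraction / (1 - loss_fraction)

print(f"{required_recovery_gain(0.50):.0%}")  # 100% -- the article's example
print(f"{required_recovery_gain(0.75):.0%}")  # 300% -- the asymmetry compounds
```

The asymmetry grows nonlinearly: the deeper the loss, the disproportionately larger the gain needed to recover, which is one concrete reason why weighting losses more heavily than gains is not obviously irrational.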
Polymath Jared Diamond, in his book, The World Until Yesterday, talks about constructive paranoia.
He learned this concept while living with several tribes in New Guinea. For instance, those tribes had a cultural norm of not sleeping under big trees, due to a seemingly irrational fear that they might fall.
Indeed, there is a very low probability of that happening on any given night. However, if it does, there is no way back: you’re dead.
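A quick back-of-the-envelope calculation, using a purely hypothetical per-night probability (Diamond doesn’t give one), shows why this paranoia is constructive:

```python
# Hypothetical, illustrative number: a 1-in-1,000 chance per night that a
# tree falls on you. The per-event risk is tiny; the repeated exposure is
# what kills.
p_per_night = 0.001

for nights in (1, 365, 3650):
    # Probability of ruin over repeated exposure: 1 - P(surviving every night)
    p_ruin = 1 - (1 - p_per_night) ** nights
    print(f"{nights:>5} nights -> {p_ruin:6.1%} cumulative chance of dying")
```

Under these assumed numbers, a negligible nightly risk becomes roughly a 30% chance over a year and near-certainty over a decade. Because the loss is unrecoverable, avoiding big trees is exactly what a rational survivor would do.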
In most real-life scenarios, those potential losses carry hidden risks which, since they can’t be computed, are ignored by psychologists, yet are not hidden to the human mind.
So it’s better to be paranoid than a dead smart person. Tribesmen know better, while some modern psychologists have forgotten.
What if risk aversion is just constructive paranoia? This is one of many examples of how biases could easily be reframed.
A narrow definition of rationality
Modern psychologists have primarily looked at one side of rationality and assumed that’s all there is. This has led to the mainstream acceptance of a distorted theory of mind, one that focuses on the cognitive errors humans make, devoid of any context, and has produced an endless list of biases that we stupid humans supposedly fall into.
While it is admirable to move from a psychological framework where humans are infallible to one that understands and studies the flaws of our minds, it is just as bad to fall for the opposite thinking model, where the human mind is seen as a mere artifact of an ancient time, one that produces only errors because it can no longer deal with the modern world.
That is why, in recent years, one of the most used mantras in business, marketing, sales, or any endeavor dealing with human behavior has been “biases and cognitive fallacies.” Yet, as we’ll see, those fallacies are mostly rationality in the real world, applied contextually.
Kahneman’s fundamental error
Scholars like Kahneman and Tversky have changed the way we think about how we think.
In the book “Thinking, Fast and Slow,” Kahneman recounts a whole career spent understanding how humans deal with decision-making, especially in relation to uncertainty, and whether humans are good “intuitive statisticians.”
As Kahneman’s work would show, people are not good intuitive statisticians, and a two-system model of thinking drives our decision-making in the real world.
From these assumptions, heuristics produced biases, and those biases, in turn, were systematic errors that made us irrational.
Later on, Kahneman would draw a more balanced view, in which judgments and choices aren’t based just on heuristics but also on skills.
Thus, biases would also be the result of expert overconfidence: the more skills you acquire in a certain field, the more confident you become about them, and the more easily you fall into cognitive biases.
Kahneman’s work has led to endless lists of human irrationality, of humans’ complete inadequacy in forming a clear picture of the real world, and of our inability to deal with logic.
From psychology, these have spread straight into economics, decision-making, and any other endeavor related to human behavior (marketing, sales, entrepreneurship, and more), becoming the dominant thinking models.
Yet this view is extremely narrow, and it leads to the opposite excess: psychologists and practitioners become producers of an ever-growing list of biases meant to show how irrational we are.
While this production has some literary value, it carries little value for the business person trying to make things work in the real world. If anything, that view can be limiting and damaging.
Redefining biases
Some of the fundamental errors of the conventional view are the following:
- Out of context: the problem with the currently dominant theories around biases is their focus on the behavioral aspects (how we say we would act in a certain hypothetical scenario, or how we act in completely non-contextual scenarios) vs. how we really act in a specific, real context.
- What is rationality, really? If we define rationality as the ability to follow logical rules, then we are all irrational. If we redefine rationality as the ability to survive specific context-driven situations, then something like risk aversion can be reframed as constructive paranoia. Therefore, something we used to see as a cognitive error becomes a defense/survival mechanism, given the asymmetry of risk-taking and the fact that certain hidden risks can’t be known, or can be known only in hindsight.
- Do skills really create biases? Another limited view is the idea that skills cause biases. I think the problem is not one of skills but of whether, in certain domains, skills can be acquired at all. In certain areas, think of sports, the more you train, and the more deliberately you do it, the better you become. In other areas, like entrepreneurship and business in general, building skills is trickier. Each situation and scenario has its own subtleties, and experience (not skills) makes us act in certain ways that we can’t even explain. Can we really call that a skill?
- Are biases really biases? Following the points above, you can see that biases aren’t biases at all when looked at through the lens of a different definition of rationality.
If you agree with all the points above, does it still make sense to keep using this thinking model?
What’s next? Beyond the “bias bias” and into real-world decision-making
Gerd Gigerenzer, in “The Bias Bias in Behavioral Economics,” explains how Kahneman’s work has led to a tendency to “spot biases even when there are none.”
As Gigerenzer explained, people “have largely fine-tuned intuitions about chance, frequency, and framing.”
In other words, there is little evidence that these supposed biases carry any real cost at all. Therefore, each time you see a bias proposed by psychologists, you might want to keep a skeptical eye and trust your fine-tuned intuition and acquired experience as a business person!
References:
- Gerd Gigerenzer, “The Bias Bias in Behavioral Economics,” Review of Behavioral Economics, 2018
- Gerd Gigerenzer and Wolfgang Gaissmaier, “Heuristic Decision Making,” Annual Review of Psychology, 2011, 62:451–82
- Amos Tversky and Daniel Kahneman, “Judgment Under Uncertainty: Heuristics and Biases”
- Daniel Kahneman, Thinking, Fast and Slow
- Gerd Gigerenzer, Risk Savvy: How to Make Good Decisions
Key Highlights
- Introduction of Cognitive Biases: Cognitive biases were introduced by the work of Amos Tversky and Daniel Kahneman in 1972. They refer to systematic errors that lead humans to deviate from rational decision-making under uncertainty.
- Reframing Biases as Heuristics: Gerd Gigerenzer emphasizes that the term “heuristic” is rooted in Greek and signifies a fast and accurate way to make decisions in uncertain real-world situations. Biases, rather than being errors, are heuristics that guide decision-making contextually.
- Contextual Decision-Making: Humans tend to think in narrow contexts because successful decisions often relate to specific situations. This is a natural survival strategy where hidden costs, risks, and uncertainty play significant roles.
- Loss Aversion and Constructive Paranoia: Loss aversion, often seen as irrational, can be reframed as a natural defense mechanism against hidden risks and potential screw-ups. Contextually, the aversion to losses can be seen as constructive paranoia, a defense mechanism against uncertain outcomes.
- Narrow Definition of Rationality: Modern psychology focused on cognitive errors while ignoring the context, leading to a distorted view of rationality. Both extremes, viewing humans as infallible or inherently flawed, hinder understanding of decision-making in the real world.
- Kahneman’s Approach: Kahneman’s work led to the identification of biases and systematic errors, presenting humans as irrational. While insightful, this approach disregards contextual skills and judgment.
- Redefining Biases: Biases can be redefined as contextually driven heuristics that aid survival and decision-making in specific scenarios. This reframing challenges the traditional biases perspective.
- Skills and Biases: Not all skills lead to biases. Skills in different domains have varying impacts on decision-making. In areas like entrepreneurship, experience shapes behavior more than skills.
- Are Biases Really Biases?: Through the lens of reframing, biases cease to be biases in the traditional sense. They become adaptive responses to context and uncertainty.
- Beyond the “Bias Bias”: The tendency to spot biases even where they may not exist has led to a “bias bias.” Fine-tuned intuitions and experience play a crucial role in decision-making, especially for business practitioners.
- Next Steps: Gigerenzer’s work questions the prevalent thinking model of biases and cognitive fallacies. Instead, he advocates for understanding heuristics within real-world decision-making contexts.
Bias | Description | Implication |
---|---|---|
Anchoring Bias | The tendency to rely too heavily on the first piece of information encountered when making decisions. | Can lead to decisions being influenced disproportionately by initial information, even if it’s irrelevant or misleading. |
Confirmation Bias | The tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses. | Can result in overlooking contradictory evidence, leading to flawed decision-making and reinforcing existing biases. |
Availability Heuristic | Estimating the likelihood of events based on their availability in memory; the more readily available something is, the more likely it is judged to be true. | May lead to overestimating the probability of vivid or recent events, while underestimating the probability of less memorable events. |
Bandwagon Effect | The tendency to do or believe things because many other people do or believe the same. | Can lead to conformity without critical evaluation, potentially resulting in irrational decisions or actions. |
Base Rate Fallacy | Ignoring statistical information about general principles in favor of singular cases or specific details. | Can result in inaccurate judgments and decisions by failing to consider relevant background information or probabilities. |
Belief Bias | Evaluating the strength of an argument based on the believability of its conclusion, rather than its validity. | Can lead to accepting weak arguments if the conclusion aligns with one’s beliefs, regardless of the logic or evidence presented. |
Choice-Supportive Bias | The tendency to remember one’s choices as better than they actually were. | May lead to overestimating the value of past decisions, potentially influencing future decision-making inappropriately. |
Clustering Illusion | The tendency to see patterns or significance in random or meaningless data. | Can result in seeing trends or connections where none exist, leading to faulty interpretations and decisions. |
Conservatism Bias | The tendency to revise one’s belief insufficiently when presented with new evidence. | May lead to clinging to outdated or incorrect beliefs despite contrary evidence, hindering effective decision-making and learning. |
Curse of Knowledge | When an individual’s own knowledge leads them to incorrectly assume that others have the same knowledge. | Can lead to ineffective communication and difficulty in understanding others’ perspectives, hindering collaboration and decision-making. |
Dunning-Kruger Effect | The phenomenon where people with low ability in a particular domain overestimate their ability, and those with high ability underestimate their ability. | Can lead to incompetence going unrecognized and potentially result in poor decision-making, particularly in areas where accurate self-assessment is crucial. |
Empathy Gap | The inability to understand or predict the emotional states or reactions of others, particularly when in a different emotional state. | May lead to misunderstandings, conflicts, and ineffective communication in personal and professional relationships. |
Fundamental Attribution Error | The tendency to attribute others’ behavior to internal factors (personality, disposition) while attributing our own behavior to external factors (situational influences). | Can result in unfair judgments and misunderstandings, particularly in interpersonal relationships and evaluations of others’ actions. |
Gambler’s Fallacy | The belief that the outcomes of random events are influenced by previous outcomes, leading to expectations of a “balancing” effect in the future. | Can lead to risky decision-making, such as in gambling or investment, where past outcomes are seen as predictive of future outcomes. |
Halo Effect | The tendency to judge a person or thing positively based on one positive characteristic or attribute. | Can lead to biased evaluations and decisions, as other traits or aspects may be overlooked or undervalued. |
Hindsight Bias | The tendency to perceive past events as having been more predictable than they actually were. | Can lead to overestimating one’s ability to predict outcomes and underestimating the role of chance or uncertainty in events. |
Illusion of Control | The tendency to overestimate one’s ability to control events or outcomes, even when such control is minimal or nonexistent. | Can lead to excessive risk-taking or failure to adequately prepare for potential negative outcomes. |
Illusory Superiority (Dunning-Kruger Effect) | The tendency of people to overestimate their abilities relative to others. | Can lead to individuals taking on tasks beyond their competence level or dismissing the expertise of others, resulting in poor decision-making and interpersonal conflicts. |
Impact Bias | The tendency to overestimate the intensity and duration of future emotional states. | Can lead to poor decision-making, as individuals may make choices based on inaccurate predictions of their future emotional reactions. |
In-group Bias | The tendency to favor individuals within one’s own group over those from outside the group. | Can lead to prejudice, discrimination, and favoritism, impacting decisions in social, professional, and political contexts. |
Information Bias | The tendency to seek information when it does not affect action. | Can lead to information overload, wasting time and resources on gathering data that doesn’t contribute to decision-making or action. |
Irrational Escalation (Sunk Cost Fallacy) | The tendency to continue investing in a failing endeavor because of the resources (time, money, effort) already invested. | Can lead to poor decision-making, as individuals prioritize past investments over future prospects, disregarding the actual costs and benefits. |
Loss Aversion | The tendency to prefer avoiding losses over acquiring equivalent gains. | Can lead to risk aversion and reluctance to take necessary risks for potential gains, hindering innovation and growth. |
Negativity Bias | The tendency to focus more on negative experiences or information than positive ones. | Can lead to disproportionate fear, anxiety, and pessimism, influencing decision-making and overall well-being. |
Normalcy Bias | The refusal to plan for, or react to, a disaster which has never happened before. | Can lead to underestimating the likelihood or severity of unexpected events, resulting in inadequate preparation and response. |
Observer-Expectancy Effect | When a researcher’s expectations about the outcome of a study influence the results obtained. | Can lead to biased interpretations of data and results, undermining the validity and reliability of research findings. |
Ostrich Effect | The decision to ignore dangerous or negative information by “burying” one’s head in the sand, like an ostrich. | Can lead to ignoring warning signs or risks, resulting in avoidable negative consequences or missed opportunities. |
Outcome Bias | The tendency to judge a decision based on its outcome rather than the quality of the decision at the time it was made. | Can lead to unfair evaluations and learning incorrect lessons from past experiences, particularly in situations with uncertain outcomes. |
Overconfidence Bias | The tendency to overestimate one’s own abilities, knowledge, or judgment. | Can lead to taking on tasks beyond one’s capability, making poor decisions, and underestimating risks, potentially resulting in failure. |
Pareidolia | The tendency to perceive a pattern, often an image or sound, where none exists. | Can lead to seeing faces in random objects or hearing messages in white noise, potentially resulting in misinterpretations and irrational beliefs. |
Peak-End Rule | The tendency to judge an experience largely based on how it was at its peak (best or worst) and how it ended, rather than the total sum or average of every moment of the experience. | Can influence decision-making by prioritizing memorable moments or endings over overall experiences, leading to skewed evaluations and choices. |
Planning Fallacy | The tendency to underestimate the time, costs, and risks of future actions while overestimating the benefits. | Can lead to project delays, budget overruns, and failure to achieve objectives due to inadequate planning and unrealistic expectations. |
Post-Purchase Rationalization | The tendency to justify a purchase or decision after it has been made, even if it was not the best choice. | Can lead to overlooking or downplaying negative aspects of a decision, reducing cognitive dissonance but hindering learning and improvement. |
Pro-innovation Bias | The tendency to have an excessive optimism towards an innovation or new technology, underestimating its limitations and potential negative consequences. | Can lead to overlooking risks and challenges associated with new technologies or innovations, potentially resulting in unexpected failures or negative impacts. |
Reactance | The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice. | Can lead to resistance to persuasion, authority, or rules, hindering effective communication and cooperation. |
Recency Bias | The tendency to weigh the latest information more heavily than older data when making decisions. | Can lead to overlooking long-term trends or ignoring important historical data, resulting in suboptimal decisions. |
Regret Aversion | The tendency to avoid actions that may lead to regret or blame. | Can lead to missed opportunities or stagnation, as individuals may opt for safer choices even if they offer lower potential rewards. |
Restraint Bias | The tendency to overestimate one’s ability to show restraint in the face of temptation or pressure. | Can lead to underestimating the likelihood of succumbing to temptations or making impulsive decisions, resulting in self-control failures. |
Salience Bias | Focusing on the most noticeable or prominent information while ignoring less conspicuous factors. | Can lead to overlooking important but less apparent factors, resulting in incomplete or biased assessments and decisions. |
Selective Perception | The tendency to selectively interpret what one sees based on their interests, background, experience, and attitudes. | Can lead to misunderstandings, miscommunication, and biased judgments, particularly in situations where multiple interpretations are possible. |
Sunk Cost Fallacy | The belief that additional investment is justified in a failing endeavor, based on the cumulative prior investment (“sunk costs”), despite new evidence suggesting that the cost, starting today, of continuing the endeavor outweighs the expected benefit. | Can lead to irrational decision-making, as individuals focus on past losses rather than future prospects, resulting in further losses and missed opportunities. |