
What Are Biases Really And Why We Got It All Wrong About Biases

The concept of cognitive biases was introduced and popularized by the work of Amos Tversky and Daniel Kahneman, beginning in 1972. Biases are seen as systematic errors and flaws that make humans deviate from the standards of rationality, thus making us inept at making good decisions under uncertainty.

That is the conventional definition. Let’s see what’s wrong with it and why we might want to start from an alternative definition.

Why did we get it all wrong about biases?

As highlighted by German psychologist Gerd Gigerenzer in the paper “Heuristic Decision Making,” the term heuristic is of Greek origin, meaning “serving to find out or discover.” More precisely, a heuristic is a fast and accurate way to make decisions in the real world, which is driven by uncertainty.

In my previous article about heuristics, we saw why heuristics can be powerful thinking tools for business people dealing with uncertainty on a daily basis.

When I define heuristics, though, I’m not using the conventional definition (to be clear, not the definition given by Kahneman), and we’ll see why that conventional definition is itself biased in the first place.

That forces us to reconsider the whole thinking model that business people, entrepreneurs, managers, and practitioners out there (count me in) have been influenced by.

That is why I decided to analyze the main flaws of the conventional way of looking at biases and cognitive errors.

Let’s recap here some of the key points of what heuristics are really about and why they make sense for business people.

Then we’ll go through the core mistakes of the conventional view of biases and cognitive errors at the foundation of behavioral economics and much more.

Then we’ll ask the fundamental question: what’s next?

Context matters

When dealing with real-life scenarios, we relate to them based on the context we live in. A Halloween costume worn on a casual Friday won’t look as odd as the same costume worn on a regular day.

Humans think in narrow contexts not because they are narrow-minded, but primarily because a successful decision is often about surviving a specific situation.

At the same time, our minds are capable of understanding at a deep level (neither logical nor explainable) the subtleties of the real world, made of hidden costs, risks, and high uncertainty.

In this scenario, things that might seem irrational are not so when looked at from a different perspective.

A classic example that gets cited often is that humans are “loss averse,” giving much more weight to, say, a $10 loss than to an equivalent $10 gain.

To the modern psychologist, marketer, or businessman, that might seem irrational and a signal of the human mind’s limitations and stupidity.

However, in real-world scenarios, things are never so clean and clear. Often the problem is hidden, so hidden that being loss averse is just a natural, time-tested defense mechanism against possible screw-ups.

Warren Buffett’s whole playbook can be summarized as: “Rule No. 1: Never lose money. Rule No. 2: Don’t forget rule No. 1.”

Even the most amateur stock trader knows that losing money is way worse than gaining it. If you start from a $100 investment and you lose 50%, you end up with $50.

However, to get back to where you were, at $100, you will need to gain 100%. In short, a 50% loss calls for a 100% gain just to return to the starting point.
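To make the asymmetry concrete, here is a minimal sketch (illustrative only, not taken from any cited source) that computes the percentage gain needed to recover from a given percentage loss, using the relation gain = loss / (1 - loss):

```python
# Illustrative sketch: the asymmetry between a loss and the gain
# needed to recover from it.

def required_recovery_gain(loss_fraction: float) -> float:
    """Gain (as a fraction of the remaining capital) needed to break even."""
    return loss_fraction / (1 - loss_fraction)

if __name__ == "__main__":
    for loss in (0.10, 0.25, 0.50, 0.75):
        gain = required_recovery_gain(loss)
        print(f"A {loss:.0%} loss requires a {gain:.0%} gain to break even")
    # A 50% loss requires a 100% gain; a 75% loss requires a 300% gain.
```

The relation also shows why the asymmetry explodes as losses get larger, which is exactly why “never lose money” is rule No. 1.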

Polymath Jared Diamond, in his book, The World Until Yesterday, talks about constructive paranoia.

He learned this concept while living with several tribes in New Guinea. For instance, those tribes had a cultural norm of avoiding sleeping under big trees, due to a seemingly irrational fear that they might fall.

Indeed, there is a very low probability of that happening. However, if it does, there is no way back: you’re dead.
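A quick back-of-the-envelope sketch (with hypothetical numbers, not Diamond’s own figures) shows why a seemingly negligible nightly risk is worth being paranoid about once it is repeated over a lifetime:

```python
# Hypothetical numbers for illustration: a tiny per-night probability of a
# catastrophic event compounds into a large lifetime risk when repeated.

def cumulative_risk(per_event_probability: float, repetitions: int) -> float:
    """Probability of at least one catastrophic event over many repetitions."""
    return 1 - (1 - per_event_probability) ** repetitions

if __name__ == "__main__":
    p_per_night = 1 / 1000      # assumed: 0.1% chance per night under a dead tree
    nights = 100 * 30           # assumed: 100 such nights a year, for 30 years
    risk = cumulative_risk(p_per_night, nights)
    print(f"Lifetime risk of at least one fatal accident: {risk:.0%}")
    # With these assumed numbers, the lifetime risk is roughly 95%:
    # the "irrational" taboo starts to look like sound risk management.
```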

In most real-life scenarios, those potential losses carry hidden risks which, because they can’t be computed, are ignored by psychologists, yet they are not hidden to the human mind.

So it is better to be paranoid than a dead smart person. Tribesmen know this; some modern psychologists have forgotten it.

What if risk aversion is just constructive paranoia? This is one of many examples of how biases could easily be reframed.

A narrow definition of rationality

Modern psychologists have primarily looked at one side of rationality and assumed that’s all there is. This has led to the mainstream acceptance of a distorted theory of mind, one that focuses on the cognitive errors humans make, devoid of any context, and that has produced an endless list of biases that we supposedly stupid humans fall into.

While it is admirable to move from a psychological framework in which humans are infallible to one that studies the flaws of our minds, it is just as bad to fall for the opposite thinking model, where the human mind is seen as a mere artifact of an ancient time that only produces errors because it can no longer deal with the modern world.

That is why, in recent years, one of the most used mantras in business, marketing, sales, or any endeavor that deals with human behavior has been about “biases and cognitive fallacies.” Yet, as we’ll see, those fallacies are mostly rationality in the real world, applied contextually.

Kahneman’s fundamental error

Scholars like Kahneman and Tversky have changed the way we think about how we think.

In the book “Thinking, Fast and Slow,” Kahneman recounts a whole career spent understanding how humans deal with decision-making, especially in relation to uncertainty and the question of whether humans are good “intuitive statisticians.”

As Kahneman’s work would show, people are not good intuitive statisticians, and a two-system thinking model drives our decision-making in the real world.

From these assumptions, heuristics produced biases, and those biases, in turn, were systematic errors that made us irrational.

Later on, Kahneman would draw a more balanced view, in which judgment and choice aren’t based only on heuristics but also on skills.

Thus, biases would also result from expert overconfidence: the more skill you acquire in certain fields, the more confident you become about them, and the more easily you fall into cognitive biases.

Kahneman’s work has led to endless lists of human irrationality: humans’ complete inadequacy in forming a clear picture of the real world and our inability to deal with logic.

From psychology, straight into economics, decision-making and any other endeavor related to human behavior (marketing, sales, entrepreneurship and more) these have become the dominant thinking models.

Yet this view is extremely narrow, and it leads to the opposite excess. Psychologists and practitioners have become producers of an ever-growing list of biases meant to show how irrational we are.

While this production has some literary value, it doesn’t carry much value for the business person trying to make things work in the real world. If anything, that view can be limiting and damaging.

Redefining biases

Some of the fundamental errors are the following:

  • Out of context: the problem of the currently dominant theories around biases is the focus on the behavioral aspects (how we say we would act in a certain hypothetical scenario or how we act in completely noncontextual scenarios) vs. how we really act in a specific real context.
  • What is rationality, really? If we define rationality as the ability to follow logical rules, then we are all irrational. If we redefine rationality as the ability to survive specific context-driven situations, then something like risk aversion can be reframed as constructive paranoia. Therefore something that we used to see as a cognitive error, becomes a defense/survival mechanism given the asymmetry of risk-taking and the fact that certain hidden risks can’t be known, or can be known only in hindsight.
  • Do skills really create biases? Another limited view is that skills cause biases. I think the problem is not skills but whether, in certain domains, skills can be acquired at all. In certain areas, think of sports, the more you train, and the more deliberately you do it, the better you become. In other areas, like entrepreneurship and business in general, building skills is trickier. Each situation and scenario has its own subtleties, and experience (not skills) makes us act in certain ways that we can’t even explain. Yet can we call that a skill?
  • Are biases really biases? By following what’s above you can understand that biases aren’t so if looked through the lenses of a different definition of rationality.

If you agree with all the points above, does it still make sense to keep using this thinking model?

What’s next? Beyond the “bias bias” and into real-world decision-making

Gerd Gigerenzer, in “The Bias Bias in Behavioral Economics” explains how Kahneman’s work has led to the tendency to “spot biases even when there are none.”

As Gigerenzer explained, people “have largely fine-tuned intuitions about chance, frequency, and framing.”

In short, there is little evidence that these biases carry any real cost at all. Therefore, each time you see a bias proposed by psychologists, you might want to keep a skeptical eye and trust your fine-tuned intuition and acquired experience as a business person!

References:

  • The Bias Bias in Behavioral Economics, Gerd Gigerenzer, Review of Behavioral Economics, 2018
  • Heuristic Decision Making, Gerd Gigerenzer and Wolfgang Gaissmaier, Annual Review of Psychology, 2011, 62:451–482
  • Judgment Under Uncertainty: Heuristics and Biases, Amos Tversky and Daniel Kahneman
  • Thinking, Fast and Slow, Daniel Kahneman
  • Risk Savvy: How to Make Good Decisions, Gerd Gigerenzer

Key Highlights

  • Introduction of Cognitive Biases: Cognitive biases were introduced by the work of Amos Tversky and Daniel Kahneman in 1972. They refer to systematic errors that lead humans to deviate from rational decision-making under uncertainty.
  • Reframing Biases as Heuristics: Gerd Gigerenzer emphasizes that the term “heuristic” is rooted in Greek and signifies a fast and accurate way to make decisions in uncertain real-world situations. Biases, rather than being errors, are heuristics that guide decision-making contextually.
  • Contextual Decision-Making: Humans tend to think in narrow contexts because successful decisions often relate to specific situations. This is a natural survival strategy where hidden costs, risks, and uncertainty play significant roles.
  • Loss Aversion and Constructive Paranoia: Loss aversion, often seen as irrational, can be reframed as a natural defense mechanism against hidden risks and potential screw-ups. Contextually, the aversion to losses can be seen as constructive paranoia, a defense mechanism against uncertain outcomes.
  • Narrow Definition of Rationality: Modern psychology focused on cognitive errors while ignoring the context, leading to a distorted view of rationality. Both extremes, viewing humans as infallible or inherently flawed, hinder understanding of decision-making in the real world.
  • Kahneman’s Approach: Kahneman’s work led to the identification of biases and systematic errors, presenting humans as irrational. While insightful, this approach disregards contextual skills and judgment.
  • Redefining Biases: Biases can be redefined as contextually driven heuristics that aid survival and decision-making in specific scenarios. This reframing challenges the traditional biases perspective.
  • Skills and Biases: Not all skills lead to biases. Skills in different domains have varying impacts on decision-making. In areas like entrepreneurship, experience shapes behavior more than skills.
  • Are Biases Really Biases?: Through the lens of reframing, biases cease to be biases in the traditional sense. They become adaptive responses to context and uncertainty.
  • Beyond the “Bias Bias”: The tendency to spot biases even where they may not exist has led to a “bias bias.” Fine-tuned intuitions and experience play a crucial role in decision-making, especially for business practitioners.
  • Next Steps: Gigerenzer’s work questions the prevalent thinking model of biases and cognitive fallacies. Instead, he advocates for understanding heuristics within real-world decision-making contexts.

Common cognitive biases, their descriptions, and their implications:

  • Anchoring Bias: The tendency to rely too heavily on the first piece of information encountered when making decisions. Implication: Can lead to decisions being influenced disproportionately by initial information, even if it’s irrelevant or misleading.
  • Confirmation Bias: The tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses. Implication: Can result in overlooking contradictory evidence, leading to flawed decision-making and reinforcing existing biases.
  • Availability Heuristic: Estimating the likelihood of events based on their availability in memory; the more readily available something is, the more likely it is judged to be true. Implication: May lead to overestimating the probability of vivid or recent events, while underestimating the probability of less memorable events.
  • Bandwagon Effect: The tendency to do or believe things because many other people do or believe the same. Implication: Can lead to conformity without critical evaluation, potentially resulting in irrational decisions or actions.
  • Base Rate Fallacy: Ignoring statistical information about general principles in favor of singular cases or specific details. Implication: Can result in inaccurate judgments and decisions by failing to consider relevant background information or probabilities.
  • Belief Bias: Evaluating the strength of an argument based on the believability of its conclusion, rather than its validity. Implication: Can lead to accepting weak arguments if the conclusion aligns with one’s beliefs, regardless of the logic or evidence presented.
  • Choice-Supportive Bias: The tendency to remember one’s choices as better than they actually were. Implication: May lead to overestimating the value of past decisions, potentially influencing future decision-making inappropriately.
  • Clustering Illusion: The tendency to see patterns or significance in random or meaningless data. Implication: Can result in seeing trends or connections where none exist, leading to faulty interpretations and decisions.
  • Conservatism Bias: The tendency to revise one’s belief insufficiently when presented with new evidence. Implication: May lead to clinging to outdated or incorrect beliefs despite contrary evidence, hindering effective decision-making and learning.
  • Curse of Knowledge: When an individual’s own knowledge leads them to incorrectly assume that others have the same knowledge. Implication: Can lead to ineffective communication and difficulty in understanding others’ perspectives, hindering collaboration and decision-making.
  • Dunning-Kruger Effect: The phenomenon where people with low ability in a particular domain overestimate their ability, and those with high ability underestimate their ability. Implication: Can lead to incompetence going unrecognized and potentially result in poor decision-making, particularly in areas where accurate self-assessment is crucial.
  • Empathy Gap: The inability to understand or predict the emotional states or reactions of others, particularly when in a different emotional state. Implication: May lead to misunderstandings, conflicts, and ineffective communication in personal and professional relationships.
  • Fundamental Attribution Error: The tendency to attribute others’ behavior to internal factors (personality, disposition) while attributing our own behavior to external factors (situational influences). Implication: Can result in unfair judgments and misunderstandings, particularly in interpersonal relationships and evaluations of others’ actions.
  • Gambler’s Fallacy: The belief that the outcomes of random events are influenced by previous outcomes, leading to expectations of a “balancing” effect in the future. Implication: Can lead to risky decision-making, such as in gambling or investment, where past outcomes are seen as predictive of future outcomes.
  • Halo Effect: The tendency to judge a person or thing positively based on one positive characteristic or attribute. Implication: Can lead to biased evaluations and decisions, as other traits or aspects may be overlooked or undervalued.
  • Hindsight Bias: The tendency to perceive past events as having been more predictable than they actually were. Implication: Can lead to overestimating one’s ability to predict outcomes and underestimating the role of chance or uncertainty in events.
  • Illusion of Control: The tendency to overestimate one’s ability to control events or outcomes, even when such control is minimal or nonexistent. Implication: Can lead to excessive risk-taking or failure to adequately prepare for potential negative outcomes.
  • Illusory Superiority (Dunning-Kruger Effect): The tendency of people to overestimate their abilities relative to others. Implication: Can lead to individuals taking on tasks beyond their competence level or dismissing the expertise of others, resulting in poor decision-making and interpersonal conflicts.
  • Impact Bias: The tendency to overestimate the intensity and duration of future emotional states. Implication: Can lead to poor decision-making, as individuals may make choices based on inaccurate predictions of their future emotional reactions.
  • In-group Bias: The tendency to favor individuals within one’s own group over those from outside the group. Implication: Can lead to prejudice, discrimination, and favoritism, impacting decisions in social, professional, and political contexts.
  • Information Bias: The tendency to seek information when it does not affect action. Implication: Can lead to information overload, wasting time and resources on gathering data that doesn’t contribute to decision-making or action.
  • Irrational Escalation (Sunk Cost Fallacy): The tendency to continue investing in a failing endeavor because of the resources (time, money, effort) already invested. Implication: Can lead to poor decision-making, as individuals prioritize past investments over future prospects, disregarding the actual costs and benefits.
  • Loss Aversion: The tendency to prefer avoiding losses over acquiring equivalent gains. Implication: Can lead to risk aversion and reluctance to take necessary risks for potential gains, hindering innovation and growth.
  • Negativity Bias: The tendency to focus more on negative experiences or information than positive ones. Implication: Can lead to disproportionate fear, anxiety, and pessimism, influencing decision-making and overall well-being.
  • Normalcy Bias: The refusal to plan for, or react to, a disaster which has never happened before. Implication: Can lead to underestimating the likelihood or severity of unexpected events, resulting in inadequate preparation and response.
  • Observer-Expectancy Effect: When a researcher’s expectations about the outcome of a study influence the results obtained. Implication: Can lead to biased interpretations of data and results, undermining the validity and reliability of research findings.
  • Ostrich Effect: The decision to ignore dangerous or negative information by “burying” one’s head in the sand, like an ostrich. Implication: Can lead to ignoring warning signs or risks, resulting in avoidable negative consequences or missed opportunities.
  • Outcome Bias: The tendency to judge a decision based on its outcome rather than the quality of the decision at the time it was made. Implication: Can lead to unfair evaluations and learning incorrect lessons from past experiences, particularly in situations with uncertain outcomes.
  • Overconfidence Bias: The tendency to overestimate one’s own abilities, knowledge, or judgment. Implication: Can lead to taking on tasks beyond one’s capability, making poor decisions, and underestimating risks, potentially resulting in failure.
  • Pareidolia: The tendency to perceive a pattern, often an image or sound, where none exists. Implication: Can lead to seeing faces in random objects or hearing messages in white noise, potentially resulting in misinterpretations and irrational beliefs.
  • Peak-End Rule: The tendency to judge an experience largely based on how it was at its peak (best or worst) and how it ended, rather than the total sum or average of every moment of the experience. Implication: Can influence decision-making by prioritizing memorable moments or endings over overall experiences, leading to skewed evaluations and choices.
  • Planning Fallacy: The tendency to underestimate the time, costs, and risks of future actions while overestimating the benefits. Implication: Can lead to project delays, budget overruns, and failure to achieve objectives due to inadequate planning and unrealistic expectations.
  • Post-Purchase Rationalization: The tendency to justify a purchase or decision after it has been made, even if it was not the best choice. Implication: Can lead to overlooking or downplaying negative aspects of a decision, reducing cognitive dissonance but hindering learning and improvement.
  • Pro-innovation Bias: The tendency to have an excessive optimism towards an innovation or new technology, underestimating its limitations and potential negative consequences. Implication: Can lead to overlooking risks and challenges associated with new technologies or innovations, potentially resulting in unexpected failures or negative impacts.
  • Reactance: The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice. Implication: Can lead to resistance to persuasion, authority, or rules, hindering effective communication and cooperation.
  • Recency Bias: The tendency to weigh the latest information more heavily than older data when making decisions. Implication: Can lead to overlooking long-term trends or ignoring important historical data, resulting in suboptimal decisions.
  • Regret Aversion: The tendency to avoid actions that may lead to regret or blame. Implication: Can lead to missed opportunities or stagnation, as individuals may opt for safer choices even if they offer lower potential rewards.
  • Restraint Bias: The tendency to overestimate one’s ability to show restraint in the face of temptation or pressure. Implication: Can lead to underestimating the likelihood of succumbing to temptations or making impulsive decisions, resulting in self-control failures.
  • Salience Bias: Focusing on the most noticeable or prominent information while ignoring less conspicuous factors. Implication: Can lead to overlooking important but less apparent factors, resulting in incomplete or biased assessments and decisions.
  • Selective Perception: The tendency to selectively interpret what one sees based on their interests, background, experience, and attitudes. Implication: Can lead to misunderstandings, miscommunication, and biased judgments, particularly in situations where multiple interpretations are possible.
  • Sunk Cost Fallacy: The belief that additional investment is justified in a failing endeavor, based on the cumulative prior investment (“sunk costs”), despite new evidence suggesting that the cost, starting today, of continuing the endeavor outweighs the expected benefit. Implication: Can lead to irrational decision-making, as individuals focus on past losses rather than future prospects, resulting in further losses and missed opportunities.

Connected Thinking Frameworks

Convergent vs. Divergent Thinking

convergent-vs-divergent-thinking
Convergent thinking occurs when the solution to a problem can be found by applying established rules and logical reasoning. Divergent thinking, by contrast, is an unstructured problem-solving method in which participants are encouraged to develop many innovative ideas or solutions to a given problem. Convergent thinking might work for larger, mature organizations, whereas divergent thinking is more suited to startups and innovative companies.

Critical Thinking

critical-thinking
Critical thinking involves analyzing observations, facts, evidence, and arguments to form a judgment about what someone reads, hears, says, or writes.

Biases

biases
The concept of cognitive biases was introduced and popularized by the work of Amos Tversky and Daniel Kahneman in 1972. Biases are seen as systematic errors and flaws that make humans deviate from the standards of rationality, thus making us inept at making good decisions under uncertainty.

Second-Order Thinking

second-order-thinking
Second-order thinking is a means of assessing the implications of our decisions by considering future consequences. Second-order thinking is a mental model that considers all future possibilities. It encourages individuals to think outside of the box so that they can prepare for every eventuality. It also discourages the tendency for individuals to default to the most obvious choice.

Lateral Thinking

lateral-thinking
Lateral thinking is a business strategy that involves approaching a problem from a different direction. The strategy attempts to remove traditionally formulaic and routine approaches to problem-solving by advocating creative thinking, therefore finding unconventional ways to solve a known problem. This sort of non-linear approach to problem-solving can, at times, create a big impact.

Bounded Rationality

bounded-rationality
Bounded rationality is a concept attributed to Herbert Simon, an economist and political scientist interested in decision-making and how we make decisions in the real world. In fact, he believed that rather than optimizing (which was the mainstream view in the past decades) humans follow what he called satisficing.

Dunning-Kruger Effect

dunning-kruger-effect
The Dunning-Kruger effect describes a cognitive bias where people with low ability in a task overestimate their ability to perform that task well. Consumers or businesses that do not possess the requisite knowledge make bad decisions. What’s more, knowledge gaps prevent the person or business from seeing their mistakes.

Occam’s Razor

occams-razor
Occam’s Razor states that one should not increase (beyond reason) the number of entities required to explain anything. All things being equal, the simplest solution is often the best one. The principle is attributed to 14th-century English theologian William of Ockham.

Lindy Effect

lindy-effect
The Lindy Effect is a theory about the ageing of non-perishable things, like technology or ideas. Popularized by author Nassim Nicholas Taleb, the Lindy Effect states that non-perishable things like technology age linearly in reverse. Therefore, the older an idea or a technology, the longer its remaining life expectancy.

Antifragility

antifragility
Antifragility was first coined as a term by author and options trader Nassim Nicholas Taleb. Antifragility is a characteristic of systems that thrive as a result of stressors, volatility, and randomness. Therefore, antifragile is the opposite of fragile. Where a fragile thing breaks under volatility, a robust thing resists volatility. An antifragile thing gets stronger from volatility (provided the level of stressors and randomness doesn’t pass a certain threshold).

Systems Thinking

systems-thinking
Systems thinking is a holistic means of investigating the factors and interactions that could contribute to a potential outcome. It is about thinking non-linearly, and understanding the second-order consequences of actions and input into the system.

Vertical Thinking

vertical-thinking
Vertical thinking, on the other hand, is a problem-solving approach that favors a selective, analytical, structured, and sequential mindset. The focus of vertical thinking is to arrive at a reasoned, defined solution.

Maslow’s Hammer

einstellung-effect
Maslow’s Hammer, otherwise known as the law of the instrument or the Einstellung effect, is a cognitive bias causing an over-reliance on a familiar tool. This can be expressed as the tendency to overuse a known tool (perhaps a hammer) to solve issues that might require a different tool. This problem is persistent in the business world where perhaps known tools or frameworks might be used in the wrong context (like business plans used as planning tools instead of only investors’ pitches).

Peter Principle

peter-principle
The Peter Principle was first described by Canadian sociologist Laurence J. Peter in his 1969 book The Peter Principle. The Peter Principle states that people are continually promoted within an organization until they reach their level of incompetence.

Straw Man Fallacy

straw-man-fallacy
The straw man fallacy describes an argument that misrepresents an opponent’s stance to make rebuttal more convenient. The straw man fallacy is a type of informal logical fallacy, defined as a flaw in the structure of an argument that renders it invalid.

Streisand Effect

streisand-effect
The Streisand Effect is a paradoxical phenomenon where the act of suppressing information to reduce visibility causes it to become more visible. In 2003, Streisand attempted to suppress aerial photographs of her Californian home by suing photographer Kenneth Adelman for an invasion of privacy. Adelman, who Streisand assumed was paparazzi, was instead taking photographs to document and study coastal erosion. In her quest for more privacy, Streisand’s efforts had the opposite effect.

Heuristic

heuristic
As highlighted by German psychologist Gerd Gigerenzer in the paper “Heuristic Decision Making,” the term heuristic is of Greek origin, meaning “serving to find out or discover.” More precisely, a heuristic is a fast and accurate way to make decisions in the real world, which is driven by uncertainty.

Recognition Heuristic

recognition-heuristic
The recognition heuristic is a psychological model of judgment and decision making. It is part of a suite of simple and economical heuristics proposed by psychologists Daniel Goldstein and Gerd Gigerenzer. The recognition heuristic argues that inferences are made about an object based on whether it is recognized or not.

Representativeness Heuristic

representativeness-heuristic
The representativeness heuristic was first described by psychologists Daniel Kahneman and Amos Tversky. The representativeness heuristic judges the probability of an event according to the degree to which that event resembles a broader class. When queried, most people judge by how closely a description matches a stereotype (say, of an archaeologist) rather than by the underlying base rates.

Take-The-Best Heuristic

take-the-best-heuristic
The take-the-best heuristic is a decision-making shortcut that helps an individual choose between several alternatives. The take-the-best (TTB) heuristic decides between two or more alternatives based on a single good attribute, otherwise known as a cue. In the process, less desirable attributes are ignored.

Bundling Bias

bundling-bias
The bundling bias is a cognitive bias in e-commerce where a consumer tends not to use all of the products bought as a group, or bundle. Bundling occurs when individual products or services are sold together as a bundle. Common examples are tickets and experiences. The bundling bias dictates that consumers are less likely to use each item in the bundle. This means that the value of the bundle and indeed the value of each item in the bundle is decreased.

Barnum Effect

barnum-effect
The Barnum Effect is a cognitive bias where individuals believe that generic information – which applies to most people – is specifically tailored for themselves.

First-Principles Thinking

first-principles-thinking
First-principles thinking – sometimes called reasoning from first principles – is used to reverse-engineer complex problems and encourage creativity. It involves breaking down problems into basic elements and reassembling them from the ground up. Elon Musk is among the strongest proponents of this way of thinking.

Ladder Of Inference

ladder-of-inference
The ladder of inference is a conscious or subconscious thinking process where an individual moves from a fact to a decision or action. The ladder of inference was created by academic Chris Argyris to illustrate how people form and then use mental models to make decisions.

Goodhart’s Law

goodharts-law
Goodhart’s Law is named after British monetary policy theorist and economist Charles Goodhart. Speaking at a conference in Sydney in 1975, Goodhart said that “any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.” Goodhart’s Law states that when a measure becomes a target, it ceases to be a good measure.

Six Thinking Hats Model

six-thinking-hats-model
The Six Thinking Hats model was created by psychologist Edward de Bono in 1986, who noted that personality type was a key driver of how people approached problem-solving. For example, optimists view situations differently from pessimists. Analytical individuals may generate ideas that a more emotional person would not, and vice versa.

Mandela Effect

mandela-effect
The Mandela effect is a phenomenon where a large group of people remembers an event differently from how it occurred. The Mandela effect was first described in relation to Fiona Broome, who believed that former South African President Nelson Mandela died in prison during the 1980s. While Mandela was released from prison in 1990 and died 23 years later, Broome remembered news coverage of his death in prison and even a speech from his widow. Of course, neither event occurred in reality. But Broome was later to discover that she was not the only one with the same recollection of events.

Crowding-Out Effect

crowding-out-effect
The crowding-out effect occurs when public sector spending reduces spending in the private sector.

Bandwagon Effect

bandwagon-effect
The bandwagon effect tells us that the more a belief or idea has been adopted by people within a group, the more the individual adoption of that idea might increase within the same group. This is the psychological effect that leads to herd mentality. In marketing, it can be associated with social proof.

Moore’s Law

moores-law
Moore’s law states that the number of transistors on a microchip doubles approximately every two years. This observation was made by Intel co-founder Gordon Moore in 1965, and it became a guiding principle for the semiconductor industry and has had far-reaching implications for technology as a whole.

Disruptive Innovation

disruptive-innovation
Disruptive innovation as a term was first described by Clayton M. Christensen, an American academic and business consultant whom The Economist called “the most influential management thinker of his time.” Disruptive innovation describes the process by which a product or service takes hold at the bottom of a market and eventually displaces established competitors, products, firms, or alliances.

Value Migration

value-migration
Value migration was first described by author Adrian Slywotzky in his 1996 book Value Migration – How to Think Several Moves Ahead of the Competition. Value migration is the transferal of value-creating forces from outdated business models to something better able to satisfy consumer demands.

Bye-Now Effect

bye-now-effect
The bye-now effect describes the tendency for consumers to think of the word “buy” when they read the word “bye”. In a study that tracked diners at a name-your-own-price restaurant, each diner was asked to read one of two phrases before ordering their meal. The first phrase, “so long”, resulted in diners paying an average of $32 per meal. But when diners recited the phrase “bye bye” before ordering, the average price per meal rose to $45.

Groupthink

groupthink
Groupthink occurs when well-intentioned individuals make non-optimal or irrational decisions based on a belief that dissent is impossible or on a motivation to conform. Groupthink occurs when members of a group reach a consensus without critical reasoning or evaluation of the alternatives and their consequences.

Stereotyping

stereotyping
A stereotype is a fixed and over-generalized belief about a particular group or class of people. These beliefs are based on the false assumption that certain characteristics are common to every individual residing in that group. Many stereotypes have a long and sometimes controversial history and are a direct consequence of various political, social, or economic events. Stereotyping is the process of making assumptions about a person or group of people based on various attributes, including gender, race, religion, or physical traits.

Murphy’s Law

murphys-law
Murphy’s Law states that if anything can go wrong, it will go wrong. Murphy’s Law was named after aerospace engineer Edward A. Murphy. During his time working at Edwards Air Force Base in 1949, Murphy cursed a technician who had improperly wired an electrical component and said, “If there is any way to do it wrong, he’ll find it.”

Law of Unintended Consequences

law-of-unintended-consequences
The law of unintended consequences was first mentioned by British philosopher John Locke when writing to parliament about the unintended effects of interest rate rises. However, it was popularized in 1936 by American sociologist Robert K. Merton who looked at unexpected, unanticipated, and unintended consequences and their impact on society.

Fundamental Attribution Error

fundamental-attribution-error
Fundamental attribution error is a bias people display when judging the behavior of others. The tendency is to over-emphasize personal characteristics and under-emphasize environmental and situational factors.

Outcome Bias

outcome-bias
Outcome bias describes a tendency to evaluate a decision based on its outcome and not on the process by which the decision was reached. In other words, the quality of a decision is only determined once the outcome is known. Outcome bias occurs when a decision is based on the outcome of previous events without regard for how those events developed.

Hindsight Bias

hindsight-bias
Hindsight bias is the tendency for people to perceive past events as more predictable than they actually were. The result of a presidential election, for example, seems more obvious when the winner is announced. The same can also be said for the avid sports fan who predicted the correct outcome of a match regardless of whether their team won or lost. Hindsight bias, therefore, is the tendency for an individual to convince themselves that they accurately predicted an event before it happened.

Read Next: Biases, Bounded Rationality, Mandela Effect, Dunning-Kruger Effect, Lindy Effect, Crowding Out Effect, Bandwagon Effect.

