planning-fallacy

Planning Fallacy

The Planning Fallacy refers to the human tendency to be overly optimistic when planning projects, underestimating the time and resources required. It often leads to project delays, disappointment, and resource misallocation. Despite these drawbacks, the optimism behind it can motivate individuals to take on projects they might otherwise avoid. Examples include construction projects running behind schedule and students underestimating how long academic assignments will take.

The Psychology Behind the Planning Fallacy

The planning fallacy can be understood through several psychological factors:

  • Optimism Bias: People tend to be overly optimistic about their own abilities and the outcomes of their endeavors. This bias can lead to underestimating the challenges and complexities involved in a project.
  • Positive Outcome Expectations: Individuals often focus on the potential positive outcomes of a project while downplaying or ignoring potential setbacks or delays. This positive outcome expectation can lead to unrealistic planning.
  • Confirmation Bias: People may seek and give more weight to information that supports their overly optimistic estimates while neglecting contradictory data.
  • Lack of Reference Class: Individuals often fail to consider similar past projects or experiences when making estimates, instead relying on idealized scenarios.

Implications of the Planning Fallacy

The planning fallacy can have significant consequences:

  • Missed Deadlines: Underestimating project timelines can lead to missed deadlines and project delays.
  • Budget Overruns: Unrealistic cost estimates can result in budget overruns, increasing financial strain on projects.
  • Resource Misallocation: Poor planning can lead to the inefficient allocation of resources, both human and financial.
  • Reduced Quality: Rushing to meet unrealistic deadlines may compromise the quality of the final product or result in corner-cutting.

Real-World Examples of the Planning Fallacy

The planning fallacy is evident in various real-world scenarios:

  • Construction Projects: Large-scale construction projects often experience delays and cost overruns due to overly optimistic initial estimates.
  • Software Development: Software development projects frequently face timeline extensions as developers underestimate the time required for coding, debugging, and testing.
  • Public Infrastructure: Public infrastructure projects, such as the construction of bridges or highways, often encounter delays and budget issues.

Mitigating the Planning Fallacy

Efforts to mitigate the planning fallacy can lead to more accurate predictions and better decision-making:

  • Reference Class Forecasting: Use historical data and past experiences from similar projects as a reference class to make more realistic estimates.
  • Expert Input: Consult with experts or individuals who have relevant experience to gain insights into potential challenges and required timelines.
  • Scenario Analysis: Develop multiple scenarios that consider both optimistic and pessimistic outcomes, helping to establish a range of possible project outcomes.
  • External Review: Seek external reviews or third-party assessments of project plans to provide an unbiased perspective.
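Reference class forecasting, the first of these mitigations, can be sketched in a few lines: scale a naive inside-view estimate by the overrun ratios observed in comparable past projects. The data and the 80th-percentile default below are illustrative assumptions, not a prescribed method:

```python
def reference_class_estimate(current_estimate, past_estimates, past_actuals, percentile=80):
    """Scale an inside-view estimate by historical overrun ratios.

    `percentile` controls conservatism: at 80, roughly 80% of the
    comparable past projects finished within the returned figure
    (relative to their own initial estimates).
    """
    ratios = sorted(actual / estimate
                    for estimate, actual in zip(past_estimates, past_actuals))
    # Pick the overrun ratio at the requested percentile of the sorted list.
    index = min(len(ratios) - 1, int(len(ratios) * percentile / 100))
    return current_estimate * ratios[index]

# Hypothetical reference class: five similar past projects (estimated vs. actual days).
past_estimates = [10, 20, 15, 30, 12]
past_actuals = [14, 26, 18, 45, 15]

print(reference_class_estimate(25, past_estimates, past_actuals))  # → 37.5
```

Note that the naive 25-day estimate becomes 37.5 days once the reference class is consulted: the outside view corrects the inside view.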

Examples of the Planning Fallacy:

  • Construction Projects:
    • In construction, it’s common for projects to take longer and cost more than initially estimated. The planning fallacy often leads to delays in project completion and budget overruns. Factors such as weather, unforeseen structural issues, and regulatory approvals can contribute to these delays.
  • Product Development:
    • When companies plan to develop and launch new products, they frequently underestimate the time and resources required. This can result in products being released later than anticipated, missing market opportunities, and incurring additional costs during development.
  • Academic Assignments:
    • Students often underestimate the time needed to complete academic assignments. This leads to last-minute rushes to finish projects, lower-quality work, and increased stress. Despite past experiences, many students continue to make overly optimistic estimates.

Key Highlights of the Planning Fallacy:

  • Over-Optimism: Individuals tend to underestimate the time, effort, and resources needed for tasks or projects.
  • Ignoring Past Experience: The planning fallacy often involves disregarding lessons learned from previous similar projects, assuming the current project will proceed more smoothly.
  • Focus on Best-Case Scenarios: People tend to emphasize the most optimistic outcomes, overlooking potential obstacles and setbacks.
  • Project Delays: The planning fallacy frequently leads to project delays, which can be costly and frustrating.
  • Resource Misallocation: Misallocating resources due to inaccurate planning can hinder the efficient use of time and money.
  • Motivation: While the planning fallacy can lead to problems, its optimism can also motivate individuals to start projects they might otherwise find daunting.

Frameworks Related to the Planning Fallacy

The following frameworks help explain and counter the planning fallacy. For each, a description and guidance on when to apply it:

  • Optimism Bias: A cognitive bias that leads people to underestimate the likelihood of negative events and overestimate positive ones, so they overlook the risks and challenges in their plans. The planning fallacy is closely related: an overly optimistic outlook drives underestimates of the time, resources, and effort a goal requires. Scenario planning, risk analysis, and contingency planning help counteract it. When to apply: in project management, strategic planning, or organizational contexts where forecasts must account for setbacks, delays, and unforeseen obstacles.
  • Reference Class Forecasting: A method for predicting the outcomes of future projects by comparing them to a reference class of similar past projects and using their historical data to estimate likely duration, cost, and outcomes. It counters the planning fallacy by grounding plans in empirical evidence rather than subjective optimism; historical analysis, benchmarking, and expert judgment strengthen it. When to apply: in project management, investment analysis, or policy evaluation where estimates must resist overconfidence and wishful thinking.
  • Scenario Planning: A strategic foresight method that creates and analyzes multiple plausible future scenarios, both optimistic and pessimistic, to inform decision-making. It counters the planning fallacy by forcing consideration of alternative assumptions and uncertainties, supporting more robust strategies and contingency plans. Techniques include scenario workshops, horizon scanning, and cross-impact analysis. When to apply: in strategic planning, risk management, or innovation strategy where organizations must anticipate a range of futures, spot early warning signs, and adjust plans dynamically.
  • Real Options Theory: An approach to decision-making under uncertainty that treats strategic choices as options rather than fixed commitments, letting decision-makers delay, adjust, or abandon investments as new information arrives. It counters the planning fallacy by preserving flexibility and deferring irreversible commitments, reducing the sunk costs of overoptimistic plans. Techniques include option valuation, decision trees, and scenario analysis. When to apply: in investment analysis, capital budgeting, or strategic decisions where uncertainty calls for flexible, staged commitments.
  • Contingency Planning: Preparing alternative courses of action for unexpected events or disruptions that may threaten goals or business continuity. It counters the planning fallacy by acknowledging the uncertainty of the external environment, identifying risks, vulnerabilities, and trigger events, and developing response and recovery plans in advance. Techniques include risk assessment, scenario analysis, and business impact analysis. When to apply: in risk management, business continuity planning, or crisis management where operations must withstand disruptions.
  • Probabilistic Forecasting: Estimating the likelihood of future outcomes using statistical analysis and probability theory, producing probabilistic estimates and confidence intervals rather than deterministic point projections. It counters the planning fallacy by quantifying uncertainty instead of relying on single-point, optimistic assumptions. Techniques include Monte Carlo simulation, Bayesian inference, and sensitivity analysis. When to apply: in forecasting, risk management, or resource allocation where decisions must reflect the full range of possible outcomes.
  • Black Swan Theory: Proposed by Nassim Nicholas Taleb, this describes highly improbable, high-impact events that are often rationalized only after the fact. The planning fallacy is especially vulnerable to black swans because it underestimates extreme outliers and unforeseen disruptions. Recognizing them discourages overreliance on historical data and probabilistic forecasts and encourages adaptive capacity, diversified risk exposure, and margins of safety. When to apply: in strategic planning, investment analysis, or crisis preparedness where rare but consequential events matter.
  • Complexity Theory: Examines systems of many interconnected components with emergent properties and nonlinear dynamics, where simple cause-and-effect reasoning falls short. It counters the planning fallacy by accepting that complex social, economic, and environmental systems cannot be fully predicted or controlled, favoring iterative experimentation, decentralized decision-making, and adaptive governance instead. Techniques include systems thinking, network analysis, and agent-based modeling. When to apply: in strategic planning, policy development, or organizational management within dynamic, unpredictable environments.
  • Adaptive Leadership: A leadership approach that emphasizes responding effectively to changing circumstances by diagnosing adaptive challenges, mobilizing collective action, and facilitating learning and innovation. It counters the planning fallacy by replacing rigid top-down plans with experimentation, collaboration, and continuous adaptation to emergent threats and opportunities. Techniques include sensemaking, distributed decision-making, and adaptive coaching. When to apply: in organizational leadership, change management, or community development under uncertainty and ambiguity.
  • Resilience Engineering: An interdisciplinary approach to safety and risk management that builds adaptive capacity in complex socio-technical systems, focusing on how systems absorb and recover from disruptions rather than solely on preventing failures. It counters the planning fallacy by treating failures as inevitable and working to reduce their frequency, severity, and impact through principles such as redundancy, diversity, and modularity. When to apply: in safety management, risk assessment, or organizational design where systems must withstand unexpected shocks.
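Probabilistic forecasting of a project schedule can be sketched with a small Monte Carlo simulation. The three-point task estimates below are hypothetical, and the triangular distribution is just one convenient choice for turning optimistic/most-likely/pessimistic guesses into a distribution:

```python
import random

def simulate_schedule(tasks, trials=10_000, seed=42):
    """Monte Carlo sketch of total project duration.

    Each task is an (optimistic, most_likely, pessimistic) three-point
    estimate in days; every trial draws one duration per task from a
    triangular distribution and sums them.
    """
    rng = random.Random(seed)  # fixed seed keeps the sketch reproducible
    totals = sorted(
        sum(rng.triangular(low, high, mode) for low, mode, high in tasks)
        for _ in range(trials)
    )

    def percentile(p):
        return totals[int(trials * p / 100)]

    return {"p50": percentile(50), "p80": percentile(80), "p95": percentile(95)}

# Hypothetical three-task project. The "most likely" durations alone sum to
# 21 days, but the simulated percentiles are typically higher because each
# task's distribution is right-skewed, which is the planning fallacy in miniature.
tasks = [(3, 5, 10), (8, 12, 25), (2, 4, 9)]
result = simulate_schedule(tasks)
print(result)
```

Reporting a range (p50/p80/p95) instead of a single number makes the uncertainty explicit and leaves less room for anchoring on the best case.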

Connected Thinking Frameworks

Convergent vs. Divergent Thinking

convergent-vs-divergent-thinking
Convergent thinking occurs when the solution to a problem can be found by applying established rules and logical reasoning. Divergent thinking, by contrast, is an unstructured problem-solving method in which participants are encouraged to develop many innovative ideas or solutions to a given problem. Convergent thinking may work well for larger, mature organizations, whereas divergent thinking is better suited to startups and innovative companies.

Critical Thinking

critical-thinking
Critical thinking involves analyzing observations, facts, evidence, and arguments to form a judgment about what someone reads, hears, says, or writes.

Biases

biases
The concept of cognitive biases was introduced and popularized by the work of Amos Tversky and Daniel Kahneman in 1972. Biases are seen as systematic errors and flaws that make humans deviate from the standards of rationality, thus making us inept at making good decisions under uncertainty.

Second-Order Thinking

second-order-thinking
Second-order thinking is a means of assessing the implications of our decisions by considering future consequences. As a mental model, it considers the range of future possibilities, encouraging individuals to think outside of the box so they can prepare for any eventuality. It also discourages the tendency to default to the most obvious choice.

Lateral Thinking

lateral-thinking
Lateral thinking is a business strategy that involves approaching a problem from a different direction. It attempts to replace traditionally formulaic and routine approaches to problem-solving with creative thinking, finding unconventional ways to solve a known problem. This sort of non-linear approach to problem-solving can, at times, create a big impact.

Bounded Rationality

bounded-rationality
Bounded rationality is a concept attributed to Herbert Simon, an economist and political scientist interested in decision-making and how we make decisions in the real world. In fact, he believed that rather than optimizing (which was the mainstream view in the past decades) humans follow what he called satisficing.

Dunning-Kruger Effect

dunning-kruger-effect
The Dunning-Kruger effect describes a cognitive bias where people with low ability in a task overestimate their ability to perform that task well. Consumers or businesses that do not possess the requisite knowledge make bad decisions. What’s more, knowledge gaps prevent the person or business from seeing their mistakes.

Occam’s Razor

occams-razor
Occam’s Razor states that one should not increase (beyond reason) the number of entities required to explain anything. All things being equal, the simplest solution is often the best one. The principle is attributed to 14th-century English theologian William of Ockham.

Lindy Effect

lindy-effect
The Lindy Effect is a theory about the ageing of non-perishable things, like technology or ideas. Popularized by author Nassim Nicholas Taleb, the Lindy Effect states that the life expectancy of non-perishable things grows with their age: the older an idea or a technology, the longer its remaining life expectancy.

Antifragility

antifragility
Antifragility was first coined as a term by author and options trader Nassim Nicholas Taleb. Antifragility is a characteristic of systems that thrive as a result of stressors, volatility, and randomness. Antifragile is therefore the opposite of fragile. Where a fragile thing breaks under volatility and a robust thing resists it, an antifragile thing gets stronger from volatility (provided the level of stressors and randomness doesn't pass a certain threshold).

Systems Thinking

systems-thinking
Systems thinking is a holistic means of investigating the factors and interactions that could contribute to a potential outcome. It is about thinking non-linearly, and understanding the second-order consequences of actions and input into the system.

Vertical Thinking

vertical-thinking
Vertical thinking is a problem-solving approach that favors a selective, analytical, structured, and sequential mindset. The focus of vertical thinking is to arrive at a reasoned, defined solution.

Maslow’s Hammer

einstellung-effect
Maslow’s Hammer, otherwise known as the law of the instrument or the Einstellung effect, is a cognitive bias causing an over-reliance on a familiar tool. This can be expressed as the tendency to overuse a known tool (perhaps a hammer) to solve issues that might require a different tool. This problem is persistent in the business world where perhaps known tools or frameworks might be used in the wrong context (like business plans used as planning tools instead of only investors’ pitches).

Peter Principle

peter-principle
The Peter Principle was first described by Canadian educator Laurence J. Peter in his 1969 book The Peter Principle. The Peter Principle states that people are continually promoted within an organization until they reach their level of incompetence.

Straw Man Fallacy

straw-man-fallacy
The straw man fallacy describes an argument that misrepresents an opponent’s stance to make rebuttal more convenient. The straw man fallacy is a type of informal logical fallacy, defined as a flaw in the structure of an argument that renders it invalid.

Streisand Effect

streisand-effect
The Streisand Effect is a paradoxical phenomenon where the act of suppressing information to reduce visibility causes it to become more visible. In 2003, Streisand attempted to suppress aerial photographs of her Californian home by suing photographer Kenneth Adelman for an invasion of privacy. Adelman, who Streisand assumed was paparazzi, was instead taking photographs to document and study coastal erosion. In her quest for more privacy, Streisand’s efforts had the opposite effect.

Heuristic

heuristic
As highlighted by German psychologist Gerd Gigerenzer in the paper "Heuristic Decision Making," the term heuristic is of Greek origin, meaning "serving to find out or discover." More precisely, a heuristic is a fast and accurate way to make decisions in a real world driven by uncertainty.

Recognition Heuristic

recognition-heuristic
The recognition heuristic is a psychological model of judgment and decision making. It is part of a suite of simple and economical heuristics proposed by psychologists Daniel Goldstein and Gerd Gigerenzer. The recognition heuristic argues that inferences are made about an object based on whether it is recognized or not.

Representativeness Heuristic

representativeness-heuristic
The representativeness heuristic was first described by psychologists Daniel Kahneman and Amos Tversky. It judges the probability of an event according to the degree to which that event resembles a broader class. For example, when asked whether a person described in stereotypical terms belongs to the stereotyped profession, most people choose the stereotype, even when base rates suggest otherwise.

Take-The-Best Heuristic

take-the-best-heuristic
The take-the-best heuristic is a decision-making shortcut that helps an individual choose between several alternatives. The take-the-best (TTB) heuristic decides between two or more alternatives based on a single good attribute, otherwise known as a cue. In the process, less desirable attributes are ignored.

Bundling Bias

bundling-bias
The bundling bias is a cognitive bias in e-commerce where a consumer tends not to use all of the products bought as a group, or bundle. Bundling occurs when individual products or services are sold together as a bundle. Common examples are tickets and experiences. The bundling bias dictates that consumers are less likely to use each item in the bundle. This means that the value of the bundle and indeed the value of each item in the bundle is decreased.

Barnum Effect

barnum-effect
The Barnum Effect is a cognitive bias where individuals believe that generic information – which applies to most people – is specifically tailored for themselves.

First-Principles Thinking

first-principles-thinking
First-principles thinking – sometimes called reasoning from first principles – is used to reverse-engineer complex problems and encourage creativity. It involves breaking down problems into basic elements and reassembling them from the ground up. Elon Musk is among the strongest proponents of this way of thinking.

Ladder Of Inference

ladder-of-inference
The ladder of inference is a conscious or subconscious thinking process where an individual moves from a fact to a decision or action. The ladder of inference was created by academic Chris Argyris to illustrate how people form and then use mental models to make decisions.

Goodhart’s Law

goodharts-law
Goodhart’s Law is named after British monetary policy theorist and economist Charles Goodhart. Speaking at a conference in Sydney in 1975, Goodhart said that “any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.” Goodhart’s Law states that when a measure becomes a target, it ceases to be a good measure.

Six Thinking Hats Model

six-thinking-hats-model
The Six Thinking Hats model was created by psychologist Edward de Bono in 1986, who noted that personality type was a key driver of how people approached problem-solving. For example, optimists view situations differently from pessimists. Analytical individuals may generate ideas that a more emotional person would not, and vice versa.

Mandela Effect

mandela-effect
The Mandela effect is a phenomenon where a large group of people remembers an event differently from how it occurred. The Mandela effect was first described in relation to Fiona Broome, who believed that former South African President Nelson Mandela died in prison during the 1980s. While Mandela was released from prison in 1990 and died 23 years later, Broome remembered news coverage of his death in prison and even a speech from his widow. Of course, neither event occurred in reality. But Broome was later to discover that she was not the only one with the same recollection of events.

Crowding-Out Effect

crowding-out-effect
The crowding-out effect occurs when public sector spending reduces spending in the private sector.

Bandwagon Effect

bandwagon-effect
The bandwagon effect tells us that the more a belief or idea is adopted by people within a group, the more likely each remaining individual is to adopt it as well. This is the psychological effect that leads to herd mentality. In marketing, it is closely associated with social proof.

Moore’s Law

moores-law
Moore’s law states that the number of transistors on a microchip doubles approximately every two years. The observation was made by Intel co-founder Gordon Moore in 1965, and it has become a guiding principle for the semiconductor industry, with far-reaching implications for technology as a whole.
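The doubling rule lends itself to a quick back-of-the-envelope projection. The starting transistor count below is an assumed figure for illustration, not a reference to any particular chip.

```python
# Back-of-the-envelope illustration of Moore's law: transistor count
# doubling roughly every two years from an assumed starting point.

def projected_transistors(start_count, years, doubling_period=2):
    """Projected transistor count after `years`, assuming a doubling
    every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Assumed example: a chip with 1 million transistors, projected 20 years out.
# 20 years / 2 = 10 doublings, i.e. a 1024x increase.
print(projected_transistors(1_000_000, 20))  # → ~1.02 billion
```

The exponential nature of the rule is what gives it such reach: ten doublings already mean a thousandfold increase.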

Disruptive Innovation

disruptive-innovation
Disruptive innovation as a term was first described by Clayton M. Christensen, an American academic and business consultant whom The Economist called “the most influential management thinker of his time.” Disruptive innovation describes the process by which a product or service takes hold at the bottom of a market and eventually displaces established competitors, products, firms, or alliances.

Value Migration

value-migration
Value migration was first described by author Adrian Slywotzky in his 1996 book Value Migration – How to Think Several Moves Ahead of the Competition. Value migration is the transferal of value-creating forces from outdated business models to something better able to satisfy consumer demands.

Bye-Now Effect

bye-now-effect
The bye-now effect describes the tendency for consumers to think of the word “buy” when they read the word “bye”. In a study that tracked diners at a name-your-own-price restaurant, each diner was asked to read one of two phrases before ordering their meal. The first phrase, “so long”, resulted in diners paying an average of $32 per meal. But when diners recited the phrase “bye bye” before ordering, the average price per meal rose to $45.

Groupthink

groupthink
Groupthink occurs when well-intentioned individuals make non-optimal or irrational decisions based on a belief that dissent is impossible or on a motivation to conform. Groupthink occurs when members of a group reach a consensus without critical reasoning or evaluation of the alternatives and their consequences.

Stereotyping

stereotyping
A stereotype is a fixed and over-generalized belief about a particular group or class of people. These beliefs are based on the false assumption that certain characteristics are common to every individual belonging to that group. Many stereotypes have a long and sometimes controversial history and are a direct consequence of various political, social, or economic events. Stereotyping is the process of making assumptions about a person or group of people based on various attributes, including gender, race, religion, or physical traits.

Murphy’s Law

murphys-law
Murphy’s Law states that if anything can go wrong, it will go wrong. Murphy’s Law was named after aerospace engineer Edward A. Murphy. During his time working at Edwards Air Force Base in 1949, Murphy cursed a technician who had improperly wired an electrical component and said, “If there is any way to do it wrong, he’ll find it.”

Law of Unintended Consequences

law-of-unintended-consequences
The law of unintended consequences was first mentioned by British philosopher John Locke when writing to parliament about the unintended effects of interest rate rises. However, it was popularized in 1936 by American sociologist Robert K. Merton who looked at unexpected, unanticipated, and unintended consequences and their impact on society.

Fundamental Attribution Error

fundamental-attribution-error
Fundamental attribution error is a bias people display when judging the behavior of others. The tendency is to over-emphasize personal characteristics and under-emphasize environmental and situational factors.

Outcome Bias

outcome-bias
Outcome bias describes a tendency to evaluate a decision based on its outcome and not on the process by which the decision was reached. In other words, the quality of a decision is only determined once the outcome is known. Outcome bias occurs when a decision is based on the outcome of previous events without regard for how those events developed.

Hindsight Bias

hindsight-bias
Hindsight bias is the tendency for people to perceive past events as more predictable than they actually were. The result of a presidential election, for example, seems more obvious when the winner is announced. The same can also be said for the avid sports fan who predicted the correct outcome of a match regardless of whether their team won or lost. Hindsight bias, therefore, is the tendency for an individual to convince themselves that they accurately predicted an event before it happened.

Read Next: Biases, Bounded Rationality, Mandela Effect, Dunning-Kruger Effect, Lindy Effect, Crowding Out Effect, Bandwagon Effect.
