
What Is Bounded Rationality And Why It Matters

Bounded rationality is a concept attributed to Herbert Simon, an economist and political scientist interested in decision-making and how we make decisions in the real world. He believed that, rather than optimizing (the mainstream view in past decades), humans follow what he called satisficing.

  • Bounded Rationality: Bounded rationality is a concept in behavioral economics and decision-making theory that recognizes the limitations of human cognitive abilities when making decisions. It suggests that people often simplify complex decisions and use heuristics (mental shortcuts) due to cognitive constraints such as time, information, and cognitive capacity.
  • Herbert A. Simon: The term “bounded rationality” was introduced by Nobel laureate Herbert A. Simon in the 1950s. Simon argued that in real-world decision-making, individuals cannot always make fully rational choices as described in classical economic models, which assume perfect information and unlimited cognitive capabilities. Instead, people operate within their bounded rationality.
  • Satisficing: One of the key concepts of bounded rationality is satisficing, a portmanteau of “satisfy” and “suffice.” Satisficing means that individuals often seek solutions or make choices that are good enough or satisfactory, rather than trying to find the optimal or perfect solution. This helps save time and mental effort (a short code sketch follows this list).
  • Limited Information: Bounded rationality acknowledges that individuals have limited access to information and may not be aware of all available options or consequences when making decisions. As a result, they rely on partial information and make judgments based on what they know.
  • Heuristics: To cope with complex decisions, people use heuristics, which are mental shortcuts or rules of thumb. Heuristics simplify decision-making by allowing individuals to quickly assess and choose among options. Examples include the availability heuristic, where people rely on readily available information, and the anchoring and adjustment heuristic, where they start from a reference point and adjust.
  • Bounded Willpower: Bounded rationality also extends to self-control and willpower. Individuals may have limited willpower to resist temptations or make disciplined choices. This concept is relevant in behavioral economics, where understanding the limits of self-control is crucial in designing strategies for personal and societal improvement.
  • Biases and Errors: Bounded rationality can lead to cognitive biases and decision-making errors. Individuals may fall prey to confirmation bias, overconfidence, and other biases when processing information or making judgments. These biases can result in suboptimal decisions.
  • Behavioral Economics: Bounded rationality is a central concept in behavioral economics, a field that combines insights from psychology and economics to study how individuals deviate from perfectly rational behavior. Behavioral economics helps explain real-world decisions and has implications for policy, marketing, and finance.
  • Adaptive Strategies: While bounded rationality acknowledges human limitations, it also recognizes that individuals develop adaptive strategies to make reasonably good decisions in their specific environments. These strategies may vary depending on the context and the available resources.
  • Implications: Understanding bounded rationality has implications for designing decision support systems, public policies, and marketing strategies. It emphasizes the importance of providing clear and simple information, reducing cognitive load, and aligning choices with individuals’ cognitive capabilities. It also highlights the need for nudges and interventions to improve decision-making.
  • Key Takeaway: Bounded rationality is a concept that recognizes the inherent cognitive limitations of individuals when making decisions. It highlights the importance of simplicity, heuristics, and adaptability in navigating complex choices and has significant applications in the fields of economics, psychology, and decision science.
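To make satisficing concrete, here is a minimal Python sketch contrasting a satisficer, who stops at the first “good enough” option, with an optimizer, who has to score every alternative. The option names, scores, and aspiration threshold are invented for illustration.

```python
# Minimal sketch of satisficing vs. optimizing.
# Option names, scores, and the aspiration threshold are illustrative assumptions.

def satisfice(options, aspiration):
    """Return the first option whose score meets the aspiration level."""
    for name, score in options:
        if score >= aspiration:
            return name
    return None  # nothing was "good enough"

def optimize(options):
    """Return the highest-scoring option (requires evaluating everything)."""
    return max(options, key=lambda o: o[1])[0]

options = [("apartment A", 6.5), ("apartment B", 7.2), ("apartment C", 9.1)]

print(satisfice(options, aspiration=7))  # "apartment B": search stops early
print(optimize(options))                 # "apartment C": needs the full search
```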

A quick intro to bounded rationality

Many models, especially in economic theory and the social sciences, still rely on unbounded rationality to make predictions about human behavior. Those models have proved wholly ineffective, because they do not reflect the real world.

In the last decade, cognitive theories that look at humans as a bunch of flawed beings who, due to their biological limitations, commit a series of errors (the so-called biases) have taken over.

I have supported this view on this blog. However, what might seem like biases are, on a more in-depth look, often forms of unconscious rationality (what we call gut feelings) that help us survive in the real (uncertain) world.

Bounded rationality is a framework that, I argue, proves far more robust than any other. That is why it makes sense to look at it to understand what bounded rationality really means.

Bounded rationality – more than a theory, it is a warning to economists and social scientists – can be summarised as the study of how people make decisions in an uncertain world. As pointed out by Gerd Gigerenzer, there are at least three meanings attributed to bounded rationality:

  • optimization: there are constraints in the outside world that don’t allow us to get all the data available
  • biases and errors: there are constraints in our memory and cognitive limitations that limit our decision-making ability
  • bounded rationality: how do people make decisions when optimization is out of reach.

The first two don’t admit the existence of an uncertain world. Why? When you study decision-making under risk, the assumption is that we live in a certain world where, given all the data available, we can compute that risk.

This is what economists like to call optimization under constraints. It holds only in a small world, where everything can be calculated.

The second assumes that due to our limited cognitive abilities we deviate from solving problems accurately, thus we fall into biases and cognitive errors.

While the first emphasizes rationality, the second focuses on irrationality.

The third concept, which is what bounded rationality is really about, was elaborated by Herbert Simon.

He asked the question, “how do people make decisions when optimization is out of reach?” In short, how do people make decisions in an uncertain world?

There are a few things to take into account when thinking about bounded rationality:

We don’t live in a small world

In a small world, given enough data, we can compute the consequence of many actions and behaviors.

In the real world, risk cannot be known or modeled

In many disciplines, especially economics and finance at the academic level, the assessment of risk is central.

However, what we call risk implies something that can be computed. In fact, the financial toolbox contains many measures of risk.

However, those are often worthless, since they start from the assumption that given enough data you can put a precise number on the risk you’re undertaking.

However, that is not the case. In the real world, there are hidden variables that can never be taken into account, even if you have zillions of data points.
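As an illustration of the kind of precise number the financial toolbox produces, here is a hedged Python sketch of historical Value at Risk. The return series is simulated; the point is only that the calculation is exact while the assumption behind it (that the future is drawn from the same distribution as the observed past) is exactly what an uncertain world does not guarantee.

```python
# Hedged sketch: historical Value at Risk (VaR) on a simulated return series.
# The numbers are invented; the calculation is exact, but it silently assumes
# tomorrow's returns come from the same distribution as the observed past.

import numpy as np

rng = np.random.default_rng(0)
past_returns = rng.normal(loc=0.0005, scale=0.01, size=1000)  # simulated daily returns

confidence = 0.95
var_95 = -np.percentile(past_returns, 100 * (1 - confidence))  # loss at the 5th percentile

print(f"1-day 95% historical VaR: {var_95:.2%} of portfolio value")
```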

Optimization is not bounded rationality

Many confuse optimization for bounded rationality. They are opposite concepts. Optimization starts from the assumption that we live in a small world where you can compute risk.

Bounded rationality starts from the assumption that we live in an uncertain world where we can’t assess risk. That is why we have a toolset of heuristics that work more accurately than complicated models in the real world.

Biases are not errors but heuristics that work in most cases to make us avoid screw-ups

In short, heuristics, rather than being shortcuts that are fast but inaccurate, are quick, effective, and in most cases more accurate than other forms of decision-making in the real world.

Satisficing: Look at the one good reason

In an uncertain world, in many cases it pays to ignore most of the information and look for the one good reason that makes the decision work best.
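Here is a minimal sketch of this “one good reason” style of rule, in the spirit of Gigerenzer’s take-the-best heuristic; the cue names, their ordering, and the option values below are invented for illustration.

```python
# "One good reason" decision rule (take-the-best style sketch).
# Cues are checked in order of how much we trust them; the first cue that
# discriminates between the options decides, and everything else is ignored.

cues = ["has_repeat_customers", "is_profitable", "has_large_team"]  # most trusted first

option_a = {"has_repeat_customers": True,  "is_profitable": False, "has_large_team": True}
option_b = {"has_repeat_customers": False, "is_profitable": True,  "has_large_team": True}

def one_good_reason(a, b, ordered_cues):
    """Pick the option favored by the first discriminating cue; ignore the rest."""
    for cue in ordered_cues:
        if a[cue] != b[cue]:
            return "A" if a[cue] else "B"
    return "tie"  # no cue discriminates

print(one_good_reason(option_a, option_b, cues))  # "A": decided on a single cue
```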

Survival is rationality in the real world

Put in this form, rationality is not a matter of beautiful mathematical models; it is about survival. What survives might then be called rational.

Kahneman’s error

The whole behavioral school of thought today is mostly based on Kahneman’s and Tversky’s work on heuristics and biases. 

Kahneman and Tversky are two pillars of modern behavioral economics, and indeed those that most of all have influenced policies in the field. 

There is a core issue underlying the Kahneman and Tversky definition of bias and heuristics. 

In the world of Herbert Simon, heuristics are seen as very effective shortcuts (actually working much better than other, more complex models of the real world) that help humans successfully deal with the context they are in.

In Kahneman’s view, heuristics mostly lead to biases or errors of understanding of the real world. 

This negative view of human psychology led Kahneman to formulate a whole list of biases or errors that humans supposedly make. Yet, as it turned out, rather than these being errors, it was these academics’ definition of the real world that was wrong.

In other words, most experiments produced a long list of psychological errors, almost as if a human were nothing but a collection of misconceptions about the real world; it turned out those experiments were manufacturing an artificial context that does not exist in the real world.

For instance, if you take a bias like loss aversion, used as one of the many examples of human biases, you realize that it has been tested as if humans had an unlimited ability to take losses.

Instead, more contextual models of the world, like ergodicity, show us that humans are highly contextual creatures (this is what Herbert Simon meant by bounded rationality) who act according to the fact that we do not have unlimited lives.

This simple fact was missed by most behavioral psychology research of the last two decades, and it led to a whole series of mistakes.

Source: Nassim Nicholas Taleb, The Logic of Risk Taking

As you can see above, we live in a world where each of us is constrained by time probability: if you take too many risks, you go broke, and that will affect your whole life.

Instead, behavioral psychologists, when testing some human biases, tested them as if each of us had ensemble probability (in short, as if there were no time dependence), as if we lived in a simulated world with many parallel lives.
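A short simulation can make the difference concrete. The gamble below is invented for illustration: each round multiplies wealth by 1.5 or 0.6 with equal probability, so the average across many parallel players looks attractive even though a single player who repeats the gamble through time is almost surely ruined.

```python
# Ensemble probability vs. time probability, with an invented multiplicative gamble.
# Each round multiplies wealth by 1.5 or 0.6 with equal probability
# (expected multiplier 1.05 > 1, yet the per-round geometric mean is about 0.95 < 1).

import numpy as np

rng = np.random.default_rng(42)
multipliers = np.array([1.5, 0.6])

# Ensemble view: many players, one round each.
one_round = rng.choice(multipliers, size=100_000)
print("ensemble average after one round:", one_round.mean())          # ~1.05

# Time view: one player, many rounds in sequence (losses compound).
one_player = rng.choice(multipliers, size=1_000)
print("one player's wealth after 1,000 rounds:", one_player.prod())   # ~0 (ruin)
print("per-round growth (geometric mean):", one_player.prod() ** (1 / 1_000))
```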

That turned into a major crisis in behavioral economics, whose foundations have been shaken by the fact that most of these experiments could not be replicated.

And the whole school of thought that sees heuristics primarily as an avenue to biases, together with the nudging school (the idea that you can influence people to do things by leveraging those biases), has been shaken to its foundations.

Bounded rationality and Artificial Intelligence

We’re making the same mistake now, with the development of new technologies, like artificial intelligence. 

Here too, many academics and practitioners in the field act as if a human were just a set of tasks, not considering that there are many more facets of being human that science doesn’t grasp yet (or perhaps might never grasp).

This leads to a dystopian view of the world, one where AI can take over from humans any time soon and where Artificial General Intelligence is possible.

Instead, it’s critical to recognize the huge limitations AI has as of now, starting with the fact that it’s not conscious at all and that it works in a completely different way than humans do.

Humans can adapt to many contexts that are ambiguous and noisy, where the information about the problem at hand is highly conflicting.

AI, instead, thrives in a narrow, highly controlled context, where we give it a clear definition of the problem.

If we realize that, we can move to a human-in-the-loop AI approach, where humans can focus on designing the proper context for AI to thrive. 

But it’s the human that defines what problems are worth solving, what context it makes sense to have the AI operate within, and sets the boundaries and guardrails for that. 

That’s a critical point to take into account for the future development of AI; otherwise, the risk is placing too much confidence in machines, which will leave us worse off.


Books to read to enhance your bounded rationality

With technological advancements, there is more and more information available at a cheaper cost (actually, information nowadays is practically free). Technology also gives us the impression that we live in a world we can control.

All it takes, supposedly, is enough information and we’ll be successful in business. That is why you feel you need to have the latest news and the newest gadget, and to follow the latest trend.

This kind of approach can lead you astray! As you get access to more and more information, the noise also increases exponentially.

Thus, rather than getting better at making decisions, you become far worse at it, with an even worse consequence: you’re not aware of it. The fact that you have a lot of data makes you believe that you know best.

Therefore, I believe there are three aspects to take into account in the modern, seemingly fast-changing world:

  • have at your disposal a simple yet effective toolset for decision-making in the real world
  • develop the ability to ignore information that isn’t needed
  • know when to trust your gut feelings rather than relying on complex models

In this respect, three books can help you with that. Two are from Gerd Gigerenzer, a German psychologist who has studied bounded rationality and heuristics in decision-making. The third is from Nassim Nicholas Taleb, author of The Black Swan and the Incerto book series.

Risk Savvy: How to Make Good Decisions

In the past century, the leap forward for humanity was teaching most of us how to read and write. That was enough in a world where information was still scarce.

Nowadays, with the advent of social media and the increasing speed of the internet, there is another tool that anyone has to master to survive: statistical thinking.

Risk Savvy helps you build the toolbox to become a better statistical thinker, and to ask better questions that allow you to navigate the noise of the modern world.


Gut Feelings: The Intelligence of the Unconscious

This book is an excellent introduction to the concept of bounded rationality and heuristics. It is also a fresh perspective on decision-making: where the currently prevailing cognitive psychological theories focus on our biases and cognitive errors, this book focuses on why those heuristics instead make a lot of sense.

In fact, gut feelings are viewed quite skeptically in the world of academia and corporations, where big words are treated with more respect. This book shows you why gut feelings matter in business as in life.


Skin in the Game: Hidden Asymmetries in Daily Life

Skin in the Game is the bible for understanding how to get along in a world that is full of hidden asymmetries.


Bounded Rationality Examples in Business

Jeff Bezos is one of the businesspeople who, throughout his career as an entrepreneur building Amazon from scratch, has leveraged various mental frameworks very close to the concept of bounded rationality.

Indeed, he understood the difference between linear and non-linear thinking and how intuition, driven by bounded rationality, could be used to create breakthroughs for Amazon.

Let’s explore some of these examples.

Regret Minimization Framework

The regret minimization framework is a business heuristic that enables you to make a decision by projecting yourself into the future, at an old age, and visualizing whether the regret of missing an opportunity would haunt you more than having taken the opportunity and failed. In short, if taking action and failing feels much better, in the long run, than regretting it, that is when you’re ready to go!

As the story goes, when Jeff Bezos had to decide whether to leave his well-paid job and consolidated position on Wall Street to start a venture on the nascent Internet, he didn’t use spreadsheets or complicated mental equations.

Quite the opposite, he cut through the noise by using a mental model which is called regret minimization.

In short, he imagined himself as an old man at the end of his career, looking back at his life in the grand scheme of things.

And with that visualization, he imagined he would have regretted not having tried to start what would later become Amazon.

The regret minimization framework is extremely powerful because it is a via negativa framework. In other words, it tries to avoid having major regrets by taking a long-term vision.

Indeed, balancing long-term with short-term decision-making is probably one of the most complex human endeavors.

Day One Mindset

In a letter to shareholders in 2016, Jeff Bezos addressed a topic he had been thinking about quite profoundly in the last decades as he led Amazon: Day 1. As Jeff Bezos put it, “Day 2 is stasis. Followed by irrelevance. Followed by an excruciating, painful decline. Followed by death. And that is why it is always Day 1.”

Another mental framework, in the bounded rationality domain, was the use of the “day one” mindset within Amazon.

This is another bounded rationality approach because it helps cut through the decision-making process in very uncertain times.

When Amazon finds itself at an important turn of events, this mindset determines how the company will look in the long term.

Day One helped the company stay on track toward its long-term vision. The Day One mindset is about keeping a startup mindset as the company grows.

Customer Obsession

In the Amazon Shareholders’ Letter for 2018, Jeff Bezos analyzed the Amazon business model and also focused on a few key lessons that Amazon as a company had learned over the years. These lessons are fundamental for any entrepreneur of a small or large organization to understand the pitfalls to avoid when running a successful company!

Customer obsession has been Amazon’s driving principle since the onset. In a tech company like Amazon, which leveraged data to improve its operations, customer obsession helped the company keep its feet on the ground, always returning to a bottom-up innovation approach in which you focus on customers to build a successful business.

This bounded rationality mental model critically helped Amazon maintain its focus while scaling up.

Working Backwards

The Amazon Working Backwards Method is a product development methodology that advocates building a product based on customer needs. The Amazon Working Backwards Method gained traction after notable Amazon employee Ian McAllister shared the company’s product development approach on Quora. McAllister noted that the method seeks “to work backwards from the customer, rather than starting with an idea for a product and trying to bolt customers onto it.”

The Working Backwards Method has been critical at Amazon as a product development methodology in which you focus on customer needs.

In a tech-driven world, it’s very easy to fall into the “innovator’s bias” or the trap of considering the technical solution as the priority over solving a concrete business need.

A working backwards framework helps avoid exactly that trap. It simplifies the development process of a product by keeping the customer in mind.

The Flywheel

The Amazon Flywheel or Amazon Virtuous Cycle is a strategy that leverages customer experience to drive traffic to the platform and third-party sellers. That improves the selection of goods, and Amazon further improves its cost structure so it can decrease prices, which spins the flywheel further.

In a digital world driven by network effects, moving from sales funnels to flywheels has been another critical shift in mindset.

Amazon has led the way there.

The Amazon Flywheel was a mental model where Amazon could build momentum into its business by enabling the compounding of growth over time as it kept building demand for its products.

This is one of the most powerful mental models of the digital business world.

Bounded Rationality: the 2008 financial crisis case study

Countrywide Financial

In the early 2000s, the US housing market experienced a period of rapid growth and increased demand for mortgage-backed security (MBS) investments. 

One company that capitalized on the trend was Countrywide Financial, a lender that originated and sold thousands of these securities to investors.

Countrywide was also a stock market darling, with a 23,000% increase in share price between 1982 and 2003, the best of any Fortune 500 firm over that period.

But in the years preceding the 2008 GFC, Countrywide’s lending practices became increasingly risky and aggressive.

Executives ignored warning signs about the housing market and continued to approve loans to borrowers who could not meet their repayments.

Sub-prime mortgages

The company’s business model was built on the approval of subprime mortgages to borrowers who did not meet the traditional criteria for obtaining a loan. 

Lending practices were based on the assumption that house prices would continue to rise, which, in theory, would enable borrowers to refinance or sell their homes before their mortgages reset to higher interest rates. 

This strategy was based on the heuristic that housing prices would always increase, and therefore, borrowers would always have the means to repay their loans. 

However, Countrywide Financial did not consider a downturn in the housing market or indeed the potential that borrowers would default on their loans.

Countrywide’s lack of consideration for potential risks demonstrates bounded rationality. The company’s decision-making was hampered by its incomplete understanding of the market and reliance on heuristics rather than a detailed analysis of potential outcomes.

Meanwhile, and in a similar vein, investors continued to pay more for MBSs than was warranted. They also failed to properly quantify risk and instead relied on triple-A credit ratings.

Reckless lending practices

The company’s lending practices were influenced by competitive pressures to increase market share and the incentive to generate higher profits through the securitization of mortgages. 

Countrywide increased its commissioned sales force by 60% in 2003 to facilitate this growth-at-all-costs mindset. Employees were incentivized to undertake riskier transactions, and since there was no limit to the commissions on offer, many staff became millionaires.

Borrowers were approved for loans with a 0% deposit, while others were approved without their income properly verified. In the pursuit of profit, sub-prime loans were also offered to consumers who easily qualified for prime loans.

In a later interview, former Countrywide president David Sambol described the extent of the company’s predatory tactics: “We had reached a point where the question was, ‘What will we do next – pay borrowers to take loans?'”

These external pressures further constrained the company’s decision-making and limited its ability to consider the potential long-term consequences of its lending practices.

Upper management

In addition to their carte blanche approval of the company’s lending practices, upper management was also incentivized to make decisions based on short-term performance metrics. 

That is, executives were motivated by loan volume and profitability, not by the quality of the loans or the long-term sustainability of Countrywide’s business model.

The focus on short-term gains further constrained the company’s decision-making and limited its ability to make rational decisions that may have enabled it to survive the GFC.

Case Studies

  • Marketing and Consumer Behavior: Marketers often rely on consumers’ bounded rationality. For instance, they use tactics like decoy pricing, where an additional option is presented to make another option seem more appealing. They also understand that consumers often use heuristics or mental shortcuts (e.g., brand reputation) to make purchasing decisions, especially when overwhelmed with choices.
    • Example: A consumer might buy a popular branded soap, not because they’ve analyzed all soap ingredients and benefits, but because they trust the brand.
  • Organizational Decision-Making: Companies can’t analyze all available data for every decision due to time and resource constraints. They often rely on rules of thumb, past experiences, and simplified models to make decisions.
    • Example: A company might decide to enter a new market not because they’ve analyzed every potential outcome, but because it aligns with their broader strategic goals and past successes.
  • Product Design and User Experience: Designers understand that users don’t always make optimal choices when interacting with products or websites. They use principles of bounded rationality to design intuitive interfaces that guide user behavior.
    • Example: A software application might have a default setting that suits most users, knowing that many won’t take the time to explore all available options.
  • Strategic Planning: When formulating strategy, businesses have to make assumptions about the future, which is inherently uncertain. They make educated guesses based on available information and past trends, even though they can’t predict the future with certainty.
    • Example: A company might decide to invest in renewable energy technology based on current trends in regulations and public sentiment, even if the long-term profitability of that decision is uncertain.
  • Negotiations: In business negotiations, parties often have to make decisions with incomplete information. They might use heuristics, such as focusing on a few key points or basing decisions on past negotiations, to arrive at an agreement.
    • Example: In a business deal, a company might accept terms that are “good enough” rather than holding out for the best possible deal to ensure a timely agreement.
  • Hiring and Talent Management: HR professionals and managers don’t have perfect information when making hiring or promotion decisions. They rely on CVs, interviews, and references, all of which provide incomplete pictures of a candidate’s potential.
    • Example: An employer might hire a candidate based on a strong reference from a trusted colleague, even if the candidate’s interview performance was not optimal.
  • Supply Chain and Logistics: Businesses must make decisions about inventory, distribution, and production with incomplete data about future demand, potential disruptions, and other factors.
    • Example: A retailer might stock up on umbrellas based on a weather forecast, even if they can’t be sure when or how much rain will come.

Key Highlights

  • Bounded Rationality Concept:
    • Recognizes limitations in human decision-making.
    • Challenges traditional unbounded rationality assumptions.
    • Acknowledges cognitive limitations and reliance on heuristics.
  • Satisficing:
    • Coined by Herbert Simon.
    • Refers to choosing “good enough” solutions over optimal ones.
    • Aims for practical and manageable outcomes rather than exhaustive searches.
  • Unbounded Rationality Perspectives:
    • Optimization under Constraints:
      • External limitations prevent gathering all data for optimal decisions.
    • Biases and Cognitive Errors:
      • Cognitive processes are imperfect and prone to biases.
      • Biases may be adaptive shortcuts in coping with complexity.
    • Bounded Rationality in an Uncertain World:
      • Questions decision-making in unpredictable environments.
  • Decision-Making Challenges:
    • Real-world uncertainty and hidden variables.
    • Risk cannot always be precisely known or modeled.
    • Ambiguous contexts and conflicting information.
  • Heuristics:
    • Effective mental shortcuts.
    • Aid decision-making under cognitive constraints and uncertainty.
    • Lead to quick, effective, and accurate choices.
  • Rationality in an Uncertain World:
    • About survival and thriving.
    • Adaptive behaviors in complex environments.
  • Bounded Rationality in Business:
    • Jeff Bezos’ Strategies:
      • Regret Minimization Framework: Making decisions based on future regrets.
      • Customer Obsession: Focus on customer needs for innovation.
      • Day One Mindset: Maintaining a startup mindset as the company grows.
      • Working Backwards Method: Building products based on customer needs.
      • The Amazon Flywheel: Leveraging customer experience to drive growth.
  • 2008 Financial Crisis Case Study – Countrywide Financial:
    • Ignored warning signs about the housing market.
    • Relied on the assumption of perpetual housing price increases.
    • Risky and aggressive lending practices.
    • Focus on short-term profits over long-term sustainability.
    • Upper management incentivized short-term performance metrics.
    • Predatory lending practices with little consideration of borrower qualifications.

Additional Case Studies

  1. Limited Information Processing: Bounded rationality acknowledges that individuals have limited cognitive resources to process information and make decisions. Case study: In investment decisions, investors often rely on heuristics or simplified rules of thumb, such as past performance, due to the complexity of financial markets.
  2. Satisficing Rather than Optimizing: People tend to satisfice, meaning they choose options that are “good enough” rather than seeking the optimal solution. Case study: When choosing a restaurant for dinner, individuals may select the first one that meets their basic criteria, rather than conducting an exhaustive search for the absolute best restaurant.
  3. Bounded Search for Alternatives: Decision-makers may limit their search for alternatives, considering only a subset of available options. Case study: When job hunting, individuals may apply to a few positions that appear suitable rather than exploring the full range of job openings.
  4. Rule of Thumb Decision-Making: Bounded rationality involves using simplified decision rules or heuristics to make choices quickly. Case study: When shopping for groceries, shoppers may opt for familiar brands or items on sale, relying on heuristics to simplify the decision-making process.
  5. Context-Dependent Decisions: Decisions are influenced by the context in which they are made, leading to different choices in varying situations. Case study: A consumer may choose a luxury car in one context (e.g., for a special occasion) but opt for a budget-friendly car in another (e.g., for daily commuting).
  6. Cognitive Biases: Bounded rationality recognizes the presence of cognitive biases that affect decision-making, such as confirmation bias and overconfidence. Case study: Investors may exhibit overconfidence bias, believing they can consistently beat the market, leading to riskier investment decisions.
  7. Use of Decision Rules: People often employ decision rules, such as “buy low, sell high” in investing, to simplify complex choices. Case study: Traders in financial markets may use the “stop-loss” rule to limit losses by automatically selling an asset if its price falls below a certain threshold.
  8. Sensitive to Framing Effects: The way information is presented can significantly impact decisions, making individuals sensitive to framing effects. Case study: A discount framed as “10% off” may be more appealing than the same discount framed as “a $10 discount” on a product.
  9. Bounded Self-Control: Individuals may struggle with self-control and make impulsive decisions due to limited willpower. Case study: Shoppers may succumb to impulse purchases when confronted with in-store displays of tempting items at the checkout counter.
  10. Bounded Time and Effort: Decision-makers allocate limited time and effort to complex decisions, potentially leading to suboptimal outcomes. Case study: When choosing a retirement plan, individuals may select the default option offered by their employer rather than investing time in a thorough evaluation of alternatives.
Related Concepts

  • Bounded Rationality: Bounded Rationality suggests that decision-making is constrained by factors like cognitive limitations and incomplete information. Instead of fully rational choices, individuals often rely on simplified strategies or heuristics. When to apply: when understanding decision-making in situations with limited time or data; when designing systems accommodating cognitive constraints; when developing decision-support tools for more effective choices; when analyzing past decisions to enhance future strategies.
  • Satisficing: Satisficing is a decision-making strategy where individuals aim for a satisfactory or “good enough” outcome rather than an optimal one. It involves accepting the first solution that meets minimum requirements, rather than exhaustively searching for the best option. When to apply: when making decisions in complex or time-sensitive situations; when resources or information are limited and finding the best solution is impractical; when prioritizing efficiency and pragmatism over perfection in decision-making; when managing multiple competing priorities and trade-offs.
  • Heuristics: Heuristics are mental shortcuts or rules of thumb that simplify decision-making by providing efficient strategies for problem-solving. They allow individuals to make quick judgments or estimates based on limited information, often leading to satisfactory outcomes but occasionally resulting in biases or errors. When to apply: when making decisions under time pressure or with limited information; when faced with complex problems that require simplification for practical resolution; when navigating uncertain or ambiguous situations where definitive solutions are unavailable; when optimizing decision-making processes for efficiency and effectiveness.
  • Cognitive Biases: Cognitive Biases are systematic patterns of deviation from rationality in judgment or decision-making, often influenced by subjective factors such as personal experiences, beliefs, or emotions. They can lead to errors in reasoning, perception, or interpretation of information, impacting individual and collective decision-making processes. When to apply: when analyzing information or making decisions, being aware of common cognitive biases that may affect judgment or distort perceptions; when implementing frameworks or processes that mitigate the influence of cognitive biases through checks and balances; when seeking diverse perspectives or independent review to challenge assumptions; when training individuals or teams to recognize and address cognitive biases in critical thinking, problem-solving, and strategic planning.
  • Decision Trees: Decision Trees are visual representations of decision-making processes, where each node represents a decision point and branches represent possible outcomes or choices. They are useful for structuring complex decisions, evaluating alternatives, and assessing potential risks and rewards associated with different courses of action. When to apply: when analyzing complex decision-making scenarios with multiple variables or factors; when evaluating the potential outcomes and trade-offs of different choices or strategies; when communicating decision-making processes and rationale to stakeholders in a clear and structured manner; when training individuals or teams on decision-making techniques and methodologies.
  • Rational Choice Theory: Rational Choice Theory posits that individuals make decisions by weighing the costs and benefits of available options and choosing the one that maximizes utility or satisfaction. It assumes that individuals are rational actors who seek to optimize outcomes based on their preferences and constraints. When to apply: when modeling individual or organizational decision-making based on utility-maximizing principles; when analyzing the factors influencing decision-making and predicting behavior in various contexts; when designing incentives or policies to align individual choices with desired outcomes; when evaluating the rationality of decision-making in economic, social, or political systems.
  • Behavioral Economics: Behavioral Economics integrates insights from psychology and economics to understand how cognitive biases and irrational behaviors influence economic decision-making. It explores phenomena such as loss aversion, framing effects, and social preferences to develop more accurate models of human behavior in economic contexts. When to apply: when analyzing consumer behavior and market dynamics to inform marketing strategies and pricing policies; when designing interventions or policies to nudge individuals toward more desirable behaviors or outcomes; when studying decision-making in financial markets and assessing the impact of investor sentiment on asset prices; when incorporating psychological insights into economic models to improve predictive accuracy and policy effectiveness.
  • Prospect Theory: Prospect Theory is a behavioral economic theory that describes how individuals evaluate and choose between risky or uncertain options. It suggests that people’s decisions are influenced by perceived gains and losses relative to a reference point, rather than objective probabilities, and highlights phenomena such as loss aversion and the asymmetry of risk preferences. When to apply: when analyzing decision-making under risk or uncertainty in financial, investment, or insurance contexts; when designing marketing campaigns or pricing strategies that leverage insights from prospect theory to influence consumer behavior; when assessing individual or organizational risk attitudes in strategic decision-making; when developing decision-support tools that account for subjective perceptions of gains and losses.
  • Adaptive Decision-Making: Adaptive Decision-Making involves continuously updating and revising decisions based on feedback, changing circumstances, or new information. It emphasizes flexibility, learning, and resilience in decision-making processes, allowing individuals or organizations to adjust strategies and tactics in response to evolving conditions. When to apply: when operating in dynamic or uncertain environments where conditions may change rapidly or unpredictably; when managing projects that require iterative decision-making and course corrections based on real-time feedback; when fostering a culture of experimentation and innovation that encourages risk-taking and learning from failures; when developing frameworks or protocols that promote agility and responsiveness to changing market conditions.
  • Crisis Decision-Making: Crisis Decision-Making involves making high-stakes decisions under extreme time pressure and uncertainty, often in response to emergencies or unexpected events. It requires rapid analysis, prioritization, and action to mitigate risks and minimize negative consequences, and relies on clear communication, collaboration, and the ability to adapt to rapidly evolving situations. When to apply: when preparing for or responding to crisis situations such as natural disasters, security breaches, or public health emergencies; when establishing crisis management protocols, roles, and responsibilities to facilitate coordinated response efforts; when conducting crisis simulations or tabletop exercises to test decision-making capabilities and identify areas for improvement; when debriefing after crisis events to assess decision-making effectiveness.

Connected Thinking Frameworks

Convergent vs. Divergent Thinking

Convergent thinking occurs when the solution to a problem can be found by applying established rules and logical reasoning. Divergent thinking, by contrast, is an unstructured problem-solving method in which participants are encouraged to develop many innovative ideas or solutions to a given problem. Convergent thinking might work better for larger, mature organizations, while divergent thinking is more suited to startups and innovative companies.

Critical Thinking

Critical thinking involves analyzing observations, facts, evidence, and arguments to form a judgment about what someone reads, hears, says, or writes.

Systems Thinking

Systems thinking is a holistic means of investigating the factors and interactions that could contribute to a potential outcome. It is about thinking non-linearly, and understanding the second-order consequences of actions and input into the system.

Vertical Thinking

Vertical thinking is a problem-solving approach that favors a selective, analytical, structured, and sequential mindset. The focus of vertical thinking is to arrive at a reasoned, defined solution.

Maslow’s Hammer

Maslow’s Hammer, otherwise known as the law of the instrument or the Einstellung effect, is a cognitive bias causing an over-reliance on a familiar tool. This can be expressed as the tendency to overuse a known tool (perhaps a hammer) to solve issues that might require a different tool. This problem is persistent in the business world, where known tools or frameworks might be used in the wrong context (like business plans used as planning tools instead of just as investor pitches).

Peter Principle

The Peter Principle was first described by Canadian sociologist Lawrence J. Peter in his 1969 book The Peter Principle. The Peter Principle states that people are continually promoted within an organization until they reach their level of incompetence.

Straw Man Fallacy

The straw man fallacy describes an argument that misrepresents an opponent’s stance to make rebuttal more convenient. The straw man fallacy is a type of informal logical fallacy, defined as a flaw in the structure of an argument that renders it invalid.

Streisand Effect

The Streisand Effect is a paradoxical phenomenon where the act of suppressing information to reduce visibility causes it to become more visible. In 2003, Streisand attempted to suppress aerial photographs of her Californian home by suing photographer Kenneth Adelman for an invasion of privacy. Adelman, who Streisand assumed was paparazzi, was instead taking photographs to document and study coastal erosion. In her quest for more privacy, Streisand’s efforts had the opposite effect.

Heuristic

As highlighted by German psychologist Gerd Gigerenzer in the paper “Heuristic Decision Making,” the term heuristic is of Greek origin, meaning “serving to find out or discover.” More precisely, a heuristic is a fast and accurate way to make decisions in the real world, which is driven by uncertainty.

Recognition Heuristic

The recognition heuristic is a psychological model of judgment and decision making. It is part of a suite of simple and economical heuristics proposed by psychologists Daniel Goldstein and Gerd Gigerenzer. The recognition heuristic argues that inferences are made about an object based on whether it is recognized or not.

Representativeness Heuristic

The representativeness heuristic was first described by psychologists Daniel Kahneman and Amos Tversky. It judges the probability of an event according to the degree to which that event resembles a broader class. For instance, given a description of a person that matches the stereotype we may hold for an archaeologist, most people will judge that person likely to be one.

Take-The-Best Heuristic

The take-the-best heuristic is a decision-making shortcut that helps an individual choose between several alternatives. The take-the-best (TTB) heuristic decides between two or more alternatives based on a single good attribute, otherwise known as a cue. In the process, less desirable attributes are ignored.

Biases

The concept of cognitive biases was introduced and popularized by the work of Amos Tversky and Daniel Kahneman in 1972. Biases are seen as systematic errors and flaws that make humans deviate from the standards of rationality, thus making us inept at making good decisions under uncertainty.

Bundling Bias

The bundling bias is a cognitive bias in e-commerce where a consumer tends not to use all of the products bought as a group, or bundle. Bundling occurs when individual products or services are sold together as a bundle. Common examples are tickets and experiences. The bundling bias dictates that consumers are less likely to use each item in the bundle. This means that the value of the bundle and indeed the value of each item in the bundle is decreased.

Barnum Effect

The Barnum Effect is a cognitive bias where individuals believe that generic information – which applies to most people – is specifically tailored for themselves.

First-Principles Thinking

First-principles thinking – sometimes called reasoning from first principles – is used to reverse-engineer complex problems and encourage creativity. It involves breaking down problems into basic elements and reassembling them from the ground up. Elon Musk is among the strongest proponents of this way of thinking.

Ladder Of Inference

The ladder of inference is a conscious or subconscious thinking process where an individual moves from a fact to a decision or action. The ladder of inference was created by academic Chris Argyris to illustrate how people form and then use mental models to make decisions.

Six Thinking Hats Model

The Six Thinking Hats model was created by psychologist Edward de Bono in 1986, who noted that personality type was a key driver of how people approached problem-solving. For example, optimists view situations differently from pessimists. Analytical individuals may generate ideas that a more emotional person would not, and vice versa.

Second-Order Thinking

Second-order thinking is a means of assessing the implications of our decisions by considering future consequences. It is a mental model that considers all future possibilities. It encourages individuals to think outside of the box so that they can prepare for every eventuality. It also discourages the tendency for individuals to default to the most obvious choice.

Lateral Thinking

Lateral thinking is a business strategy that involves approaching a problem from a different direction. The strategy attempts to remove traditionally formulaic and routine approaches to problem-solving by advocating creative thinking, therefore finding unconventional ways to solve a known problem. This sort of non-linear approach to problem-solving can, at times, create a big impact.


Dunning-Kruger Effect

The Dunning-Kruger effect describes a cognitive bias where people with low ability in a task overestimate their ability to perform that task well. Consumers or businesses that do not possess the requisite knowledge make bad decisions. What’s more, knowledge gaps prevent the person or business from seeing their mistakes.

Occam’s Razor

Occam’s Razor states that one should not increase (beyond reason) the number of entities required to explain anything. All things being equal, the simplest solution is often the best one. The principle is attributed to 14th-century English theologian William of Ockham.

Mandela Effect

The Mandela effect is a phenomenon where a large group of people remembers an event differently from how it occurred. The Mandela effect was first described in relation to Fiona Broome, who believed that former South African President Nelson Mandela died in prison during the 1980s. While Mandela was released from prison in 1990 and died 23 years later, Broome remembered news coverage of his death in prison and even a speech from his widow. Of course, neither event occurred in reality. But Broome was later to discover that she was not the only one with the same recollection of events.

Crowding-Out Effect

The crowding-out effect occurs when public sector spending reduces spending in the private sector.

Bandwagon Effect

The bandwagon effect tells us that the more a belief or idea has been adopted by people within a group, the more the individual adoption of that idea might increase within the same group. This is the psychological effect that leads to herd mentality. In marketing, it can be associated with social proof.
