Loss aversion, according to psychologists, is a bias whereby people give losses a stronger psychological weight than gains of the same size.
Constructive Paranoia

Polymath Jared Diamond, in his book The World Until Yesterday, describes constructive paranoia.
He learned the concept while living with several tribes in New Guinea.
For instance, those tribes had a cultural norm of not sleeping under big trees, out of a seemingly irrational fear that one might fall.
Indeed, the probability of that happening is very low; a statistician might conclude those people are mad.
However, there is an important point to take into account.
If a big tree does fall and you happen to be beneath it, there is no way back: you're dead.
Another critical point: those people live in the forest every day.
Thus, they are exposed to those big trees frequently.
In other words, frequency and expected outcome make the tribes of New Guinea leverage constructive paranoia. This is what bounded rationality does.
It helps us naturally develop antibodies against an increasingly noisy world.
In most real-life scenarios, everyday people sense that some domains of potential loss carry hidden risks. Because those risks can't be computed, psychologists ignore them; the human mind does not.
So better to be paranoid than a dead smart person.
Tribesmen know this, while some modern psychologists have forgotten it.
A labeling problem?
What if what's been labeled loss aversion – in some domains – is just constructive paranoia in a highly uncertain scenario?
Take the case of how psychologists have analyzed people's fear of losing money: the pain of a loss is felt, psychologically, about twice as strongly as the pleasure of an equal gain.
But is this really irrational?
I’ve been investing for a long time, and if there is one sure thing, it’s the asymmetry of loss.
Take this very simple example. You have $1000 invested.
If you earn 20%, you make $200 and have $1,200.
From there, a loss of only about 16.7% takes you back to $1,000.
Take the opposite scenario: you lose 20%, and you now have $800.
Yet to get back to $1,000, you need your portfolio to gain 25%.
Do you see the asymmetry?
A 20% gain can be wiped out by a loss of only about 16.7%.
But a 20% loss requires a 25% gain just to break even!
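The arithmetic above generalizes: after losing a fraction L of a portfolio, the gain needed to break even is L / (1 − L), which grows faster than the loss itself. A minimal sketch (the function name is mine, not from the text):

```python
def recovery_gain(loss_fraction: float) -> float:
    """Gain required to return to the starting value after a fractional loss.

    After losing a fraction L you hold (1 - L) of the original,
    so breaking even requires a gain of L / (1 - L).
    """
    return loss_fraction / (1.0 - loss_fraction)

# A 20% loss needs a 25% gain to break even.
print(round(recovery_gain(0.20), 4))  # 0.25

# The asymmetry compounds: a 50% loss needs a 100% gain.
print(round(recovery_gain(0.50), 4))  # 1.0
```

Note how the required recovery accelerates toward ruin: small losses are nearly symmetric, large ones are not.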
This means that even without doing the math, the brain seems to grasp this.
Not only that, our brain also seems to be wired to avoid the risk of ruin.
And modern society, and psychologists, do all they can to make us forget those built-in rules.
To conclude, the real world is about satisficing!

Psychologist Herbert Simon explained that in a complex world, you don’t want to optimize. You want to satisfice.
Satisficing is about making a decision in an uncertain world, where information is incomplete, the problem is ill-defined, and the context is wide.
And in complex situations, satisficing actually works far better than modeling.
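Simon's idea can be sketched as an algorithm: instead of scoring every option to find the best, accept the first one that clears an aspiration level. A minimal illustration (the budget threshold and the quote data are invented for the example):

```python
def satisfice(options, is_good_enough):
    """Return the first option that meets the aspiration level.

    Unlike optimization, this never requires scoring every option,
    so it still works when the full option set is unknown or unbounded.
    """
    for option in options:
        if is_good_enough(option):
            return option
    return None  # no option met the threshold

# Example: accept the first supplier quote under a $500 budget.
quotes = [620, 480, 455, 610]
print(satisfice(quotes, lambda q: q < 500))  # 480
```

Note that satisficing returns 480, not the global minimum 455: "good enough, found fast" is the whole point.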
It's, in short, the opposite of what many business people do. With the proliferation of big data, they think they can easily model the world, forgetting that a model is such a simplified version of reality that it doesn't work in the first place.
Not only that, a model tends to stress a few (visible) metrics that have the potential to kill the whole thing.
Indeed, one of the worst things startups can do is so-called "premature optimization": for instance, trying to automate important things too early.
Think of a startup that doesn't yet understand what users appreciate about the product, yet automates the product demo right away.
This is bad for three reasons:
- The demo won't be effective.
- The startup loses an important feedback loop for improving the software quickly.
- The first customers might also become your core channels, and at that stage trust is key; you don't want to automate that.
That is why it's critical, as a business person, to keep refining your BS detector over time.
Loss Aversion and Asymmetric Betting
As we saw, loss aversion, more than a bias, is a byproduct of dealing with the real world, where you want to prevent major screw-ups.
In addition, in most cases, what makes us loss averse may be our intuitive understanding of the real-world context.
When we sense that something is irreversible and carries hidden costs, that is when loss aversion kicks in.
And in most cases, we're correct.
That is why I created for you a speed-reversibility matrix.

The main goal here is to unlock those experiments, which I like to call asymmetric business bets.

Those are experiments with a high potential, limited downside, and no hidden costs (as the experiment is mostly reversible).
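To make the filter concrete, here is one hypothetical way to encode these criteria in code. The fields, the 3x upside-to-downside threshold, and the example numbers are my own assumptions for illustration, not part of the matrix itself:

```python
from dataclasses import dataclass

@dataclass
class Experiment:
    # Hypothetical fields, for illustration only.
    name: str
    upside: float      # plausible best-case payoff
    downside: float    # worst-case loss
    reversible: bool   # can it be undone cheaply?

def is_asymmetric_bet(e: Experiment) -> bool:
    """High potential, limited downside, and reversible (no hidden costs).

    The 3x ratio is an arbitrary illustrative threshold.
    """
    return e.reversible and e.upside > 3 * e.downside

bets = [
    Experiment("rewrite billing system", upside=5, downside=4, reversible=False),
    Experiment("test a new landing page", upside=10, downside=1, reversible=True),
]
print([e.name for e in bets if is_asymmetric_bet(e)])  # ['test a new landing page']
```

The point of the sketch is the shape of the filter: irreversible moves are excluded outright, no matter how attractive the upside looks.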
Finding them is not easy, as it takes a huge amount of iteration, tweaking and experimentation.
Yet when you stumble on those asymmetric bets, they turn into growth hacks.
Here, it's critical to realize that these are not "hacks" readily available to anyone.
They come out of a rigorous experimental process, imbued within your business practices.
Thanks to that, you can unlock incredible growth. Yet this process is iterative, expensive, and there is no shortcut.
By practicing the speed-reversibility mindset, you can unlock asymmetric bets, which can contribute to the growth of your business.
Like hidden gems, they are everywhere, ready to be discovered; yet it takes a process of continuous experimentation to find them.
Key takeaways
- Loss aversion is not a bias. It's a built-in detector that makes humans avoid irreversible screw-ups.
- Satisficing is the process of using heuristics for decision-making in a complex world; in most cases, they work far better than complex scenario analyses.
- BS detector: as a business person, your BS detector becomes the critical filter and compass that helps you make decisions in a complex world.
Read Next: Biases, Bounded Rationality, Mandela Effect, Dunning-Kruger, Heuristics.