What is Occam’s Broom?

Occam’s broom is a principle, first proposed by South African biologist Sydney Brenner, stating that inconvenient facts that do not fit someone’s hypothesis or serve their agenda are swept aside or hidden in order to draw important conclusions or argue points.

Understanding Occam’s broom

Brenner’s idea was supported by philosopher Daniel Dennett in his 2013 book Intuition Pumps and Other Tools for Thinking.

Dennett noted that the practice was common among intellectual but dishonest advocates of one theory or another, and that it was especially insidious when directed toward a general public that may not know how to tell fact from fiction.

In addition, Dennett said that:

“Conspiracy theorists are masters of Occam’s Broom, and an instructive exercise on the Internet is to look up a new conspiracy theory, to see if you (a nonexpert on the topic) can find the flaws, before looking elsewhere on the web for the expert rebuttals.”

Occam’s broom in the context of science

While Dennett linked the idea with conspiracy theorists, it should be noted that as a scientist, Brenner coined the term to describe the tendency of his peers to overlook data not in support of their arguments.

In a 2009 article in the Journal of Biology, Miranda Robertson noted that despite its negative connotations, Occam’s broom did have a place in science and was in fact necessary for scientific progress.

Robertson noted that “Biology, as many have pointed out, is untidy and accidental, and it is arguably unlikely that all the facts can be accounted for early in the investigation of any given biological phenomenon.”

She then went on to discuss Occam’s broom in the context of Charles Darwin’s discovery of natural selection and Mendel’s discovery of the fundamental laws of inheritance.

Had Darwin, like Mendel, “swept away” the variation in the inheritance ratios he recorded, many believe he would have discovered the laws of inheritance before his counterpart.

Other examples of Occam’s broom

Negative economic outlooks could also be considered an example of Occam’s broom.

When the media report on a company’s debt or impending bankruptcy, they often focus only on the liabilities side of the balance sheet and not on its assets.

In a similar vein, those who want to make predictions about a country’s debt obligations should not neglect its earning power, collective knowledge, innovation capacity, and capital base.

In the United States, for example, households held over $113 trillion in assets in 2018 – equivalent to five times the amount of goods and services produced annually.
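That ratio can be sanity-checked with some quick arithmetic. The figures below are approximate, assumed values used only for illustration (household assets of roughly $113 trillion against 2018 US output of roughly $20.5 trillion):

```python
# Rough sanity check of the assets-to-output ratio cited above.
# Both figures are approximate, assumed values for illustration only.
household_assets_trillions = 113.0  # US household assets, 2018 (approx.)
annual_output_trillions = 20.5      # US goods and services produced, 2018 (approx.)

ratio = household_assets_trillions / annual_output_trillions
print(f"Household assets are about {ratio:.1f}x annual output")  # about 5.5x
```

A forecast that sweeps the asset side of this ratio under the carpet tells only half the story.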

Key takeaways

  • Occam’s broom is a principle stating that inconvenient facts are “swept under the carpet” to draw important conclusions or make counterarguments.
  • Occam’s broom is common among intellectual but dishonest advocates of one theory or another. Some may push conspiracy theories, while academics such as scientists may also discount facts that do not support their hypotheses.
  • Negative economic outlooks could also be considered an example of Occam’s broom. When media organizations make doom forecasts about a country’s economy, they may avoid considering its asset base and potential for growth and innovation.

Occam’s Broom vs. Occam’s Razor

Occam’s Razor states that one should not increase (beyond reason) the number of entities required to explain anything. All things being equal, the simplest solution is often the best one. The principle is attributed to 14th-century English theologian William of Ockham.

Whereas Occam’s Razor is a heuristic that helps us decide in complex scenarios by favoring the most straightforward adequate solution, Occam’s Broom is more of a bias.

It works much like confirmation bias: people tend to select the facts that fit their narratives, thereby reinforcing their existing hypotheses. The result is a self-reinforcing feedback loop in which someone cherry-picks the facts that best fit the outcome he or she already has in mind.

Connected Business Heuristics

First-Principles Thinking

First-principles thinking – sometimes called reasoning from first principles – is used to reverse-engineer complex problems and encourage creativity. It involves breaking down problems into basic elements and reassembling them from the ground up. Elon Musk is among the strongest proponents of this way of thinking.

Ladder Of Inference

The ladder of inference is a conscious or subconscious thinking process where an individual moves from a fact to a decision or action. The ladder of inference was created by academic Chris Argyris to illustrate how people form and then use mental models to make decisions.

Six Thinking Hats Model

The Six Thinking Hats model was created by psychologist Edward de Bono in 1986, who noted that personality type was a key driver of how people approached problem-solving. For example, optimists view situations differently from pessimists. Analytical individuals may generate ideas that a more emotional person would not, and vice versa.

Second-Order Thinking

Second-order thinking is a means of assessing the implications of our decisions by considering future consequences. It is a mental model that considers all future possibilities, encouraging individuals to think outside the box so that they can prepare for every eventuality. It also discourages the tendency to default to the most obvious choice.

Lateral Thinking

Lateral thinking is a business strategy that involves approaching a problem from a different direction. The strategy attempts to remove traditionally formulaic and routine approaches to problem-solving by advocating creative thinking, therefore finding unconventional ways to solve a known problem. This sort of non-linear approach to problem-solving can, at times, create a big impact.

Moonshot Thinking

Moonshot thinking is an approach to innovation that can be applied to business or any other discipline where you target goals at least 10X bigger than the status quo. That mindset shift empowers a team to look for unconventional solutions, starting from first principles and leveraging fast-paced experimentation.


Biases

The concept of cognitive biases was introduced and popularized by the work of Amos Tversky and Daniel Kahneman in 1972. Biases are seen as systematic errors and flaws that make humans deviate from the standards of rationality, leaving us poorly equipped to make good decisions under uncertainty.

Bounded Rationality

Bounded rationality is a concept attributed to Herbert Simon, an economist and political scientist interested in decision-making and how we make decisions in the real world. He believed that rather than optimizing (the mainstream view at the time), humans follow what he called satisficing.

Dunning-Kruger Effect

The Dunning-Kruger effect describes a cognitive bias where people with low ability in a task overestimate their ability to perform that task well. Consumers or businesses that do not possess the requisite knowledge make bad decisions. What’s more, knowledge gaps prevent the person or business from seeing their mistakes.

Mandela Effect

The Mandela effect is a phenomenon where a large group of people remembers an event differently from how it occurred. The Mandela effect was first described in relation to Fiona Broome, who believed that former South African President Nelson Mandela died in prison during the 1980s. While Mandela was released from prison in 1990 and died 23 years later, Broome remembered news coverage of his death in prison and even a speech from his widow. Of course, neither event occurred in reality. But Broome later discovered that she was not the only one with the same recollection of events.

Crowding-Out Effect

The crowding-out effect occurs when public sector spending reduces spending in the private sector.

Bandwagon Effect

The bandwagon effect tells us that the more an idea or belief has been adopted by people within a group, the more likely other members of that group are to adopt it too. This is the psychological effect that leads to herd mentality; in marketing, it is associated with social proof.

Read Next: Biases, Bounded Rationality, Mandela Effect, Dunning-Kruger Effect, Lindy Effect, Crowding-Out Effect, Bandwagon Effect.
