Goodhart’s Law is named after British monetary policy theorist and economist Charles Goodhart. Speaking at a conference in Sydney in 1975, Goodhart said that “any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.” Goodhart’s Law states that when a measure becomes a target, it ceases to be a good measure.
Understanding Goodhart’s Law
Goodhart later admitted that his quip was intended as a humorous, throwaway comment. Nevertheless, it proved to be an accurate and perceptive observation about how the modern world functions.
It’s important to note that Goodhart himself had no role in naming the law that bears his name. That distinction goes to anthropologist Marilyn Strathern, who argued in a 1997 paper that the law applied beyond statistics to evaluation in a broader sense.
An oft-told illustration of Goodhart’s Law at work is the cobra effect. In India under British colonial rule, the government was troubled by the number of venomous cobras. To reduce their population, it placed a bounty on every cobra the locals could catch. The strategy worked for a while, but some individuals began breeding cobras purely to kill them and collect the bounty.
Eventually, the colonial government caught on and scrapped the scheme, at which point many of the now-worthless bred cobras were released into the wild. The key takeaway of the cobra effect story is that an incentive designed to solve a problem can end up rewarding people for making the problem worse.
The four forms of Goodhart’s Law
There are generally accepted to be four variations on Goodhart’s Law:
- Regressive Goodhart – here, the measure individuals use for their target (goal) is imperfectly correlated with that goal. Weight, for example, is an imperfect proxy for health, and optimizing for it encourages gaming the number by skipping meals or weighing oneself first thing in the morning on an empty stomach.
- Extremal Goodhart – this occurs when a measure is picked because it correlates with a goal in normal situations, but the correlation breaks down in extreme circumstances. The human relationship with sugar is a classic example. While consuming sugar was correlated with survival when calories were scarce, the same cannot be said of modern, sedentary lifestyles where excess sugar promotes obesity.
- Causal Goodhart – where the behavior of an individual affects the measure without directly affecting the goal. Renewing a gym membership, for example, improves the membership numbers a gym might track without directly causing the individual to exercise more often.
- Adversarial Goodhart – where other goals confound the goal a measure is trying to accomplish, such as the cobra effect mentioned above.
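The cobra effect above can be sketched as a toy simulation. All of the numbers below (catch rates, breeding growth, round counts) are illustrative assumptions, not historical data; the point is only that the bounty measure "improves" while the underlying problem worsens.

```python
# Toy model of the cobra effect: a bounty rewards dead cobras, so the
# measure (cobras turned in) diverges from the goal (fewer wild cobras).

def simulate_bounty_scheme(initial_wild=1000, rounds=10,
                           caught_per_round=80, breeding_growth=1.3):
    """Return (wild cobras before the scheme, wild cobras after it ends)."""
    wild = initial_wild
    breeding_stock = 100.0  # cobras kept by breeders purely to farm bounties

    for _ in range(rounds):
        wild -= min(wild, caught_per_round)  # bounty does reduce wild cobras
        breeding_stock *= breeding_growth    # breeders expand their stock

    # Scheme is scrapped: the now-worthless bred cobras are released.
    return initial_wild, round(wild + breeding_stock)

before, after = simulate_bounty_scheme()
```

With these assumed parameters, `after` exceeds `before`: every round pays out bounties (the measure looks successful throughout), yet once the incentive disappears the released breeding stock leaves more wild cobras than the scheme started with.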
Avoiding the impact of Goodhart’s Law
Of the four variations of Goodhart’s Law, only Regressive Goodhart is unavoidable, since every proxy measure is imperfectly correlated with its goal.
For the remaining three, here are some simple avoidance tips:
- Conduct regular checks to ensure the measure is still incentivizing in line with the desired outcome or goal.
- Become aware of Goodhart’s Law and how it operates.
- Maintain a focus on the end goal while using the measures as a guide only.
- Reduce bureaucracy and formalism.
- Use a combination of diversified metrics. A balanced scorecard can be useful here.
Key takeaways:
- Goodhart’s Law states that when a measure becomes a target, it ceases to be a good measure.
- Goodhart’s Law was informally coined during a speech by Charles Goodhart. Although the economist was speaking in the context of statistics, the law has broader evaluative applications.
- Goodhart’s Law is generally categorized into four variations: Regressive Goodhart, Extremal Goodhart, Causal Goodhart, and Adversarial Goodhart.