
Goodhart’s Law Rules the Modern World. Here Are Nine Examples

“Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.” True.

By Peter Coy
26 March 2021, 8:31 pm GMT+8

This month the magazine Central Banking gave a lifetime achievement award to Charles Goodhart, 84, a creative and prolific theorist of monetary policy who has spent most of his career at the Bank of England and the London School of Economics. Last year he and Manoj Pradhan came out with a well-received book called The Great Demographic Reversal: Ageing Societies, Waning Inequality, and an Inflation Revival.

But never mind all that. In years to come, Goodhart may be best
remembered for something he said at a conference in Sydney, Australia, in
1975 that he later admitted “was intended as a humorous, throw-away line.”
Humorous, yes, but also perceptive. His observation is cited regularly today
in fields ranging from banking to medicine to artificial intelligence because it
says something important about how the modern world functions. 
Charles Goodhart. Photographer: Munshi Ahmed/Bloomberg

What Goodhart said 46 years ago in Sydney was this, which he jokingly
termed Goodhart’s Law: “Any observed statistical regularity will tend to
collapse once pressure is placed upon it for control purposes.” In other
words, as the British anthropologist Marilyn Strathern later boiled it down:
“When a measure becomes a target, it ceases to be a good measure.”
Goodhart’s Law is a close cousin of Campbell’s law, which was enunciated
around the same time by psychologist Donald Campbell, and the Lucas
Critique by economist Robert Lucas.

An oft-cited example of Goodhart’s Law in action is the bounty on cobras that the British who ruled India supposedly paid to try to reduce the population of the snakes. Enterprising Indians quickly figured out that they could earn money by raising cobras to kill and present for bounties.
Likewise when Soviet planners ordered nail factories to increase the
number of nails they produced, it’s said that managers reacted by
producing millions of tiny, useless nails. When the planners wised up and
switched to a weight criterion, the factories started producing giant, heavy,
and equally useless nails. It’s hard to pin down the historical truth of these
stories, but the point is clear.

Once you get Goodhart’s Law on your brain, you start seeing it at work
wherever you look. Here are examples:
1. The Federal Reserve has a published target for inflation of 2%. In
2018, Fed Governor Randal Quarles cited Goodhart’s Law in warning
that inflation could get out of control if the Fed relies on that rate “as
our only gauge of the economy's position relative to its potential.”
Inflation might respond slowly at first to economic overheating,
fooling policymakers into thinking all was well, “but I have no doubt
that prices would eventually move up in response to resource
constraints,” Quarles said.
2. The Bank for International Settlements documented in 2018 that
European banks were getting around safety standards by “window-
dressing” their financial statements to look safer at the end of each
quarter. American banks had a harder time gaming the regulations
because their results were evaluated on a quarterly average rather
than at quarter’s end.
3. Wells Fargo & Co. paid $480 million in 2018 to settle a class-action
lawsuit in which investors accused the bank of securities fraud
related to opening unauthorized accounts for customers. It was a
classic example of Goodhart’s Law: Wells Fargo employees were
under strong pressure to cross-sell financial products, so to meet
their goals they simply faked new accounts.
4. For scholars, being frequently cited is a path to tenure and stardom,
so the temptation to manipulate citations is huge. Sundarapandian
Vaidyanathan, a computer scientist at the Vel Tech R&D Institute of
Technology, won an award from the Indian government for being
among the nation’s top researchers by productivity and citations. But
in 2019 the journal Nature wrote that 94% of the citations of his work
through 2017 came from himself or co-authors. He has defended
himself, writing that “the next work cannot be carried on without
referring to previous work.”
5. In higher education, colleges manipulated their data to achieve a
higher spot in the influential U.S. News & World Report rankings.
Since the number of applications counts, some colleges counted
every postcard expressing interest as an application. And since a
high rejection rate is considered prestigious, some colleges “rejected”
applicants for fall admission, only to admit them for the spring
semester. 
6. In 2015, a jury in Atlanta convicted 11 teachers of racketeering and
other charges because they cheated on high-stakes standardized
tests by altering and fabricating students’ answer sheets. Schools
Superintendent Beverly Hall stood to receive a substantial bonus if
system-wide test scores rose sufficiently. She died a month before
the jury ruled, having long denied any wrongdoing.
7. The World Bank’s ease-of-doing-business ranking was so frequently
gamed that in December the bank put out a 21-page review of data
irregularities. Azerbaijan, China, Saudi Arabia, and the United Arab
Emirates used “undue pressure” to get changes made “outside of the
appropriate review process,” the bank said.
8. In health care, Goodhart’s Law complicates efforts to rank
hospitals. The New York Times reported in 2018 on a case where a
sick, 81-year-old veteran was denied admission to the Roseburg
Veterans Administration Medical Center in Oregon as part of the
hospital’s attempt “to lift its quality-of-care ratings.” The hospital’s
director acknowledged to the Times that being more selective had
improved ratings, but denied that the hospital was turning patients
away to improve scores.
9. A new problem area is artificial intelligence and machine learning.
Human beings are good at exploiting Goodhart’s Law, but computers
are even better. For example, a deep neural network that was
intended to detect pneumonia in chest scans did well overall, but
failed on scans coming from new hospitals. It turned out that the
system had taken a shortcut: Instead of looking at the scans, it
zeroed in on the names of the hospitals where the scans came from.
Certain hospitals had higher rates of pneumonia. “Together with the
hospital’s pneumonia prevalence rate it was able to achieve a
reasonably good prediction—without learning much about pneumonia
at all,” says an article by Canadian and German authors, Shortcut
Learning in Deep Neural Networks, shared online last year via arXiv. 
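The shortcut the paper describes can be sketched in a few lines. The example below is a hypothetical, deliberately dumbed-down illustration (not the paper's actual model or data): a "detector" that never looks at the scan at all, only memorizes each hospital's pneumonia prevalence. It scores well on scans from the hospitals it trained on and collapses to chance on a new one.

```python
import random

random.seed(0)

# Hypothetical training data: each "scan" is just (hospital, has_pneumonia).
# Hospital A sees mostly pneumonia cases; hospital B sees mostly healthy ones.
def make_scans(hospital, pneumonia_rate, n=1000):
    return [(hospital, random.random() < pneumonia_rate) for _ in range(n)]

train = make_scans("A", 0.9) + make_scans("B", 0.1)

# "Shortcut" model: ignore the image entirely and memorize each hospital's
# majority label -- the behavior the article describes.
counts = {}
for hospital, label in train:
    pos, total = counts.get(hospital, (0, 0))
    counts[hospital] = (pos + label, total + 1)

def predict(hospital):
    pos, total = counts.get(hospital, (0, 1))  # unseen hospital -> predict "no"
    return pos / total > 0.5

# In-distribution: the shortcut looks like a good pneumonia detector (~90%).
test = make_scans("A", 0.9, 500) + make_scans("B", 0.1, 500)
acc = sum(predict(h) == y for h, y in test) / len(test)
print(f"same hospitals: {acc:.0%}")

# A new hospital with 50% prevalence: the shortcut is no better than chance.
new = make_scans("C", 0.5, 1000)
acc_new = sum(predict(h) == y for h, y in new) / len(new)
print(f"new hospital:   {acc_new:.0%}")
```

The measure (held-out accuracy on the training hospitals) became the target, so the network optimized the measure rather than the thing the measure stood for, which is Goodhart's Law in miniature.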

What to do about this? It’s wise to get away from fixed, changeless rules
that can be easily gamed. Also, measure what you actually want, not a
rough proxy for it. Try using multiple criteria instead of a single standard.
But the first step is simply to be aware of the problem. Putting a name and
a focus on the pitfalls of measurement and reward is the lasting
contribution of Charles Goodhart. 
