Anchoring
Framing Effect
Unit Effect
Denomination Effect
Sounds Misleading
Endowment Effect
Loss Aversion
Untracked Spending
Commitment Device
Decoy Effect
Pre-Order Bias
Present Bias
Opportunity Costs
Change Blindness
4. Uncertain About Uncertainty: Calculate The Probability
Conjunction Fallacy
Selection Bias
Survivorship Bias
Confirmation Bias
Coincidences
Disagreeing Experts
Winner’s Curse
5. Consistency And Social Norms: Question The Rules
Priming Effect
Expensive Expectation
Money Comparisons
Untested Preferences
Speeding
Volunteer’s Dilemma
Conclusion
1. You Can’t Get Fooled Again
Traditional economics characterizes people as rational agents,
meaning they have logical preferences, they account for risk
using probability, and they maximize their utility. Behavioral
economics points out that people are not well described by the
rational agent model. Many times people have inconsistent
preferences, overreact to risk, and act against their own
interests. These systematic deviations from the rational
agent model are known as cognitive biases, decision traps,
decision fallacies, or irrational decisions.
People often argue about which side is “right.” In this book, I want to
bridge the gap by pointing out that both perspectives are useful. It is
true cognitive biases can lead to bad decisions. However, we can
learn to do better. We can recognize the biases and then train
ourselves to make better decisions.
To see what I mean, consider the following graphic.
How would you describe the horizontal lines? At first glance you
would say the horizontal lines are curved and skewed. But look
closely and measure the horizontal lines against a straight edge.
In reality, the horizontal lines are all straight and parallel to each
other. This is the café wall illusion.
In a way, every time you see this illusion, you will get tricked.
The brain misperceives this arrangement of lines and shapes.
But in another way, you will never be tricked again. Your brain
can recognize this pattern of lines and rectangles as an optical
illusion. If you are shown this illusion again, you would be on
guard for it. You would correct your bias and realize the
horizontal lines are straight and parallel.
Cognitive biases are like optical illusions. They exist and we
tend to get tricked by them. By the same token, we can learn
never to be tricked again. Just as we can recognize optical
illusions and differentiate perception from reality, we can be alert
for cognitive biases and distinguish between a decision trap and
a rational choice.
Daniel Kahneman is one economist who has been recognized
with a Nobel Prize for his research into behavioral economics.
Kahneman characterizes our brain as having two modes of
thinking [1]. System 1 is “fast” thinking that governs instincts
and gut reactions. System 2 is “slow” thinking that handles
deliberate, logical planning. When we see an optical illusion, we
get fooled by the automatic, instinctual response of System 1.
When we study the illusion more closely using System 2
and realize the pattern is misleading, we can overcome
the bias.
There is an old saying: “Fool me once, shame on you. Fool me
twice, shame on me.” While visiting Tennessee in 2002,
President George W. Bush had a momentary memory lapse. He
ended up uttering: “There’s an old saying in Tennessee — I
know it’s in Texas, probably in Tennessee — that says, ‘Fool me
once, shame on…shame on you. Fool me — you can’t get fooled
again.’”
Everyone made fun of the gaffe; they called it another
“Bushism.” But perhaps W. Bush was making sense after all. If
you get fooled once, then shame on the other person for
exploiting the cognitive bias. Then, at that point, you can
recognize the game. If someone tries to fool you again, they
simply can’t! You have identified the cognitive bias and learned
the rational response.
Behavioral biases are similar to tourist traps. While sightseeing in a
new city, you may end up in crowded, overpriced shops selling
junk. But you only need to fall for a tourist trap once. The next
time you will avoid the area. Nowadays you can avoid tourist
traps entirely by checking Yelp or TripAdvisor, or by asking
friends. Similarly, you are likely to fall for a behavioral bias the
first time you encounter a new decision. But nowadays you can
learn about the biases in advance (by, say, reading this book) so
you can avoid making a bad decision in the first place.
This book is a practical guide to overcoming common behavioral
biases. In each section, I will describe common mistakes and
why they happen. Then I will suggest techniques to help you
make the smart decision. Over time you will recognize the
cognitive biases as naturally as you can spot optical illusions.
You won’t get fooled again.
A thought experiment
Before I get into the biases, I want to give an example to
illustrate the difference between rational agent theory and
behavioral economics.
Imagine you’re in a psychology experiment about decision-
making. You can select one of the choices.
(A) $100 prize now
(B) $200 prize if you wait 15 minutes.
What do you do? I think almost everyone would select (B), the
$200 in 15 minutes, and nearly everyone would agree that is the
smart decision.
Now imagine the experiment is framed differently. Imagine you
are a 5-year-old and a marshmallow is put in front of you. Which
choice do you select?
(A) Eat a single marshmallow now
(B) Earn a 2nd marshmallow by waiting for 15 minutes.
Again, the answer seems obvious. Logically, choice (B) is
better: you double the reward just by waiting 15 minutes.
But here’s the thing: that’s not what most children actually do.
The Stanford marshmallow experiment found that only about 30
percent of the children waited for 15 minutes [2]. A follow-up
study more than 10 years later suggested good things come to
those who can wait. The children who delayed gratification were
found to have higher educational achievement and a lower body
mass index.
Does that mean (B) is the correct answer? That choice (B) is the
rational thing to do?
This is where the story gets slightly complicated by the history
of economics.
How rational became a bad word
Let’s return to the first experiment. You can either (A) claim a
$100 prize now, or (B) claim a $200 prize if you wait 15
minutes.
Pretty much everyone would say (B) is the better choice. You
double your money just by waiting for a few minutes.
But think about variations of the experiment. Imagine you had to
wait 1 year to claim the $200 prize. Is it still worth it? What if
you had to wait 2 years, or 5 years, or 10 years?
When you think about it, the experiment is really about the
tradeoff between a current reward and a future reward. What is
the correct way to evaluate time versus money?
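The standard tool for this tradeoff is discounting: a future payment is worth less than the same amount today, shrinking by a fixed rate per year of delay. The sketch below is illustrative, not from the text; the 5 percent annual discount rate is an assumption chosen only to show the shape of the tradeoff.

```python
# Present value: a standard way to compare money now versus money later.
# The 5% annual discount rate is purely illustrative.

def present_value(amount, years, annual_rate=0.05):
    """Discount a future payment back to today's dollars."""
    return amount / (1 + annual_rate) ** years

# $200 after a 15-minute wait is worth essentially the full $200 today.
fifteen_minutes = 15 / (365 * 24 * 60)  # as a fraction of a year
print(round(present_value(200, fifteen_minutes), 2))  # 200.0

# The same $200 shrinks as the wait stretches to years.
for years in [1, 2, 5, 10]:
    print(years, round(present_value(200, years), 2))
```

At a 5 percent rate, even a 10-year wait leaves the $200 worth about $122.78 today, still above the $100 prize; under this assumed rate, waiting stays rational until the delay approaches the roughly 14 years it takes money to double.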
Traditional economics developed the rational agent model to
answer this and other questions. The rational agent model proved
useful. By quantifying costs, benefits, time, and the rules for
evaluating tradeoffs, it was possible to mathematically solve for
the choice that gave the highest reward. In other words, it was
possible to solve for the correct choice under the assumption of
rationality.
Furthermore, there was hope economics could also predict how
markets would behave by considering the interaction of many
rational agents.
This was probably taking the theory too far. Humans do not
always act according to the rational agent model. In fact, there
are often good reasons we act differently.
Let’s return to the marshmallow experiment. You can either (A)
eat the marshmallow now, or (B) earn a 2nd marshmallow by
waiting for 15 minutes.
An appropriately specified rational agent model would conclude
(B) is the correct choice. You get double the marshmallows and
you only have to wait for a few minutes.
Now here’s where it gets interesting: most kids don’t make the
rational decision! The Stanford marshmallow experiment found
that only about 30 percent were able to delay gratification.
Why do 70 percent make choice (A) instead? Is there a reason
we are “predictably irrational”?
The field of behavioral economics considers the psychological
reasons and cognitive factors for decision-making. Many times
the decisions we make are not well described by the rational
agent model.
A behavioral economist might point out children are only
boundedly rational: some may not yet realize that waiting 15
minutes for double the reward is a good tradeoff, or others might
not have the willpower to execute the plan. Or the behavioral
economist might say the children are satisficing: they actually
don’t want 2 marshmallows to maximize their payout; they
would be happy eating 1 marshmallow so that is the optimal
bliss point. Isn’t it a good thing if kids eat less sugar anyway?
The behavioral economist might also suggest the experiment
framing was important: the presentation of the marshmallow
choice may have primed them to seek immediate gratification.
Perhaps the children would act differently if the choice was
about money or presented in another fashion. There may be
many other explanations for the behavior as well.
The observations of behavioral economics may be accurate about
how we make decisions, and they may also be useful for predicting
what people do. The rational agent model predicts waiting 15
minutes, which only 30 percent did; a model that predicts children
take the single marshmallow would have a 70 percent success rate.
You can see why behavioral economics might be favored over
the rational agent theory. I suspect behavioral economics also has
popular appeal because people can relate to its explanations. Plus
behavioral economics can be comforting: isn’t it nice to hear
other people make bad mistakes too? It’s like saying it’s okay to
take the single marshmallow: it’s not really a good idea, but the
70 percent want to hear it.
Behavioral economics is the subject of best-selling books, and its
ideas are shaping policy as the UK and US governments seek to
“nudge” people to select healthy foods, reduce energy
consumption, and save for retirement.
And in the marketing of these books and ideas, people have
somehow gotten to the point where they think rational is a bad word.
Be rational
You know better. Most of America would pick the single
marshmallow. But the popular decision is not always the rational
one.
The 30 percent who could wait made the rational decision. They
were not necessarily smarter or genetically superior individuals.
It turns out many of them were tempted to take the single
marshmallow too. But they were able to make the rational choice
because they had a strategy. They either understood the tradeoff
logically, or they employed strategies like distracting themselves
so waiting was easier. They recognized the bias and overcame it.
That’s the idea of this book. While most of us do not make the
rational decision automatically, we can identify situations where
we make mistakes and develop strategies to make the smart
decision.
Let’s consider how most of us are with physical health. By
nature most of us opt for junk food and do not exercise. You
could say we are “irrational” about health. But a few of us are
exceptional physically. Professional athletes know there are ways
to become fit. They follow strict exercise routines and diet
regimens that help them build muscle, gain speed, and shed body
fat.
Similarly, by nature we tend to make some decisions that are not
rational. But we don’t have to settle for that. If we understand the
possible ways we make bad decisions, we can figure out ways to
counteract them. Over time we can practice to make smarter
decisions and that becomes a habit.
To use another analogy, the brain is like a software operating
system. When you buy a computer or a phone, the device
contains software to help you accomplish various goals. But the
software also contains security bugs that hackers can learn and
exploit. Periodically the software has to be updated with security
patches. The human brain is similarly “pre-loaded” with tools to
help you accomplish various goals. But the brain misperceives
some stimuli, which others can learn and exploit. The brain needs
to be updated with new information to avoid falling for cognitive
biases.
In this book I want to help you make smart decisions. I will go
over many cognitive biases and then give tips on how to
overcome them and see through the irrationality illusion.
Notes
[1] This is from Daniel Kahneman’s 2011 bestselling book
Thinking, Fast and Slow.
[2] Walter Mischel’s book The Marshmallow Test
explains the background and results. The original experiment
had a reward of a pretzel, cookie, or marshmallow, but the name
“marshmallow experiment” has stuck. Mischel, Walter, Ebbe B.
Ebbesen, and Antonette Raskoff Zeiss. “Cognitive and
Attentional Mechanisms in Delay of Gratification.” Journal of
Personality and Social Psychology 21.2 (1972): 204.
2. The Relative Trap: Be
Absolute
Let’s start out with a quiz. Which of the lines is longer?
How about this: which circle is larger, the middle circle in the
left arrangement or the middle circle in the right arrangement?
Next we draw branches for the forecasts from each station. Station
one said it will rain. Since station one is correct 9/10 of the
time, it forecasts rain correctly 9/10 of the time when it does
rain. But it is also wrong 1/10 of the time, forecasting rain
when it will not rain.
A city only has blue and green cabs, of which 85% are
green and 15% are blue. A hit and run accident happens
during the night. A witness testifies to having seen a blue
cab. In similar conditions, the witness could correctly
identify the color 80% of the time and would fail 20% of
the time. What is the probability the cab was blue, rather
than green, given the witness said the cab was blue?
The gut instinct is to think the witness is 80 percent accurate,
so the answer is 80 percent. That would be correct if there were
an equal number of blue and green cabs. But since most cabs are
green, it is more likely the witness actually saw a green cab and
misidentified it as blue.
In other words, we need to take the proportion of cabs into
account when making this calculation. It turns out there is
only a 41% chance the cab really was blue, given the witness
said it was blue.
Here is the mathematical calculation. To start, imagine a total
population of 100 cabs: 15 blue and 85 green.
The next step is to figure out the proportion of cabs correctly and
incorrectly identified. The witness is 80% accurate, which means
80% of blue cabs will be identified as blue, and 80% of green
cabs will be identified as green. The remaining cabs will be
incorrectly identified.
Of the 15 blue cabs, the witness correctly identifies 80%, which
is 12 cabs, and misidentifies the remaining 20%, or 3 cabs as
green.
Of the 85 green cabs, the witness correctly identifies 80%, which
is 68 cabs, and misidentifies the remaining 20%, or 17 cabs as
blue.
Finally, let us focus only on the cabs that were identified as blue.
These come from the blue cabs correctly identified and the green
cabs incorrectly identified.
There are a total of 29 cabs identified as blue, of which 12 were
blue and correctly identified, and 17 were green and
misidentified.
This means if the witness says a cab is blue, the cab actually is
blue in only about 12/29 ≈ 41% of the cases. The remaining
17/29 ≈ 59% of the cases are false positives, where the cab is
green and misidentified.
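The same counting argument can be written as a direct application of Bayes' rule; this short sketch just re-derives the 41% figure.

```python
# Re-deriving the cab answer with Bayes' rule instead of counting cabs.
prior_blue = 0.15   # 15% of cabs are blue
prior_green = 0.85  # 85% are green
accuracy = 0.80     # witness identifies the color correctly 80% of the time

# Probability the witness says "blue": true positives plus false positives.
says_blue = prior_blue * accuracy + prior_green * (1 - accuracy)  # 0.12 + 0.17 = 0.29

# Probability the cab really is blue, given the witness said "blue".
posterior_blue = prior_blue * accuracy / says_blue
print(round(posterior_blue, 2))  # 0.41, matching 12/29
```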
It is surprising that an accurate witness might not provide
accurate evidence. That is the negative way of viewing the result.
The positive view is that the witness increased the likelihood of
the cab being blue to 41%, which is nearly 3 times the base rate
of 15%. The witness does matter, but the effect is not as strong
as people expect.
The problem is meant to illustrate that we should always weigh
the accuracy of information against the base rates of the
environment. Even an accurate witness will produce many false
positives when the underlying event is rare.
Note
[1] Tversky, Amos, and Daniel Kahneman. Evidential Impact of
Base Rates. No. TR-4. Stanford University Department of
Psychology, 1981.
Excessive Risk Aversion
The illusion: The fear of a loss can outweigh the high expected
value of a gain.
The rational response: Take gambles with a high expected
upside and avoid those with too much downside.
Examples
Would you rather have $1,000 for sure, or gamble with a coin
flip that you could get $2,000 or $0 with equal odds?
A simple rational agent model supposes people are risk neutral.
Since both options have the same expected payout of $1,000, a
risk-neutral agent would be happy with either choice. But most of
us prefer certain gains over expected gains (we are risk averse),
and that means we tend to prefer the guaranteed $1,000 payout
over a gamble with the same expected value.
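The claim that both options have the same expected payout can be checked with a quick simulation; the trial count and random seed below are arbitrary choices for illustration.

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility

# Option A: $1,000 for sure. Option B: a fair coin flip for $2,000 or $0.
trials = 100_000
flips = [random.choice([2000, 0]) for _ in range(trials)]
average_payout = sum(flips) / trials

# The gamble's long-run average lands close to the sure thing's $1,000,
# even though any single flip pays either $2,000 or nothing.
print(round(average_payout))
```

The risk-averse preference for option A is about the spread of outcomes, not the average: over many trials the two options pay the same, but a single flip can leave you with nothing.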
Risk aversion is about taking a guaranteed outcome versus
facing a similar risky gamble. It is the amount of the guaranteed
outcome that indicates how risk averse someone is. Consider the
gamble of having a coin flip to get $2,000 or $0. How much
money would you rather have for sure instead of taking this
gamble?
Obviously you’d rather have $2,000 for sure, and $1,500 for
sure, and most likely even $1,000 for sure. These are all
situations where you’d be getting at least the expected value of
the gamble. This level of risk aversion is sensible and
understandable. What about $900? Now you probably have to
think. Accepting $900 means you are taking less than the
expected value. You have to weigh your options of winning big
versus going home with nothing.
We can repeat the exercise for smaller guaranteed amounts.
What about $800, or $500, or $100? At some point you would
refuse the guaranteed money and opt for the gamble. The smallest
guaranteed amount you would still accept instead of the gamble
(your certainty equivalent) is an indication of your risk aversion:
people with higher risk aversion will accept less guaranteed
money for a given gamble.
The extreme case is someone who is infinitely risk averse. This
is a person who will accept even $1 over the coin flip of $2,000
or $0. In other words, an infinitely risk-averse person considers
only the worst-case scenario.
If you view life as a series of gambles, then probability suggests
you should take gambles with a positive expected return (and
little to no risk of catastrophe). While you will lose some
gambles, over time you will win more often and can expect a
healthy gain.
Risk aversion is the idea that a bird in the hand is worth two in
the bush. This can be a good thing in leading us to conservative,
safe decisions. For instance, risk aversion is why we generally
prefer higher salaries over the chance of performance bonuses
and why we opt to pay insurance premiums to protect against
catastrophic loss.
But too much risk aversion can be bad as well. People who pay
for extended warranties on cell phones, electronics, and
treadmills often overpay for the relatively small risk of product
failure. Another example is people who hold too much cash and
bonds instead of investing in stocks which are risky but have
generally higher returns.
Winner’s Curse
The illusion: We overpay when competing for items with a
common value.
The rational response: Recognize when the winner’s curse
might happen and shave your bid so that you would not overpay
when you win.
Examples
Imagine an economics professor holds up a jar of quarters for
auction to a classroom. Each student secretly guesses the value
of the coins, and the professor will sell the jar for the highest bid.
What can we expect out of this auction?
If students are guessing honestly, typically the average of the
guesses will be very close to the actual value of the coins.
Averaging the guesses tends to average out errors and be
accurate, a phenomenon dubbed “the wisdom of crowds” [1].
This turns out to be bad news for the auction winner. If the
average guess is close to the true value of the coins, that means
the highest guess tends to exceed the true value of the coins. The
winning student has bid too much for the jar and will tend to lose
money.
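A small simulation makes the selection effect visible. The jar value, noise level, and class size below are all invented for illustration; the point is only that unbiased guesses produce an accurate average but a biased maximum.

```python
import random

random.seed(1)  # reproducibility; all parameters below are invented

true_value = 20.0  # hypothetical worth of the coins in the jar
n_students = 30
n_auctions = 10_000

avg_errors, winner_errors = [], []
for _ in range(n_auctions):
    # Each student guesses honestly: the true value plus unbiased noise.
    guesses = [true_value + random.gauss(0, 5) for _ in range(n_students)]
    avg_errors.append(sum(guesses) / n_students - true_value)
    winner_errors.append(max(guesses) - true_value)

# The crowd's average error is near zero (wisdom of crowds) ...
print(round(sum(avg_errors) / n_auctions, 2))
# ... but the winning bid overshoots the true value by several dollars.
print(round(sum(winner_errors) / n_auctions, 2))
```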
You do not want to win this auction! That is the winner’s curse.
When the winner’s curse happens
When you bid for a house or shop on eBay, you usually have a
personal, private value for the item you want to buy. You can
safely evaluate what the item is worth to you and be sure to bid
no more than that. You might overpay in the sense that you could
have paid less, but you will avoid bidding more than you value
the item.
The winner’s curse happens in auctions where the item has a
common value. The jar of coins, for example, is worth the same
monetary value to everyone. When you bid, you are guessing its
value, and the person who guesses too high will win and thereby
overpay for the coins.
Many auctions have common value components. Think about a
sports league auctioning off television broadcast rights. The
value of the broadcast depends on viewership level and interest
of advertisers, both of which are commonly valued by television
networks. Or think about a sports player in free agency. The
player’s expected contribution to each team is similar and like a
common value. And in fact, in these auctions, television stations
have been known to overpay for broadcast rights and teams have
been known to overpay for a star free agent.
In common value auctions, there is a selection bias that the
winner overestimates the value of the item, so the winner tends
to lose money on average.
One way bidders can avoid the curse is by bid shaving, which
means to reduce a bid to account for the risk of overpaying.
There are mathematical methods to calculate the percentage of
bid shaving, but it depends on many assumptions about other
bidders and their bidding strategy. In practice you might want to
reduce your bid by a percentage to avoid overpaying. The
students who bid on the jar of coins might reduce their guesses
by 20 percent, for example, to reduce the chance the winner
overpays.
Note
[1] Treynor, Jack L. “Market Efficiency and the Bean Jar
Experiment.” Financial Analysts Journal 43.3 (1987): 50-53.
5. Consistency And Social
Norms: Question The Rules
Which way are the arrows pointing in the following figure?
If you focus on the black space, the arrows are pointing to the
right. But if you focus on the white space, the arrows are
pointing to the left.
Many times in life we face situations where more than one
perspective is valid. In society there is often a benefit if everyone
agrees to a particular rule, and customs are born out of the need
for efficiency. If the arrows were directing traffic, for
example, you would want everyone to agree on a convention to
avoid collisions.
There are other arbitrary rules that become adopted for
efficiency. Business calendars around the world follow the same
solar calendar system that dates back to the Roman Empire.
Would our lives really be much different if we followed a lunar
calendar system instead, or devised a new calendar system?
Clearly there are benefits when we can agree upon a convention,
and over time the agreement becomes a tradition. It would be
very costly and difficult to change the calendar system now.
But does that mean the current system is the best one? Not
necessarily. Some traditions happen by luck or for historical
reasons. People mistakenly think social norms must be great
because they became the custom. There is a tendency to conform
to group behavior.
This chapter is about how you should pay more attention to the
rules of the game and see if you can make a better decision.
Many times people make suboptimal decisions for lack of
thought or for lack of courage to do something different.
In the arrow illusion, we recognize that multiple interpretations
are valid, and we allow people to appreciate the picture in
whatever way they like, or in both ways. When you spend your
money, you should be able to spend it in legal ways that bring
you happiness rather than living by sometimes contrived social
customs.
The Wrong Average
The illusion: The word “average” is often interpreted as
describing a single representative value.
The rational response: Many measures of central tendency are
called “average.” Learn which measure is being used in each
setting and the limitations of each measure.
Examples
Here are some of the common statistics that are used when
people say “average.”
1. Simple average / arithmetic mean
An example of the simple average is the statement: “the average
adult weighs 180 pounds.” The simple average is the most
common meaning for the word average. The simple average is a
great way to summarize a pattern when individual data points are
in the same ballpark of each other (not too many outliers).
The simple average is calculated by adding up all the data points
and then dividing by the number of data points.
2. Median: the center point
People often say the word median, like “the median reported
income was $76,000.”
The simple average is flawed because extreme points can swing
the average away from the center. If 9 people each owned a
$76,000 home and one person owned a $1 billion home, the
simple average would be over $100 million, far above the
$76,000 value of the typical home. The median gives the middle
point of the data, so it is not affected by a few extreme values. It
is most often used for distributions like income and housing,
where extreme observations would bias the average.
The median is calculated as the middle point of the data ranked
in order (or the average of the two middle points if there is an
even number of data points).
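The home-price example is easy to verify with Python's statistics module:

```python
from statistics import mean, median

# Nine $76,000 homes plus one $1 billion outlier.
homes = [76_000] * 9 + [1_000_000_000]

print(f"{mean(homes):,.0f}")    # 100,068,400 -- dragged far from the typical home
print(f"{median(homes):,.0f}")  # 76,000 -- still representative
```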
3. Mode: the most prevalent
The mode is commonly used in advertising with claims like “our
brand was the most often preferred by dentists.” The mode
indicates the most popular choice. Which NBA jersey sold the
most? Which company did most new employees choose? These
are examples of the mode.
The mode is calculated as the data point with the highest
frequency.
4. Weighted average
Stock market indices are examples of a weighted average, like
“the S&P 500 closed at 2,013.” A simple average gives an equal
weight (1/n) to all data points (sample size n). The weighted
average allows you to give each data point a different weight.
Why do that? The idea is “important” data points can have more
influence. For instance, if you invest 90 percent in stocks and 10
percent in bonds, then your overall return depends more on
stocks than on bonds. You can find your overall return by
multiplying your stock return by a 90% weight and adding your
bond return multiplied by a 10% weight.
The weighted average is calculated by multiplying each
observation by a “weight” (between 0 and 100 percent, and all
weights add up to 100 percent), and then adding up the terms.
5. Geometric mean
If someone says their investments returned an average of 5%
annually over 10 years, the 5% annualized return is an example
of a geometric mean.
Here is an example to illustrate. Let’s say you invest $100. It
becomes $105 a year from now (5 percent return), and then
$115.5 in two years (10 percent return). What was your average
annualized return? You might be tempted to say 7.5 percent, the
simple average. That is very close to, but not exactly, the correct
answer. The reason is that the second year’s return is multiplied
on top of the first year’s: there is a compounding effect. We need
the geometric mean to correct for this. The correct answer is
about 7.47 percent.
The geometric mean is a bit more complicated to calculate. If
there are n years of returns denoted by r1, r2, …, rn, the
geometric mean is:
[(1 + r1)(1 + r2)…(1 + rn)]^(1/n) − 1.
The idea is we want to find a single growth rate, which if
compounded for n years, would give the same final answer as
what our investment actually did return.
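The two-year example can be verified directly from the formula:

```python
# Geometric mean of a 5% year followed by a 10% year.
returns = [0.05, 0.10]

product = 1.0
for r in returns:
    product *= 1 + r  # cumulative growth factor: 1.05 * 1.10 = 1.155

geo_mean = product ** (1 / len(returns)) - 1
print(round(geo_mean * 100, 2))  # 7.47 (percent), just under the 7.5 simple average

# Sanity check: compounding $100 at the geometric mean for 2 years
# reproduces the actual ending balance of $115.50.
print(round(100 * (1 + geo_mean) ** 2, 2))  # 115.5
```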
6. Harmonic mean
The harmonic mean is useful to calculate the average of two
rates. For example, let’s say I drove to work at 60 miles an hour.
Due to traffic, my drive home averaged just 40 mph. What was
my average speed for the entire trip?
It’s tempting to say the arithmetic average of 50 mph. But this is
wrong because we need to account for the time and distance of
each trip: the trip home took more time, and hence it will have
greater impact on the overall average speed. The proper way to
calculate the average speed is to use the harmonic mean, which
is 48 mph.
The harmonic mean of n numbers is n divided by the sum of their
reciprocals. For two numbers x and y, the harmonic mean is
2/(1/x + 1/y) = 2xy/(x + y). For the speed example, the harmonic
mean is 2(60 × 40)/(60 + 40) = 48 mph.
For n observations, the harmonic mean can be calculated from
the formula n/(1/x1 + … + 1/xn).
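Both the formula and a distance-based sanity check give the same 48 mph:

```python
# Harmonic mean of 60 mph out and 40 mph back (equal distances).
out_speed, back_speed = 60, 40
harmonic = 2 * out_speed * back_speed / (out_speed + back_speed)
print(harmonic)  # 48.0

# Sanity check with an explicit (arbitrary) distance of 120 miles each way:
# 2 hours out + 3 hours back = 5 hours to cover 240 miles.
distance = 120
total_time = distance / out_speed + distance / back_speed
print(2 * distance / total_time)  # 48.0
```

The slower leg dominates because it occupies more of the total travel time, which is exactly what the harmonic mean captures.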
Priming Effect
The illusion: Seemingly irrelevant stimuli can influence how we
behave for another activity. We can be primed with information
that leads us to strange actions.
The rational response: Be more aware of your surroundings
and factors that might change your behavior. Use priming to
your advantage by thinking about saving money before making a
purchase.
Examples
Here is a fun experiment to try. Have a friend spell the word
POTS out loud, saying each letter, P, O, T, S quickly. Then ask
the friend to spell POTS out loud a few more times. Then ask:
what do you do at a green light? Most people instantly blurt out
“STOP.”
This is an example of the priming effect, a tendency to act based
on information already in the mind. A person who is spelling the
word POTS is repeatedly saying the letters that make up the
word STOP. When asked a question about a traffic light, the
natural response is to say STOP, even when everyone knows you
would GO at a green light.
Here is another question to test your mental math skills. Add up
the following numbers by reading the words out loud step by
step.
1000
20
30
1000
1030
1000
20
What did you get?
Most people, myself included, would say the answer is 5,000.
But do the math. The actual answer is 4,100. As you add the
numbers out loud, each running total keeps the word “thousand”
at the front of your mind. At the very last step, when you carry
over the result, the primed brain jumps to the next thousand and
answers 5,000.
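Summing the sequence step by step confirms the total never reaches 5,000:

```python
# The running totals show where intuition slips.
numbers = [1000, 20, 30, 1000, 1030, 1000, 20]

total = 0
for n in numbers:
    total += n
    print(total)  # 1000, 1020, 1050, 2050, 3080, 4080, 4100

print("answer:", total)  # 4100, not the primed 5000
```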
The priming effect may play a role in our purchase decisions.
Think about how many times people spend because they get the
idea that spending is appropriate behavior. One experiment
found using the word “sale” for items in a mail-order catalog
drove demand up by more than 50 percent, even if the “sale”
price was the same as the regular price [1]. The word “sale”
might prime you to think an item is a better bargain than it
actually is.
It is important to be aware of the priming effect since seemingly
irrelevant details might lead to automatic spending.
Note
[1] Anderson, Eric, and Duncan Simester. “Mind Your Pricing
Cues.” Harvard Business Review 81.9 (2003): 96-103.
Expensive Expectation
The illusion: We are willing to pay more if an item comes from
a place we expect to pay more.
The rational response: Think about the item and where it is
being consumed. Imagine buying the item at a regular store and
set that price as your willingness to pay.
Examples
How much are you willing to pay for a fountain soda in a food
court?
Many years ago I was at the National Mall in Washington DC.
My friends and I split up at the food court to get our favorite
items. I was about to add on a soda for $2, but the price seemed a
bit high. There were many restaurants around; might I get a
better deal?
I walked around and was surprised to see a lot of variation in
soda prices, with the more expensive restaurants generally
charging more for the same fountain soda. That might make
sense when dining in a restaurant, where you pay for ambience
and service. But in a food court, everyone eats at the same tables.
I found the lowest price at an ice cream shop. Perhaps it had to
discount or else few people would buy soda with ice cream.
Why didn’t more people comparison shop? Why were people
willing to pay more at the more expensive restaurants for pretty
much the same soda?
Research finds that people are willing to pay more if an item
comes from an expensive place. Richard Thaler demonstrated
this in 1985 using the following scenario [1]. He asked people to
imagine they were on a sunny beach. A friend would pick up
beer from town for them. What is the most they would pay for
their favorite brand of beer? The friend would only get the beer
if the price was below that amount.
The answer should depend on how much a person valued
drinking a cold beer on a sunny beach. It should not depend, for
example, on where the beer was purchased. Surprisingly, people
were willing to pay an average of $2.65 if told the beer was
coming from a fancy resort hotel but only $1.50 if told it was
coming from a grocery store.
In both cases the person was enjoying a cold beer on a beach. If
the beer was $2, the data suggest people would feel the price is a
“rip-off” from the grocery store but a “bargain” from the fancy
resort hotel.
It is understandable beer would cost more in a fancy hotel. Hotel
bars charge for service, ambience, and their convenience value to
guests. But in the experiment the beer would be enjoyed on the
beach, so it shouldn’t matter where it comes from—the amount
you are willing to pay should be the value of drinking on the
beach.
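Thaler's paper formalizes this feeling as "transaction utility": the gap between the reference price you expect at a venue and the price you actually pay. A minimal sketch using the experiment's numbers (the function name is my own):

```python
def transaction_utility(reference_price, price_paid):
    """Thaler's transaction utility: perceived deal quality relative
    to the price you expected to pay at that venue (illustrative sketch)."""
    return reference_price - price_paid

beer_price = 2.00  # the same beer, drunk on the same beach

# Against the hotel reference price, $2 feels like a bargain:
print(round(transaction_utility(2.65, beer_price), 2))  # 0.65
# Against the grocery store reference price, $2 feels like a rip-off:
print(round(transaction_utility(1.50, beer_price), 2))  # -0.5
```

The consumption value of the beer is identical in both cases; only the comparison price changes, yet it flips the sign of how the deal feels.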
Most of us have a sense of how much an item should cost, and
we should stick to that when determining our willingness to pay.
Note
[1] Thaler, Richard. “Mental Accounting and Consumer
Choice.” Marketing Science 4.3 (1985): 199-214.
Sunk Cost Fallacy
The illusion: We continue to spend time or money on a bad
option because we previously expended time or effort into that
option.
The rational response: Make the best decision looking forward.
Accept a previous bad decision was made, and then move on to
make better decisions.
Examples
A sunk cost is an expense that you cannot recover. You might
have spent a lot of time and money on something, but nothing you
do now will get that time and money back.
A rational agent makes the best decision going forward. What
options are available now, and what is the cost/benefit of those
decisions? Sunk costs are unrecoverable so they do not impact
marginal decision-making. In other words, sunk costs should be
ignored.
But time and again, we get emotional about our purchases and
factor past expenditures into new decisions, a mistake known as
the sunk cost fallacy. Here are a few common, real-life examples
of the sunk cost fallacy.
1. “I came all this way, so I might as well buy something.”
I overheard this at an orchid shop. Some people were
disappointed they did not find anything they liked, particularly
because they had driven a long way to visit the shop. They felt
compelled to buy something since they had made the trip.
However, the time spent on the trip was a sunk cost and should
not have mattered: the only relevant question was whether the
shop had anything worth buying.
Outlet malls, often located far out of town, perhaps use the
same tactic. Customers who drive long distances are not in the
mood to go home empty-handed, so they end up buying something
they may not really need.
2. “I don’t need to finish this beer—it was only $2. Let’s go to
the next place.”
My friend said this in a New York City bar. We were grabbing a
beer before dinner and found an incredible special of $2 pints,
about a third of the normal price. Some people finished their
beers early and were ready to go. My friend left his beer a
quarter-full, joking the beer was so cheap he was fine wasting it.
Of course, the cost of the beer was a sunk cost. It only matters if
he wanted to drink it or not (which he clearly did since he was
ready to drink more!).
4. “I need to get my money’s worth at a buffet.”
The cost of the buffet is a sunk cost. You should not base how
much you eat on how much you paid. Instead, you should base it
on whether you will enjoy the next morsel of food.
4. “I bought a voucher for yoga classes. Then my friend took
me to a yoga class and I didn’t like it. But I might as well
take the class since I paid for it.”
People often buy vouchers on daily deal sites and then find
they do not actually care for the activity. The price of the
voucher is a sunk cost, but many people complete the activity
anyway because it is pre-paid.
5. “I fixed up my 10-year-old car for $500. A week later I
discovered another problem that will cost $1,000 to fix. Since
I decided last week that I wanted to keep my car, I’m going
to pay for the repair.”
What was spent on the car before should not matter. The repair
decision should be forward-looking, comparing things like the
cost of the repair, the cost of a replacement car, and the
salvage value. Nevertheless, people often keep paying for
repairs just to stay consistent with a past decision to repair.
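The forward-looking comparison can be sketched out. Only the $1,000 repair comes from the example above; the replacement cost and salvage value are hypothetical numbers added for illustration:

```python
# Forward-looking repair decision. The $500 already spent is sunk,
# so it appears nowhere in the comparison below.
repair_cost = 1_000        # the newly discovered problem
replacement_cost = 8_000   # hypothetical price of a comparable used car
salvage_value = 1_500      # hypothetical trade-in value of the current car

net_cost_of_replacing = replacement_cost - salvage_value  # 6,500

if repair_cost < net_cost_of_replacing:
    print("Repair the car")   # here 1,000 < 6,500, so repairing wins
else:
    print("Replace the car")
```

Notice the $500 from last week never enters the calculation; only the costs still ahead of you matter.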
6. “I hate the treadmill we have at home, but I’m not gonna
pay for a gym membership.”
It doesn’t matter what you paid for your home gym. The real
question is whether the benefits of exercising at a gym justify
the membership cost. Too many people skip exercise because they
don’t like their home equipment. Sell or give away the old junk
so you can avoid the sunk cost trap.
7. “Are you trying to rob me? I paid good money for this
lawn furniture when I bought it!”
I heard this at a garage sale where the owner was moving. The
buyer saw the owner was desperate and asked to haggle on price.
The lawn furniture was out of style and had dropped in price
since he bought it. But he was not willing to part with it because
of the price he paid, which was a sunk cost.
Sadly, the market value of the furniture was not very good.
Still, it was better to sell it (which he did, angrily) than to
hold on to it because of the high price paid before.
Money Comparisons
The illusion: We are afraid to be labeled “cheap” if we spend
money differently than others.
The rational response: Spend your money according to your
preferences.
Examples
Pain management is personal. Doctors generally ask patients to
rate their perceived pain on a scale of 1 (no pain) to 10 (highest
pain). Doctors do not simply rely on formulas and their own vast
experience of treating patients. The doctor lets the patient
convey their experience.
Why can’t we use that logic for managing our money? Managing
money should be personal. No matter what other people do with
their money, it is better that your own circumstances and
preferences guide your spending habits. This sounds obvious but
we all experience times when we make decisions by comparing
with our friends.
Years ago my friend was considering buying an iPod. We were
discussing the pros and cons. He was on the fence, but he was
leaning toward buying it. Here are some of the reasons he
offered:
“My friends love it and tell me I will find uses for it.”
3. Repeat the paired blind taste test again to test if you liked
the same brand two times in a row.
—2 points
—6 points
How would you answer?
Game theory
In decision theory, your decision to buy an extended warranty or
lease a car is an individual choice. You weigh the costs and
benefits of the decision and choose whether to do it.
Most decisions in life are not so simple. They also depend on
what others will do. Your choice to buy Apple products depends
on whether others will buy them too, which will lead to a larger
market for accessories, more demand for apps, and more hotels
and cars having iPhone-compatible chargers. You might actually
prefer Android devices but find Apple products a better
decision overall for this reason. Your best choice depends not
only on your own preferences but also on what others choose.
Game theory studies such situations of interdependent decision-
making. Game theory helps you make the right decisions
because you can identify the strategic incentives. And ultimately
the knowledge can help you design better mechanisms so
everyone can benefit.
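The core idea is the "best response" check: each player asks whether switching actions would improve their own payoff, holding everyone else fixed. A minimal sketch with a hypothetical two-player payoff table (Prisoner's-Dilemma-style numbers chosen for illustration, not the exam question):

```python
# payoffs[(a, b)] = (row player's payoff, column player's payoff)
# Hypothetical numbers for illustration only.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
actions = ["cooperate", "defect"]

def is_stable(a, b):
    """True if neither player can gain by unilaterally switching actions."""
    row_ok = all(payoffs[(a, b)][0] >= payoffs[(a2, b)][0] for a2 in actions)
    col_ok = all(payoffs[(a, b)][1] >= payoffs[(a, b2)][1] for b2 in actions)
    return row_ok and col_ok

stable = [(a, b) for a in actions for b in actions if is_stable(a, b)]
print(stable)  # [('defect', 'defect')]
```

An outcome passing this stability check is exactly what game theorists call an equilibrium: note that mutual cooperation fails the check here, because either player gains by switching to defect.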
The Nash equilibrium is a common way to evaluate the solution
of a game. It occurs when each person makes the best choice
given what everyone else is doing, so no one can profit by
unilaterally changing their own choice. With that in mind, let’s
return to the exam question.
Your logical thought process on the test
Many people think, “I’ll just take 2 because if everyone does this
we all get 2.” But that’s not a sufficient analysis of the game.
You have to think about your best choice relative to what others
can do. If everyone else is picking 2, why wouldn’t you pick 6
and get extra points?
So let’s analyze the problem carefully. We’ll consider your
choices and how they relate to what others might do.
You have two choices: you can pick 2 or you can pick 6. The
result depends on what other people do. The crucial detail is how
many people are picking 6. So let’s categorize the actions of
everyone else in that regard. There are essentially two main ways
that everyone else can pick.