
Thinking, Fast and Slow – by Daniel Kahneman

Thinking, Fast and Slow is an eye-opening book jam-packed with useful content. Daniel
Kahneman won the Nobel Memorial Prize in Economics in 2002 for the work behind it. The book
explores the interactions between what he terms ‘System 1’ and ‘System 2’ thinking.

System 1 is the ‘fast’ thinking. It’s what we may think of when we hear the word ‘intuition’.
System 1 is constantly forming a view of the world around it and looking for any subtle
changes. It is gullible, always seeking to make things true, so if System 2 is preoccupied,
we can be fooled into believing things that aren’t true.

System 2 is the ‘slow’ thinking. It is effortful, deliberate, structured thinking. System 2 is lazy,
so it prefers to adopt System 1’s conclusions if that means it doesn’t have to work. But we use
System 2 for any difficult or extended tasks that require a lot of brain power.

This is a long book, so we covered our favourite chapters that we thought were most
applicable to everyday life. It’s well worth the read because it will show you the flaws in our
thinking and the moments in which we can be fooled by our own mind into believing things
that aren’t actually true.

If you like this book, you might enjoy some of our other episodes about books on
psychology and behavioural economics, like: Predictably Irrational by Dan Ariely, Thinking in
Bets by Annie Duke, or The Black Swan by Nassim Taleb.

Grab a copy of the book here: https://www.bookdepository.com/Thinking-Fast-and-Slow-Daniel-Kahneman/9780141033570/?a_aid=adamsbooks


Thinking, Fast and Slow

Here is a summary of the book!

A deeper understanding of judgements and choices requires a richer vocabulary than is
available in everyday language. This book looks at human irrationality and vulnerability to
biases through the lens of two systems: System 1 and System 2 thinking.

System 1 

Operates automatically and quickly, with little or no effort and no sense of voluntary
control. It effortlessly originates the impressions and feelings that are the main sources of
the explicit beliefs and deliberate choices of System 2.

For example, it can:

 Detect that one object is more distant than another
 Detect hostility in a voice
 Drive a car on an empty road
 Find a strong move in chess (if you are a chess master)
 Answer 2 + 2

System 1 has learned associations between ideas (e.g. the capital of France) and it has
learned skills like reading and understanding the nuances of social situations.

System 2 

Allocates attention to the effortful mental activities that demand it, including complex
computations. The operations of System 2 are often associated with the subjective
experience of agency, choice and concentration.

Examples

 Brace for the starter gun in a race
 Focus attention on the clowns in a circus
 Look for a woman with white hair
 Tell someone your phone number
 Fill out a tax form
 Park in a narrow space

In all of these situations you must pay attention, and you will perform less well, or not at
all, if you are not ready or if your attention is not directed appropriately.

The often-used phrase “pay attention” is apt: you dispose of a limited budget of attention
that you can allocate to activities, and if you try to go beyond your budget, you will fail.
 

ABOUT SYSTEM 2

System 2 Takes Effort

The defining feature of System 2 is that its operations are effortful, and one of its main
characteristics is laziness: a reluctance to invest more effort than is strictly necessary. This
is a big part of Thinking Fast and Slow.

Much like the electricity meter outside your house or apartment, the pupils offer an index of
the current rate at which mental energy is used. The analogy goes deep. Your use of
electricity depends on what you choose to do, whether lighting a room or toasting bread.
When you turn on the toaster, it draws all the energy it needs and no more. Similarly, if you
were told to hold the four digits 9462 in memory, and your life depended on holding them for
10 seconds, you could not exert more effort in the task than it demands, no matter how much
you want to live.

As you become skilled in a task, its demand for energy diminishes. In the economy of action,
effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs.

A general law of least effort applies to cognitive as well as physical exertion. It asserts that if
there are several ways of achieving the same goal, people will gravitate towards the least
demanding course of action. One of the significant discoveries of cognitive psychologists in
recent decades is that switching from one task to another is effortful, especially under time
pressure.

System 2 Gets Depleted

It is easy and quite pleasant to walk and think at the same time, but at the extremes these
activities compete for the limited resources of System 2.

You can confirm this claim with a simple experiment: ask someone to keep walking while
computing 23 × 78. They will almost certainly stop walking to think.

Evidence suggests that you are more likely to eat tempting chocolate cake when your mind
is loaded with digits.

System 1 has more influence on behaviour when System 2 is busy. People who are
cognitively busy are more likely to make selfish choices, use sexist language and make
superficial judgements in social situations.

Self-control requires attention and effort.

Another way of saying this is that controlling thoughts and behaviours is one of the tasks
that System 2 performs.

The indications of depletion are highly diverse:

 Deviating from one’s diet
 Overspending on purchases
 Reacting aggressively to provocation

Study – “Proceedings of the National Academy of Sciences”

This study looked at eight parole judges in Israel who spend entire days reviewing
applications for parole.

The cases are presented in random order, and the judges spend about six minutes on each.
The default decision is denial; only 35% of requests are approved.

The exact time of each decision was recorded, along with the times of the judges’ food breaks. They found:

 The proportion of approved requests spikes after each food break
 Just after a meal, about 65% of requests are granted
 The approval rate then drops steadily, to nearly zero just before the next break

The amazing conclusion was that tired and hungry judges tend to fall back on the easier
default position of denying requests for parole.

ABOUT SYSTEM 1

The main function of System 1 is to maintain and update a model of your personal world
which represents what is normal. The model is constructed by associations that link ideas of
circumstances, events, actions and outcomes that co-occur with some regularity, either at
the same time or within a relatively short interval. As these links are formed and
strengthened, the pattern of associated ideas comes to represent the structure of the events
in your life, and it determines your interpretation of the present as well as your expectations
of the future.

“How many animals of each kind did Moses take into the ark?”

The number of people who detect what is wrong with this question is so low that it has been
dubbed the Moses illusion. (It was Noah, not Moses, who took the animals into the ark.) The
idea of animals going into the ark sets up a biblical context, and Moses is not abnormal in
that context. You did not expect him, but the mention of his name was not surprising. Replace
Moses with George Bush and you get a poor political joke, but no illusion.

System 1 Jumps to Conclusions

Jumping to conclusions is efficient if the conclusions are likely to be correct, the cost of an
occasional mistake is acceptable, and the jump saves much time and effort. Jumping to
conclusions is risky when the situation is unfamiliar, the stakes are high and there is no time
to collect more information. These are the situations in which intuitive errors are probable,
and in which they may be prevented by a deliberate intervention of System 2. This is a core
element of Thinking Fast and Slow.

System 1 does not keep track of the alternatives it rejects, or even of the fact that there
were alternatives. Conscious doubt is in the repertoire of System 2: it requires maintaining
incompatible interpretations in mind at the same time, which demands mental effort.

What You See is All There Is (WYSIATI)

Consider the following:

“Will Mindik be a good leader? She is intelligent and strong”

The answer came quickly: it was yes. You picked it based on very limited information. What if
the next two pieces of information had been “corrupt” and “cruel”?

You did not start by asking, “What would I need to know before I formed an opinion about
the quality of someone’s leadership?”
Kahneman’s research shows that participants who see one-sided evidence are more confident
of their judgements than those who see both sides. WYSIATI facilitates the achievement of
coherence, and of the cognitive ease that causes us to accept a statement as true.

It leads to a range of biases:

Overconfidence: 

Neither the quality nor the quantity of evidence counts for much in subjective confidence.
Our cognitive system suppresses doubt and ambiguity.

Framing effects: 

Different ways of presenting the same information evoke different emotions.

“The odds of survival after one month are 90%” is more reassuring than “mortality within one
month is 10%”.

Or “90% fat free”, compared to “10% fat”.

Base rate neglect: 

If a personality description is salient and vivid, the relevant statistical facts almost certainly
won’t come to your mind. If the description makes a good story, the statistical probability
hardly matters.

System 1 Answers an Easier/Different Question

The technical definition of a heuristic is a simple procedure that helps find adequate, though
often imperfect, answers to difficult questions. People simplify the impossible task: when
asked to judge probability, people actually judge something else and believe they have
judged probability.

E.g. target question asked / heuristic question answered:

How much would you contribute to save an endangered species? / How much emotion do I
feel when I think of dying dolphins?

How popular will the president be in six months? / How popular is the president right now?

These all make up key ideas that relate to Thinking Fast and Slow.

HUMAN BIASES

LAW OF SMALL NUMBERS

A study of kidney cancer in the 3,141 counties of the USA reveals a remarkable pattern. The
counties where incidence is lowest are mostly rural, sparsely populated and located in
traditionally Republican states. Most would easily infer that the low cancer rates are a direct
result of the clean living of the rural lifestyle: no air pollution, no water pollution, access to
fresh food without additives, etc.

Now consider the counties in which the cancer rates are highest. These are also mostly rural,
sparsely populated and located in traditionally Republican states. It is easy to infer that the
high cancer rates are directly due to the poverty of rural areas: no access to quality
healthcare, a high-fat diet and too much alcohol or tobacco. But the rural lifestyle cannot
explain both very high and very low incidence of kidney cancer.

Imagine an urn filled with marbles:

 Half are red, half are white
 Large samples are more precise than small samples
 Small samples yield extreme results more often than large samples do
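A quick sketch of that last point, assuming marbles are drawn with replacement from a 50/50
urn (the sample sizes are just for illustration):

```python
# Chance that a sample is "extreme" (all one colour) when drawing with
# replacement from an urn that is half red, half white: 2 * (1/2)^n.
def p_all_same_colour(sample_size: int) -> float:
    return 2 * 0.5 ** sample_size

print(p_all_same_colour(4))  # 0.125   -> 12.5% of 4-marble samples look extreme
print(p_all_same_colour(7))  # ~0.0156 -> barely 1.6% of 7-marble samples do
```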

The associative machinery seeks causes and weaves a narrative.

Hot hands 

We have misperceptions of randomness in basketball. The fact that players occasionally
acquire a ‘hot hand’ is generally accepted by players, coaches and fans. The inference is
irresistible: a player scores three or four baskets in a row and you cannot help forming the
causal judgement that the player is now hot, with a temporarily increased propensity to
score. Analysis of thousands of sequences of shots led to a disappointing conclusion: there is
no such thing as a hot hand in professional basketball. Some players are more accurate than
others, but the sequences of successes and missed shots satisfy all the tests of randomness.
The hot hand is a widespread cognitive illusion.

Statistics produce many observations that appear to beg for causal explanations but do not
lend themselves to such explanations. Many facts of the world are due to chance. Causal
explanations of chance events are inevitably wrong.

ANCHORS

Daniel once rigged a wheel of fortune to stop only on 10 or 65, had participants write down
the number it landed on, and then asked two questions. This was before he wrote Thinking
Fast and Slow.

Is the % of African nations among UN members larger or smaller than the number you
wrote?

What is your best guess of the % of African nations in the UN?

Surely the wheel wouldn’t affect the answers? … But:

The average estimates of those who saw 10 and 65 were 25% and 45% respectively!

This phenomenon is called the anchoring effect. One form is a deliberate process of
adjustment, an operation of System 2; another occurs by priming, an automatic
manifestation of System 1. It explains why you will drive too fast when moving from a
freeway onto smaller streets, or why, if a kid turns the music in their room down from an
exceptionally high volume, the resulting ‘reasonable’ volume will still be much higher than
normal, because of the original anchor.

System 1 understands sentences by trying to make them true, and the selective activation of
compatible thoughts produces a family of systematic errors that make us gullible and prone
to believe too strongly whatever we believe. System 1 tries its best to construct a world in
which the anchor number is the true number. This is one of the manifestations of
associative coherence.

Random anchors 
The power of random anchors has been demonstrated in unsettling ways. German judges
with an average of more than 15 years of experience first read a description of a woman
who had been caught shoplifting, then rolled a pair of dice that were loaded to land on
either a 3 or a 9.

As soon as the dice came to a stop, the judges were asked how long the woman’s sentence
should be:

 Those who had rolled a 9 said, on average, they would sentence her to 8 months
 Those who rolled a 3 would sentence her to 5 months
 The anchoring effect was 50%
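That 50% comes from the anchoring index Kahneman describes: the difference between the
average judgements divided by the difference between the anchors. A quick sketch:

```python
# Anchoring index = (difference in mean judgements) / (difference in anchors)
high_anchor, low_anchor = 9, 3        # the two loaded dice rolls
high_sentence, low_sentence = 8, 5    # average sentences given, in months

anchoring_index = (high_sentence - low_sentence) / (high_anchor - low_anchor)
print(f"{anchoring_index:.0%}")  # 50%: judgements moved half the anchor distance
```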

Use and abuse of anchors

Anchoring effects explain why arbitrary rationing is an effective marketing ploy.

At a supermarket, there was a promotion for Campbell’s soup at 10% off the regular price:

 Some days it had a limit of 12 per person
 Some days it had no limit per person

Shoppers purchased an average of 7 cans when the limit was in force, twice as many as they
bought when there was no limit!

Negotiating in a bazaar 

When you negotiate for the first time in Bali, the initial anchor has a powerful effect. If the
other side has made an outrageous proposal, you should not come back with an equally
outrageous counteroffer, creating a gap that will be too difficult to bridge in further
negotiations. Instead you should make a scene, storm out or threaten to do so, making it
clear to yourself and the other side that you will not continue the negotiation with that
number on the table.

AVAILABILITY BIASES

The availability heuristic, like other heuristics of judgement from Thinking Fast and Slow,
substitutes one question for another: you wish to estimate the size of a category or the
frequency of an event, but you report an impression of the ease with which instances come
to mind.

A plane crash that attracts media coverage will temporarily alter your feelings about the
safety of flying. Accidents stay on your mind for a while; after you see a car burning on the
side of the road, you conclude the world is a more dangerous place.

Study of personal contributions within marriages

When people are asked:

“How large was your personal contribution to keeping the place tidy, in percentages?”

The self-estimated contributions add up to more than 100%. The explanation is a simple
availability bias:

 Both spouses remember their own individual efforts and contributions much more clearly
than those of the other, and the difference in availability leads to a difference in judged
frequency
 The same bias appears when members of a collaborative team are each asked to estimate
their share of the contribution

ASSOCIATIVE COHERENCE

Benefit and risk (associative coherence) 

There is a high negative correlation between the level of benefit and the level of risk people
attribute to technologies (in reality the two often go together: nuclear power, for example,
offers high benefit and carries high risk).

When people were favourably disposed towards a technology, they rated it as offering large
benefits and imposing little risk. When they disliked a technology, they could think only of
its disadvantages; few advantages came to mind.

“The emotional tail wags the rational dog” – Jonathan Haidt


The affect heuristics simplifies our lives by creating a world that is much tidier than reality.
Good technologies have fewer costs in the imaginary world we inhabit, bad technologies
have no benefits and all decisions are easy

AVAILABILITY CASCADE

This was a really intriguing idea presented in Thinking Fast and Slow. The importance of an
idea is often judged by the fluency (and emotional charge) with which it comes to mind. An
availability cascade is a self-sustaining chain of events, which may start from media reports
of a relatively minor event and lead all the way to public panic and large-scale government
action:

 On some occasions, a media story about a risk catches the attention of a segment of the
public which becomes aroused and worried
 The emotional reaction becomes a story in itself, prompting additional coverage in the
media, which in turn produces greater concern and involvement
 The cycle is sometimes sped along deliberately by ‘availability entrepreneurs’: individuals
or organisations who work to ensure a continual flow of worrying news
 The danger is increasingly exaggerated as the media compete for attention-grabbing
headlines
 Scientists and others who try to dampen the fear attract little attention, most of it
hostile
 The issue becomes politically important because it’s on everyone’s mind, and the
response of the political system is guided by the intensity of public sentiment
 Other risks, and other ways the resources could be applied for the public good, all fade
into the background

PROBABILITY NEGLECT

We are unable to deal with small risks: we either ignore them altogether or give them far
too much weight, with nothing in between. The amount of concern is not adequately
sensitive to the probability of harm; you are imagining the numerator – the tragic story you
saw on the news – and not thinking about the denominator.

The combination of probability neglect and the social mechanisms of availability cascades
leads to gross exaggeration of minor threats, sometimes with important consequences.

PROBABILITY NEGLECT + AVAILABILITY CASCADE = TERRORISM OPPORTUNITY


Terrorists are the best practitioners of inducing availability cascades. The number of
casualties from terror attacks is very small relative to other causes of death; it doesn’t come
close to coronary heart disease. But the gruesome images, endlessly repeated in the media,
are readily available to the mind.

Terrorism speaks directly to system 1 from Thinking Fast and Slow.

BASE RATE NEGLECT from Thinking Fast and Slow

Here is a simple puzzle from Thinking Fast and Slow:

“Tom W is a graduate student. Rank how likely it is that he is studying each of the following fields, from 1 to 6:”

 Business administration
 Computer science
 Engineering
 Law or medicine
 Library science
 Physical science

What is most likely?

The question is easy: you knew immediately that the relative size of enrolment in the
different fields is the key to the solution. To decide whether a marble is more likely to be
green or red, all you need to know is how many marbles of each colour are in the urn.

Next comes a task that has nothing to do with base rates:

Tom W is of high intelligence, although lacking in true creativity. He needs order and clarity.
His writing is rather dull, occasionally enlivened by corny puns. He has little sympathy for
others and doesn’t enjoy interacting with them, but he has a deep moral sense.

The sketch skews what everyone chooses as his likely field towards computer science or
engineering – you probably did the same.

Tom W was intentionally designed as an anti-base-rate character: a good fit to the small
fields and a poor fit to the most populated specialties.

System 1 generates an impression of similarity without intending to do so. Although it is
common, prediction by representativeness is not statistically optimal. Michael Lewis’s
Moneyball is about the inefficiency of this mode of prediction. Professional baseball scouts
traditionally forecast the success of possible players in part by their build and look. Billy
Beane made the unpopular decision to overrule his scouts and select players by the
statistics of past performance instead. The players the A’s picked were inexpensive, because
other teams had rejected them for not looking the part. The team got excellent results at
low cost.

For example: you see a person reading The New York Times on the New York subway. Which
is a better bet?

 She has a PhD
 She does not have a college degree

Representativeness would tell you to bet on the PhD, but the second option is the better
bet: far more non-graduates than PhDs ride the New York subway.
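A back-of-the-envelope Bayes calculation makes the point; all of the numbers below are
invented for illustration:

```python
# Invented numbers: base rates of riders, and how likely each group is to be
# reading the Times. The likelihood strongly favours the PhD...
p_phd, p_no_degree = 0.02, 0.30
p_times_given_phd, p_times_given_no_degree = 0.40, 0.05

# ...but Bayes' rule weighs likelihood by base rate:
w_phd = p_phd * p_times_given_phd                    # 0.008
w_no_degree = p_no_degree * p_times_given_no_degree  # 0.015

print(w_phd < w_no_degree)  # True: "no degree" is still the better bet
```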

Similarly, if you are asked whether a woman described as a shy poetry lover is more likely to
study Chinese literature or business administration, choose business administration: the
base rate of business students is far higher.

CONJUNCTION FALLACY

Linda is 31, single, outspoken, bright, majored in philosophy, is deeply concerned with
issues of social justice and took part in anti-nuclear demonstrations. Rank the following by
probability:

 Linda is a teacher in an elementary school
 Linda works in a bookstore and takes yoga classes
 Linda is active in the feminist movement
 Linda is a psychiatric social worker
 Linda is a bank teller
 Linda is an insurance salesperson
 Linda is a bank teller and is active in the feminist movement

Is Linda more likely to be a bank teller, or a bank teller who is active in the feminist
movement?

Everyone agrees Linda fits the category of feminist bank teller better than that of bank
teller; adding that detail to the description makes for a more coherent story.

But this is a failure of System 2. Participants had a fair opportunity to detect the relevance
of a logical rule, since both outcomes were included in the same ranking:

 89% of undergraduates violated the logic of probability
 You’d think those in a Stanford graduate class would do better
 But 85% of them also ranked “feminist bank teller” as more likely than “bank teller”
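The logical rule being violated is that a conjunction can never be more probable than either
of its parts. A quick sketch with made-up probabilities:

```python
# A conjunction can never be more probable than either of its parts:
# P(teller and feminist) = P(teller) * P(feminist | teller) <= P(teller).
p_teller = 0.05            # made-up probability that Linda is a bank teller
p_feminist_given = 0.60    # made-up: most tellers fitting Linda are feminists

p_feminist_teller = p_teller * p_feminist_given
print(p_feminist_teller <= p_teller)  # True, whatever the made-up numbers are
```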

This was very controversial, as even students at famous universities were getting it badly wrong.

REGRESSION TO THE MEAN

Danny was once teaching air force flight instructors that rewarding improved performance
works better than punishing mistakes.

Someone in the crowd disagreed: “On many occasions I’ve praised flight cadets for clean
execution of an aerobatic maneuver. The next time, they usually do worse… On the other
hand, I scream at cadets after bad execution, and they usually do better.” What he had
actually observed was regression to the mean.

The changes were due to random fluctuations in the quality of performance. Naturally, the
instructor praised a cadet after an execution that was better than average, but the cadet had
been lucky to perform so far above his own average, so there was mostly downside to follow.
The instructor praised him, the cadet performed worse, and the instructor wove a story about
the effect of his comment. The more extreme the original score, the more regression we
expect, because an extremely good score suggests a very lucky day.
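A small simulation of the idea, under the toy assumption that every flight’s score is fixed
skill plus random noise (the “great”/“awful” thresholds are arbitrary):

```python
import random

random.seed(0)
# Toy model: every flight's score is fixed skill (here 0) plus pure noise,
# so praise and punishment have no causal effect at all.
pairs = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(100_000)]

after_great = [nxt for cur, nxt in pairs if cur > 1.5]   # flights that earned praise
after_awful = [nxt for cur, nxt in pairs if cur < -1.5]  # flights that earned a scream

# Both averages come out near 0: extreme flights are followed by ordinary ones,
# which looks exactly like "praise backfires, punishment works".
print(sum(after_great) / len(after_great))
print(sum(after_awful) / len(after_awful))
```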

HALO EFFECT

Good stories provide a simple and coherent account of people’s actions and intentions. You
are always ready to interpret behaviour as a manifestation of general propensities and
personality traits – causes that you readily match to effects. If we think a baseball pitcher is
handsome, we are likely to rate him as better at throwing the ball.

Halos can also be negative.

The halo effect helps keep explanatory narratives simple and coherent by exaggerating the
consistency of evaluations: good people only do good things and bad people are all bad.

The statement “Hitler loved dogs and little children” is shocking no matter how many times
you hear it, because any trace of kindness in someone so evil violates the expectations of
the halo effect.
 

NARRATIVE FALLACY

Nassim Taleb, in The Black Swan, introduced the narrative fallacy to describe how flawed
stories of the past shape our views of the world and our expectations of the future.
Narrative fallacies arise inevitably from our continuous attempt to make sense of the world.
The explanatory stories that people find compelling are simple, are concrete rather than
abstract, assign a larger role to talent, stupidity, and intentions rather than to luck, and
focus on a few striking events that happened rather than the countless events that failed to
happen.

A compelling narrative fosters an illusion of inevitability. Consider how Google turned into
the giant of the technology industry. Two creative graduate students in the computer
science department at Stanford came up with a superior way of searching the internet. A
detailed history would specify the decisions of the founders, but for our purposes it suffices
to say that every choice they made had a very good outcome. A more complete narrative
would also describe the actions of the firms that Google defeated.

The competitors would appear blind, slow and inadequate. This is a simple, good story. If
you fleshed it out, you could probably write a book on what made Google succeed, full of
apparently valuable business lessons.

Unfortunately, there is good reason to believe your belief in this story is largely illusory. The
ultimate test of an explanation is whether it would have made the event predictable in
advance. The human mind does not deal well with nonevents. The fact that many of the
important events that did occur involved choices further tempts you to exaggerate the role
of skill and underestimate the part that luck played in the outcome.

At work here is the powerful WYSIATI rule:

 You cannot help dealing with the limited information you have as if it were all there is
to know
 You build the best possible story from the information available to you, and if it is a
good story, you believe it

Paradoxically it is easier to construct a coherent story when you know little, when there are
fewer pieces to fit into the puzzle.

HINDSIGHT BIAS
According to the book Thinking Fast and Slow, many psychologists have studied what
happens when people change their minds.

Choosing a topic on which minds are not completely made up – say, the death penalty – the
experimenter measures people’s attitudes:

 Participants are then exposed to a persuasive pro or con message
 Their attitudes are measured again; they have usually been somewhat persuaded
 Finally they are asked to report the opinion they held beforehand, which turns out to be
very difficult

When asked to reconstruct former beliefs, they retrieve current ones instead.

Your inability to reconstruct past beliefs will inevitably cause you to underestimate the
extent to which you were surprised by past events. This is known as the “I knew it all along”
effect, or hindsight bias.

Hindsight bias has pernicious effects on the evaluation of decision makers. It leads
observers to assess the quality of a decision not by whether the process was sound but by
whether its outcome was good or bad. Consider a low-risk surgical intervention in which an
unpredictable accident causes the patient’s death. After the fact, a jury will be prone to
believe the operation was actually risky.

Hindsight is especially unkind to decision makers who act as agents for others – like CEOs
or coaches. We are prone to blame decision makers for good decisions that worked out
badly, and to give them too little credit for successful moves that appear obvious only after
the fact.

FORMULAS ARE BETTER THAN INTUITION in Thinking Fast and Slow

It is generally agreed that the effect of vintage can be due to variations in the weather
during the grape-growing season.

The best wines are usually produced when the summer is warm and dry, and they are also
helped by wet springs.

A scientist converted this knowledge into a statistical formula that predicts the price of a
wine – at any particular age – from three features of the weather: the average temperature
over the growing season, the amount of rain at harvest time, and the total rainfall during
the previous winter. His formula predicts prices much better than the experts do.
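A minimal sketch of what such a formula looks like in code. The structure follows the text
(three weather inputs, one simple linear combination), but the coefficients are invented
placeholders, not the scientist’s published ones:

```python
def predicted_wine_quality(growing_season_temp_c: float,
                           harvest_rain_mm: float,
                           winter_rain_mm: float) -> float:
    # Signs follow the text: warm growing seasons and wet winters help,
    # rain at harvest hurts. Coefficients are made up for illustration only.
    return (-10.0
            + 0.60 * growing_season_temp_c
            - 0.004 * harvest_rain_mm
            + 0.001 * winter_rain_mm)

print(predicted_wine_quality(17.5, 50, 700) >
      predicted_wine_quality(15.0, 300, 300))  # True: better weather, better vintage
```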

Why are experts inferior to algorithms?

Experts try to think outside the box and make complex combinations of features.
Complexity may occasionally work, but it often reduces validity; simple combinations of
features are usually better. Several studies show that human decision makers are much
worse than formulas.

Experienced radiologists who evaluate X-rays as normal or abnormal contradict themselves
20% of the time. This widespread inconsistency is probably due to the extreme context
dependency of System 1.

We know from studies of priming that unnoticed stimuli in our environment have a
substantial influence on our thoughts and actions (a radiologist could, say, have heard
about someone dying of breast cancer on a podcast the day before).

Research shows that to maximise predictive accuracy, final decisions should be left to
formulas, especially in low-validity environments.

Example – Dawes’s formula for marital stability:

Marital stability = frequency of lovemaking − frequency of quarrels

This comes from the book Thinking Fast and Slow.
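A minimal sketch of the formula in code; the example frequencies are made up:

```python
# Dawes showed that "improper" linear models with simple, equal weights
# predict remarkably well. His marital-stability example, in code:
def marital_stability(lovemaking_per_week: float, quarrels_per_week: float) -> float:
    return lovemaking_per_week - quarrels_per_week

print(marital_stability(3, 1))  #  2 -> positive score: stable
print(marital_stability(1, 4))  # -3 -> negative score: in trouble
```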

PLANNING FALLACY  from Thinking Fast and Slow

The planning fallacy is a bias of project planning: plans and forecasts that

 Are unrealistically close to best-case scenarios
 Could be improved by consulting the statistics of similar cases

E.g. in July 1997, the Scottish Parliament estimated that its new building in Edinburgh
would cost £40 million:

 By June 1999, the budget was £109 million
 In April 2000, legislators imposed a £195 million “cap on costs”
 By November 2001, they demanded an estimate of the final cost, which was set at £241
million
 The estimated cost rose twice in 2002, ending the year at £294.6 million
 It rose three more times in 2003, reaching £375 million by June
 The building was finally completed in 2004, at a cost of around £431 million

Or, e.g., kitchens: a survey of homeowners who had remodelled their kitchens found they
had expected the job to cost $18,658 on average, but in fact paid an average of $38,769.

The greatest responsibility for avoiding the planning fallacy lies with the decision makers
who approve the plan. If they do not recognise the need for an outside view, they commit a
planning fallacy.

Taking the outside view (consulting the statistics of similar cases, or someone with relevant
experience) is the cure for the planning fallacy.

PROSPECT THEORY

You are offered a gamble on the toss of a coin:

If the coin shows tails, you lose $100.

If it shows heads, you win $150.

Is this attractive?

For this you need to balance the psychological benefit of getting $150 against the cost of
losing $100. For most people, the fear of losing $100 is greater than the hope of gaining
$150.

You can measure your loss aversion by asking yourself: what is the smallest gain I need to
balance an equal chance of losing $100? For many people the answer is around $200, a
loss-aversion ratio of about 2:1.

Good poker players need their ratio to be close to 1:1, so they can exploit other players
whose attitudes are asymmetric.
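A minimal sketch of that calculation, assuming a simple linear value function and a typical
loss-aversion coefficient of about 2:

```python
# Prospect-theory-style value of the coin toss, with a linear value function
# and loss-aversion coefficient lam (about 2 is a typical estimate).
def gamble_value(gain: float, loss: float, lam: float) -> float:
    return 0.5 * gain - 0.5 * lam * loss

print(gamble_value(150, 100, lam=2.0))  # -25.0: feels like a losing proposition
print(gamble_value(150, 100, lam=1.0))  #  25.0: a loss-neutral player takes it
```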

LOSS AVERSION & ENDOWMENT EFFECT


In labour negotiations, it is well understood by both sides that the reference point is the
existing contract and that the negotiations will focus on mutual demands for concessions
relative to the reference point. The role of loss aversion in bargaining is also well
understood: making concessions hurts. If you consider moving jobs or locations, you weigh
the new features as gains or losses relative to the old ones. Loss aversion gives preference
to the status quo.

Suppose you had a ticket to a sold-out concert by your favourite band:

 You bought it at the regular price of $200, which was tough
 A few days beforehand, an avid fan offers you $500
 Would you sell?

Most people’s lowest selling price is about $3,000. That gap is the endowment effect: once
you own something, giving it up feels like a loss.

SUNK COST FALLACY

The emotions that people attach to the state of their mental accounts are not
acknowledged in standard economic theory. If you buy a movie ticket and the movie sucks,
the fact that you paid for it shouldn’t matter: you should just walk out.

Staying because you paid is the sunk cost fallacy.

A rational decision maker is interested only in the future consequences of current
investments; justifying earlier mistakes shouldn’t be of concern.

Imagine a company that has already spent $50 million on a project. The project is now
behind schedule, and an additional investment of $60 million is required to finish it. It
should be evaluated as a $60 million project now, and cancelled if it is expected to return
less than that additional $60 million.
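A tiny sketch of the rational rule; the expected-return figure is made up:

```python
# Only the future numbers matter to a rational decision maker.
sunk_cost = 50_000_000        # already spent: irrelevant to the decision
additional_cost = 60_000_000  # what finishing the project will cost from here
expected_return = 55_000_000  # hypothetical forecast for the finished project

# Rational rule: continue only if the future return beats the future cost.
decision = "continue" if expected_return > additional_cost else "cancel"
print(decision)  # cancel (despite the $50m already sunk)
```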

The sunk cost fallacy keeps people for too long in poor jobs, unhappy marriages and
unpromising research projects.

Thinking Fast and Slow: Two Selves

EXPERIENCING SELF VS THE REMEMBERING SELF

Experience and memory: how the idea of ‘the two selves’ relates to Thinking Fast and Slow.
How can experienced utility be measured? A British economist proposed measuring it with a
‘hedonimeter’:

 A device that keeps buzzing at different times of the day, asking how the present moment
feels

There was also a study of patients undergoing painful colonoscopies:

Patient A

A quick, painful experience that ends while the pain is still intense.

Patient B

The same peak intensity of pain, but with an added period at the end during which the pain
slowly diminishes.

When the procedures were over, all patients were asked how bad the pain had been, and
surprisingly the answers did not match the moment-by-moment totals.

The analysis revealed two findings:

Peak-end rule: 

The retrospective rating was predicted by the average of the pain at the worst moment and
the pain at the final moment.

Duration neglect: 

The duration of the procedure had no effect whatsoever on the ratings of total pain.
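A quick sketch of both findings, using made-up minute-by-minute pain ratings:

```python
# Made-up minute-by-minute pain ratings (0-10) for the two patients.
patient_a = [4, 7, 8, 8]                  # short procedure, ends at the peak
patient_b = [4, 7, 8, 8, 5, 4, 3, 2, 1]   # longer, but the pain trails off

def remembered_pain(ratings):
    # Peak-end rule: memory tracks the average of the worst and final moments
    return (max(ratings) + ratings[-1]) / 2

for name, p in (("A", patient_a), ("B", patient_b)):
    print(name, sum(p), remembered_pain(p))
# A: total 27, remembered 8.0 / B: total 42, remembered 4.5 ->
# B experienced more total pain but remembers the procedure as less bad.
```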

If the objective is to reduce the patient’s memory of pain, adding an extra interval of milder
pain at the end, so the experience trails off, would be a good strategy. If the objective is to
reduce the pain actually experienced, do the procedure quickly (even though patients will
be left with a worse memory).

The experiencing self asks: “Does it hurt right now?”

The remembering self asks: “How was it, on the whole?”


 

Thinking Fast and Slow

Those are our major takeaways from the legendary book: Thinking, Fast and Slow.
