1. Action Bias
The action bias was once evolutionarily adaptive and, as such, the impulse to act is
hardwired in us as a means of survival. Patterns of reinforcement and punishments
experienced throughout our lives cause us to continue to engage in this behavior.
Other factors contribute to the maintenance of the action bias, including prior
experiences where inaction caused us to fail. Additionally, overconfidence in our
ability to predict a favorable outcome and our desire to feel in control of our
circumstances both lead us to engage in the action bias with greater frequency.

Example 1 – Difficult diagnoses
When meeting with a patient who presents with unusual symptoms that are difficult
to diagnose but don’t seem to pose any immediate threat to their wellbeing, doctors
tend to engage in the action bias by choosing to run a full workup rather than
scheduling a follow-up appointment.
Example 2 – Investing
Factors such as panic, overconfidence, and a desire for control can lead us to make
poor decisions when it comes to our investments, resulting in over-trading or selling
low. These decisions stem from the action bias, as we feel compelled to do
something instead of patiently working towards a future goal.

2. Affect Heuristic
The affect heuristic can be explained by dual process theory, which states that we
have two distinct cognitive systems for decision making, one that is automatic and
one that is effortful. The affect heuristic is a product of the automatic system, as it
arises from our affective state. Our emotions can also alter our perception of the risks
and benefits of a certain outcome, which is another factor that leads to this heuristic.
Example 1 – Fear appeals
Public health campaigns have used the affect heuristic to deter people from engaging
in unhealthy behavior by sharing scary or disturbing information. In Canada, for
example, anti-smoking campaigns led to warnings about the consequences of smoking
and pictures of diseased gums and lungs being added to cigarette packages. A
survey found that the more negative emotions people felt in response to these
warning labels, the more likely they were to cut back on their smoking or even quit
altogether.
Example 2 – Interpreting statistics
Statistics presenting the probability of a certain event occurring can elicit
emotional responses from us. These responses can be manipulated by the way
the information is framed. For example, clinicians in a study were less willing to
discharge a psychiatric patient when told that 20 out of every 100 individuals like
him committed a violent act in the six months after being discharged than were
clinicians who were told that 20% of patients like him acted violently in that same
time frame. This is because the percentage format, 20%, brought to mind the image
of a single individual who had low odds of behaving violently, while the equivalent
frequency format, 20 out of 100, brought to mind the image of several people
committing violent acts. The latter elicited more negative affect in the clinicians,
thereby making them less willing to discharge the patient.

3. Anchoring Bias
There are two dominant theories behind anchoring bias. The first one, the anchor-
and-adjust hypothesis, says that when we make decisions under uncertainty, we start
by calculating some initial value and adjusting it, but our adjustments are usually
insufficient. The second one, the selective accessibility theory, says that anchoring
bias happens because we are primed to recall and notice anchor-consistent
information.

Example 1 – Anchors in the courtroom
In criminal court cases, prosecutors often demand a certain length of sentence for the
accused. Research shows these demands can become anchors that bias the judge’s
decision making.
Example 2 – Anchoring and portion sizes
The common tendency to eat more when faced with a larger portion might be
explained by anchoring. In one study, participants’ estimates of how much they
would eat were influenced by an anchoring portion size (large or small) they had
been told to imagine previously.

4. Availability Heuristic
The brain tends to minimize the effort necessary to complete routine tasks. When
making decisions — especially ones involving probability — certain memories and
knowledge jump out to replace the complicated task of calculating statistics. Some
memories leave a lasting impression because they connect to emotional triggers.
Others seem familiar because they align with the way we process the world, such as
recognizing words by their first letter.

Example 1 – Lottery winners
One buys lottery tickets because the lifestyle that follows a winning ticket comes to
mind easily and vividly, while the probability of winning is a complex calculation that
does not jump out while one is at the ticket counter.
Example 2 – Drug use and the media
Sensational news stories seem much more likely to occur than unremarkable (yet
dangerous) activities. The availability heuristic skews our fear towards events that
leave a lasting mental impression due to their graphic content or unexpected
occurrence, and away from comparably dangerous yet more probable events.

5. Bandwagon Effect
As an idea or belief increases in popularity, we are more likely to adopt it. The first
reason for this is that the bandwagon effect serves as a heuristic by allowing us to
make a decision quickly. We skip the long process of individual evaluation and rely
on other people to do it for us. Because widespread popularity is a sign that many
people are in favour of an idea or behavior, we decide to adopt it too. Second, to avoid
standing out and being excluded as a result, many of us support the behavior or ideas
of a group we find ourselves in. Third, we accept the majority opinion because we
want to be on the ‘winning side.’ It may be the case that we have evolved to
instinctively support popular beliefs because standing against the tide represented by
the majority can be disadvantageous at best and dangerous at worst.

Example 1 – Snowballing political campaigns
The bandwagon effect is thought to influence political elections as voters are drawn
to parties or candidates that they perceive as being popular and therefore likely to
win the election. A 2017 study done by German researchers looked into this
relationship by studying the effects of polling information on voter perceptions
surrounding a fictitious mayoral election. The results supported the influence of the
bandwagon effect, as polling information (i.e., perceived popularity) strongly
influenced whether participants expected a candidate to win.
Example 2 – Historical influence on medicine
The bandwagon effect can influence the decisions made by doctors. Many medical
procedures that have been widely practiced for periods in history have subsequently
been disproven. Doctors’ widespread use and support of them can be attributed to
their popularity at the time. Tonsillectomy is cited as a recent example of medical
bandwagons. Although the practice is said to be beneficial in some specific cases,
scientific support for the universal use it saw was lacking. Doctors were drawn to
tonsillectomy not on the basis of its effectiveness, but because they saw it was widely
used.

6. Base Rate Fallacy
There are multiple factors that contribute to the occurrence of the base rate fallacy.
One is the representativeness heuristic, which states that the extent to which an
event or object is representative of its category influences our probability judgments,
with little regard for base rates. Another is relevance, which suggests that we
consider specific information to be more relevant than general information, and
therefore selectively attend to individuating information over base rate information.
Example 1 – The cab problem
A classic explanation for the base rate fallacy involves a scenario in which 85% of
cabs in a city are blue and the rest are green. One night, a cab is involved in a hit and
run accident. A witness claims the cab was green; however, later tests show that they
correctly identify the color of a cab at night only 80% of the time. When asked what
the probability is that the cab involved in the hit and run was green, people tend to
answer that it is 80%. However, this ignores the base rate information that only 15%
of the cabs in the city are green. When taking all the information into consideration,
crunching the numbers shows that the likelihood that the witness was correct is
actually 41%.
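The 41% figure follows from applying Bayes’ theorem to the base rate and the witness’s reliability. A minimal sketch of the calculation, using only the numbers given in the example above:

```python
# Bayes' theorem applied to the cab problem (all numbers come from the example above).
p_green = 0.15       # base rate: 15% of the city's cabs are green
p_blue = 0.85        # base rate: 85% are blue
p_correct = 0.80     # the witness identifies a cab's color correctly 80% of the time

# Probability that the witness reports "green" at all:
# a green cab identified correctly, or a blue cab misidentified as green.
p_report_green = p_correct * p_green + (1 - p_correct) * p_blue   # 0.12 + 0.17 = 0.29

# Posterior probability that the cab really was green, given the report.
p_green_given_report = (p_correct * p_green) / p_report_green
print(f"{p_green_given_report:.0%}")   # 41%
```

The base rate does most of the work here: even a fairly reliable witness cannot fully overcome the fact that green cabs are rare, so the most likely explanation is still a blue cab.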
Example 2 – How much will you donate?
Participants in a study were asked how much of the five dollars they had been given
they would donate to a given charity. They were asked to make the same prediction
about their average peer. Next, they were presented with the actual donations of 13
other donors and given the chance to adjust their predictions. They adjusted their
predictions of their peers to match the base rate information but did not change their
predictions for themselves. When we have access to individuating information, we
assign it greater value than base rate information, which is why their ratings of
themselves stayed the same. However, participants did not have access to
individuating information about their peers and therefore relied on base rate
information instead.

7. Bounded Rationality
Acting with perfect rationality would require us not to be influenced by any
cognitive biases, to be able to access all possible information about potential
alternatives, and to have enough time to calculate and project the benefits and
detriments of each possible choice.
Since it is next to impossible to satisfy all of those conditions, we take shortcuts
and make decisions that satisfy us, even if they are not optimal.
Within our temporal and cognitive limitations, we make choices to the best of our
understanding and ability, meaning that we are still rational, but not perfectly so. 
Example 1 – Supply chain management
According to a model based on perfect economic rationality, company decision-
makers would make decisions for their supply-chain that would yield the greatest
profit. However, such a model would not take into account other factors like
reputation or sustainability. Many companies make decisions for their supply chain
where cost is but one of the factors that goes into the decision-making process,
leaving room for other influences like sustainability. Bounded rationality takes into
account some of the trade-offs that managers have to make, which means their
decisions do not always fall in line with perfect economic rationality.
Example 2 – Short-term temptations
Bounded rationality can cause us to make decisions that satisfy us in the short-term,
either because we are biased by immediate gratification, or because we do not have
the capacity or time to calculate the long-term costs of our decisions. This leads to
suboptimal decisions, such as buying an appliance with a lower purchase price that
costs us more over time in energy, or putting our preferences aside to receive a
reward sooner.
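To make the appliance trade-off concrete, here is a small illustrative calculation; the prices, energy costs, and lifespan are hypothetical and chosen only to show how a lower sticker price can still lose out over an appliance’s lifetime:

```python
# Hypothetical total-cost-of-ownership comparison between two appliances.
# All figures are invented for illustration; only the structure of the comparison matters.
cheap_price, cheap_energy_per_year = 400, 120           # lower sticker price, higher running cost
efficient_price, efficient_energy_per_year = 550, 60    # higher sticker price, lower running cost
years_of_use = 10

cheap_total = cheap_price + cheap_energy_per_year * years_of_use              # 400 + 1200 = 1600
efficient_total = efficient_price + efficient_energy_per_year * years_of_use  # 550 + 600 = 1150

# The "cheaper" appliance ends up costing more once energy use is counted.
print(cheap_total, efficient_total)   # 1600 1150
```

A boundedly rational shopper rarely runs this calculation in the store; the sticker price is immediate and salient, while the energy costs are spread over years.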

8. Cognitive Dissonance
Cognitive dissonance occurs when there is an uncomfortable tension between two or
more beliefs that are held simultaneously. This most commonly occurs when our
behavior does not align with our attitudes – we believe one thing, but act against
that belief. The resulting discomfort motivates us to pick between beliefs
by rationalizing one and rejecting or delegitimizing the other(s). We tend to pick the
belief or idea that is most ingrained in us, which is the one we already hold. It is
natural for us to look for internal psychological consistency, as it forms our identity
and allows us to make sense of the world.
Example 1 – Avoiding the doctor
A 2016 analysis of two studies by researchers Michael Ent and Mary Gerend details
our reluctance to undergo beneficial medical screenings. In one of the studies,
participants were told about an unpleasant test for a virus. Half of them were told
they qualified for testing, and the other half was told they did not. Results showed
that eligible participants reported less favorable attitudes toward the unpleasant
screening than those who were ineligible. Participants were caught in a clash
between the obligation they felt to maintain their health through screening and the
discomfort of going through the screening. To deal with this dissonance,
many participants looked down on the screening.
Example 2 – Not listening to the other side
We have a tendency to interpret information given by our political adversaries in a
way that meshes with our own political convictions. A 2002 study investigated the
tendency for political enemies to derogate each other’s compromise proposals by
conducting studies on Palestinian-Israeli perceptions. Israeli Jews were found to
evaluate a peace plan less favorably when it was attributed to the Palestinians than
when it was attributed to their own government. In reality, the peace plan was
Israeli-authored. One reason for this, the researchers conclude, is cognitive
dissonance. Adversaries may devalue or reject peace proposals in order to rationalize
their history and beliefs.

9. Commitment Bias
When our past decisions lead to unfavorable outcomes, we feel the need to justify
them to ourselves as well as to others. This leads us to develop an argument in
support of the behavior, which can cause us to change our attitudes towards it.

Example 1 – Sunk cost fallacy
Sunk cost fallacy is a form of commitment bias. It refers to how we feel the need to
follow through with something once we’ve invested time and/or money into it. 
Example 2 – DARE
Despite the evidence against its effectiveness, the government continues to provide
funding for the Drug Abuse Resistance Education (DARE) Program. This is an
example of commitment bias, as it illustrates our continued commitment to a cause,
in spite of its unfavorable outcomes.

10. Confirmation Bias
Confirmation bias is a cognitive shortcut we use when gathering and interpreting
information. Evaluating evidence takes time and energy, and so our brain looks for
such shortcuts to make the process more efficient. We look for evidence that best
supports our existing hypotheses because the most readily available hypotheses are
the ones we already have. Another reason why we sometimes show confirmation bias
is that it protects our self-esteem. No one likes feeling bad about themselves – and
realizing that a belief they valued is false can have this effect. As a result, we often
look for information that supports rather than disproves our existing beliefs.

Example 1 – Blindness to our own faults
A 1979 study by Stanford researchers found that after being confronted with equally
compelling evidence in support of capital punishment and evidence that refuted it,
subjects reported feeling more committed to their original stance on the issue. The
net effect of having their position challenged was a re-entrenchment of their existing
beliefs.
Example 2 – Establishing personalized networks online
Modern preference algorithms have a “filter bubble effect,” which is an example of
technology amplifying and facilitating our cognitive tendency toward confirmation
bias. Websites use algorithms to predict the information a user wants to see, and
then provide information accordingly. We normally prefer content that confirms our
beliefs because it requires less critical reflection. So, filter bubbles might exclude
information that clashes with your existing opinions from your online experience.
Filter bubbles and the confirmation bias they produce have been shown to influence
elections and may inhibit the constructive discussion that democracy rests on.
