1. Action Bias
The action bias was once evolutionarily adaptive; as such, the impulse to act is
hardwired in us as a means of survival. Patterns of reinforcement and punishment
experienced throughout our lives lead us to keep engaging in this behavior.
Other factors contribute to the maintenance of the action bias, including prior
experiences in which inaction caused us to fail. Additionally, overconfidence in our
ability to predict a favorable outcome and our desire to feel in control of our
circumstances both lead us to exhibit the action bias more frequently.
2. Affect Heuristic
The affect heuristic can be explained by dual process theory, which states that we
have two distinct cognitive systems for decision making, one that is automatic and
one that is effortful. The affect heuristic is a product of the automatic system, as it
arises from our affective state. Our emotions can also alter our perception of the risks
and benefits of a certain outcome, which is another factor that leads to this heuristic.
Example 1 – Fear appeals
Public health campaigns have used the affect heuristic to deter people from engaging
in unhealthy behavior by sharing frightening or disturbing information. In Canada, for
example, anti-smoking campaigns led to warnings about the consequences of smoking,
along with pictures of diseased gums and lungs, being added to cigarette packages. A
survey found that the more negative emotions people felt in response to these
warning labels, the more likely they were to cut back on their smoking or even quit
altogether.
Example 2 – Interpreting statistics
Statistics presenting the probability of a certain event can elicit emotional
responses from us, and these responses can be manipulated by the way
the information is framed. For example, clinicians in a study were less willing to
discharge a psychiatric patient when told that 20 out of every 100 individuals like
him committed a violent act in the six months after being discharged than were
clinicians who were told that 20% of patients like him acted violently in that same
time frame. This is because the percentage format brought to mind the abstract
image of a single individual who had low odds of behaving violently, while the
equivalent frequency, 20 out of 100, brought to mind the image of several people
committing violent acts. The latter elicited more negative affect in the clinicians,
thereby making them less willing to discharge the patient.
3. Anchoring Bias
There are two dominant theories behind the anchoring bias. The first, the anchor-
and-adjust hypothesis, says that when we make decisions under uncertainty, we start
from an initial value (the anchor) and adjust away from it, but our adjustments are
usually insufficient. The second, the selective accessibility theory, says that
anchoring bias happens because we are primed to recall and notice anchor-consistent
information.
4. Availability Heuristic
The brain tends to minimize the effort necessary to complete routine tasks. When
making decisions — especially ones involving probability — certain memories and
knowledge jump out to replace the complicated task of calculating statistics. Some
memories leave a lasting impression because they connect to emotional triggers.
Others seem familiar because they align with the way we process the world, such as
recognizing words by their first letter.
5. Bandwagon Effect
As an idea or belief increases in popularity, we are more likely to adopt it. The first
reason for this is that the bandwagon effect serves as a heuristic by allowing us to
make a decision quickly: we skip the long process of individual evaluation and rely
on other people to have done it for us. Because widespread popularity signals that
many people are in favour of an idea or behavior, we take it as evidence of its value
and adopt it ourselves. Second, to avoid standing out and being excluded as a result,
many of us support the behavior or ideas of a group we find ourselves in. Third, we
accept the majority opinion because we
want to be on the ‘winning side.’ It may be the case that we have evolved to
instinctively support popular beliefs because standing against the tide represented by
the majority can be disadvantageous at best and dangerous at worst.
7. Bounded Rationality
To act according to perfect rationality would require us to not be influenced by any
cognitive biases, to be able to access all possible information about potential
alternatives, and have enough time to calculate and project the benefits and
detriments of each possible choice.
Since it is next to impossible to make decisions that satisfy all those conditions, we
take shortcuts and settle for decisions that satisfy us, even if they are not optimal.
Within our temporal and cognitive limitations, we make choices to the best of our
understanding and ability, meaning that we are still rational, but not perfectly so.
Example 1 – Supply chain management
According to a model based on perfect economic rationality, company decision-
makers would make supply-chain decisions that yield the greatest
profit. However, such a model would not take into account other factors like
reputation or sustainability. Many companies make decisions for their supply chain
where cost is but one of the factors that goes into the decision-making process,
leaving room for other influences like sustainability. Bounded rationality accounts
for some of the trade-offs managers have to make, which means their
decisions do not always fall in line with perfect economic rationality.
Example 2 – Short-term temptations
Bounded rationality can cause us to make decisions that satisfy us in the short-term,
either because we are biased by immediate gratification, or because we do not have
the capacity or time to calculate the long-term costs of our decisions. This leads to
decisions that are not optimal, such as buying appliances that have a lower initial
price but cost us more over time in energy, or putting our long-term preferences
aside to receive a reward sooner.
8. Cognitive Dissonance
Cognitive dissonance occurs when there is an uncomfortable tension between two or
more beliefs that are held simultaneously. It most commonly occurs when our
behavior does not align with our attitudes – we believe one thing, but act
against it. The resulting discomfort motivates us to pick between beliefs
by rationalizing one and rejecting or delegitimizing the other(s). We tend to pick the
belief or idea that is most ingrained in us, which is the one we already hold. It is
natural for us to look for internal psychological consistency, as it forms our identity
and allows us to make sense of the world.
Example 1 – Avoiding the doctor
A 2016 analysis of two studies by researchers Michael Ent and Mary Gerend details
our reluctance to undergo beneficial medical screenings. In one of the studies,
participants were told about an unpleasant test for a virus. Half of them were told
they qualified for testing, and the other half were told they did not. Results showed
that eligible participants reported less favorable attitudes toward the unpleasant
screening than those who were ineligible. Eligible participants were caught in a clash
between the obligation they felt to maintain their health through screening
and the discomfort of undergoing it. To resolve this dissonance,
many of them devalued the screening.
Example 2 – Not listening to the other side
We have a tendency to interpret information given by our political adversaries in a
way that meshes with our own political convictions. A 2002 study investigated the
tendency for political enemies to derogate each other’s compromise proposals by
conducting studies on Palestinian-Israeli perceptions. Israeli Jews were found to
evaluate a peace plan less favorably when it was attributed to the Palestinians than
when it was attributed to their own government, even though the plan was in fact
Israeli-authored. One reason for this, the researchers concluded, is cognitive
dissonance. Adversaries may devalue or reject peace proposals in order to rationalize
their history and beliefs.
9. Commitment Bias
When our past decisions lead to unfavorable outcomes, we feel the need to justify
them to ourselves as well as to others. As a result, we develop arguments in
support of the original behavior, which can cause us to change our attitudes toward it.