
Confirmation Bias: This is favoring information that conforms to your existing beliefs and discounting evidence that does not conform.
Availability Heuristic: This is placing greater value on information that comes to your mind quickly. You give
greater credence to this information and tend to overestimate the probability of similar things
happening in the future.
Halo Effect: Your overall impression of a person influences how you feel and think about their character.
This especially applies to physical attractiveness influencing how you rate their other qualities.
Self-Serving Bias: This is the tendency to blame external forces when bad things happen and give yourself
credit when good things happen. When you win a poker hand, it is due to your skill at reading the other players
and knowing the odds; when you lose, it is because you were dealt a poor hand.
Attentional Bias: This is the tendency to pay attention to some things while simultaneously ignoring others.
When making a decision on which car to buy, you may pay attention to the look and feel of the exterior and
interior, but ignore the safety record and gas mileage.
Actor-Observer Bias: This is the tendency to attribute your own actions to external causes while attributing
other people's behaviors to internal causes.
The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be
heavily influenced by whether we are the actor or the observer in a situation. When it comes to our own
actions, we are often far too likely to attribute things to external influences. You might complain that you
botched an important meeting because you had jet lag or that you failed an exam because the teacher posed
too many trick questions.
When it comes to explaining other people’s actions, however, we are far more likely to attribute their
behaviors to internal causes. A colleague screwed up an important presentation because he’s lazy and
incompetent (not because he also had jet lag) and a fellow student bombed a test because she lacks diligence
and intelligence (and not because she took the same test as you with all those trick questions).
Functional Fixedness: This is the tendency to see objects as only working in a particular way. If you don't have
a hammer, you never consider that a big wrench can also be used to drive a nail into the wall. You may think
you don't need thumbtacks because you have no corkboard on which to tack things, but not consider their
other uses. This could extend to people's functions, such as not realizing that a personal assistant has the skills
to take on a leadership role.
Anchoring Bias: This is the tendency to rely too heavily on the very first piece of information you learn. If you
learn the average price for a car is a certain value, you will think any amount below that is a good deal,
perhaps not searching for better deals. You can use this bias to set the expectations of others by putting the
first information on the table for consideration.
Misinformation Effect: This is the tendency for post-event information to interfere with the memory of the
original event. It is easy to have your memory influenced by what you hear about the event from others.
Knowledge of this effect has led to a mistrust of eyewitness information.
False Consensus Effect: This is the tendency to overestimate how much other people agree with you.
Optimism Bias: This bias leads you to believe that you are less likely to suffer from misfortune and more likely
to attain success than your peers.
Hindsight Bias: This is a common cognitive bias that involves the tendency of people to see events, even
random ones, as more predictable than they are.
Availability cascade: A self-reinforcing process in which a collective belief gains more and more plausibility
through its increasing repetition in public discourse (or "repeat something long enough and it will become
true")
Bandwagon effect: The tendency to do (or believe) things because many other people do (or believe) the
same. Related to groupthink and herd behavior.
Bias blind spot: The tendency to see oneself as less biased than other people, or to be able to identify more
cognitive biases in others than in oneself.
Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion
rather than how strongly they support that conclusion. A person is more likely to accept an argument that
supports a conclusion aligned with their values, beliefs and prior knowledge, while rejecting counterarguments
to the conclusion. Belief bias is an extremely common and therefore significant form of error; we
can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence
various reasoning tasks, including conditional reasoning, relational reasoning and transitive reasoning.

The Dunning–Kruger effect is a cognitive bias in which people mistakenly assess their cognitive ability as
greater than it is. It is related to the cognitive bias of illusory superiority and comes from the inability of people
to recognize their lack of ability. Without the self-awareness of metacognition, people cannot objectively
evaluate their competence or incompetence.

Groupthink: The psychological phenomenon that occurs within a group of people in which the desire for
harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. Group
members try to minimize conflict and reach a consensus decision without critical evaluation of alternative
viewpoints by actively suppressing dissenting viewpoints, and by isolating themselves from outside influences.

The Gambler’s Fallacy: The tendency to think that future probabilities are altered by past events, when in
reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large
numbers. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming
out on the sixth flip is much greater than heads."[54]
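To make the arithmetic concrete, here is a minimal simulation sketch (a hypothetical illustration, not part of the original text): it flips a fair coin many times and checks the flip that immediately follows every run of five consecutive heads. The next flip still comes up heads about half the time, exactly as the unchanged 50/50 probability predicts.

```python
import random

# Hypothetical sketch: simulate a fair coin and measure how often heads follows
# a run of five consecutive heads. The gambler's fallacy says tails should be
# "due"; independence says the next flip is still ~50/50.
random.seed(42)

flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Every flip that immediately follows five heads in a row.
after_streak = [flips[i + 5] for i in range(len(flips) - 5) if all(flips[i:i + 5])]

print(f"streaks observed: {len(after_streak)}")
print(f"P(heads | five heads in a row) ~ {sum(after_streak) / len(after_streak):.3f}")  # about 0.500
```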
The ostrich effect is the attempt made by investors to avoid negative financial information. The name comes
from the common (but false) legend that ostriches bury their heads in the sand to avoid danger.
Originally the term was coined by Galai & Sade (2006), and was defined as "the avoidance of apparently risky
financial situations by pretending they do not exist", but since Karlsson, Loewenstein & Seppi (2009) it has taken on
the slightly broader meaning of "avoiding to expose oneself to [financial] information that one fears may cause
psychological discomfort". For example, in the event of a market downturn, people may choose to avoid
monitoring their investments or seeking out further financial news.

Post-purchase rationalization: The tendency to persuade oneself through rational argument that a purchase
was good value.
The group attribution error refers to people's tendency to believe either (1) that the characteristics of an
individual group member are reflective of the group as a whole, or (2) that a group's decision outcome must
reflect the preferences of individual group members, even when external information is available suggesting
otherwise.[1][2][3]
The group attribution error shares an attribution bias analogous to the fundamental attribution error.[2] Rather
than focusing on an individual's behavior, it relies on group outcomes and attitudes as its main basis for conclusions.

Naïve realism is the human tendency to believe that we see the world around us objectively, and that people
who disagree with us must be uninformed, irrational, or biased.
Naïve realism provides a theoretical basis for several other cognitive biases, which are systematic errors when
it comes to thinking and making decisions. These include the false consensus effect, actor-observer bias, bias
blind spot, and fundamental attribution error, among others.
The term, as it is used in psychology today, was coined by social psychologist Lee Ross and his colleagues in
the 1990s.[1][2] It is related to the philosophical concept of naïve realism, which is the idea that our senses allow
us to perceive objects directly and without any intervening processes. [3] Social psychologists in the mid-20th
century argued against this stance and proposed instead that perception is inherently subjective.[4]

The just-world hypothesis or just-world fallacy is the cognitive bias (or assumption) that a person's actions
are inherently inclined to bring morally fair and fitting consequences to that person, to the end of all noble
actions being eventually rewarded and all evil actions eventually punished. In other words, the just-world
hypothesis is the tendency to attribute consequences to—or expect consequences as the result of—a
universal force that restores moral balance. This belief generally implies the existence of
cosmic justice, destiny, divine providence, desert, stability, or order, and has high potential to result in fallacy,
especially when used to rationalize people's misfortune on the grounds that they "deserve" it.
The hypothesis popularly appears in the English language in various figures of speech that imply guaranteed
negative reprisal, such as: "you got what was coming to you", "what goes around comes around", "chickens
come home to roost", "everything happens for a reason", and "you reap what you sow". This hypothesis has
been widely studied by social psychologists since Melvin J. Lerner conducted seminal work on the belief in a
just world in the early 1960s.[1] Research has continued since then, examining the predictive capacity of the
hypothesis in various situations and across cultures, and clarifying and expanding the theoretical
understandings of just-world beliefs.
Authority bias is the tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to
its content) and be more influenced by that opinion.[1] This concept is considered one of the so-called social
cognitive biases or collective cognitive biases. [2] The Milgram experiment in 1961 was the classic experiment
that established its existence.[3]
Humans usually have a deep-seated sense of duty to authority, and tend to comply when requested by an authority
figure.[4] There are scholars who explain that individuals are motivated to view authority as deserving of its
position and that this legitimacy leads people to accept and obey the decisions that it makes.[2] System justification
theory articulates this phenomenon, particularly within its position that there is a psychological motivation for
believing in the steadiness, stability and justness of the current social system. [5]
In any society, a diverse and widely accepted system of authority allows the development of sophisticated
structures for the production of resources, trade, expansion and social control. Since the opposite is anarchy,
we are all trained from birth to believe that obedience to authority is right. Notions of submission
and loyalty to the legitimate rule of others are accorded value in schools, the law, the military and in political
systems. The strength of the bias to obey a legitimate authority figure comes from
systemic socialization practices designed to instill in people the perception that such obedience constitutes
correct behavior. Different societies vary in the terms of this dimension.[6] As we grow up, we learn that it
benefits us to obey the dictates of genuine authority figures because such individuals usually possess higher
degrees of knowledge, wisdom and power. We tend to do what our doctor advises.
Consequently, deference to authority can occur in a mindless fashion as a kind of decision-making short cut.[7]
Status quo bias is an emotional bias; a preference for the current state of affairs. The current baseline (or
status quo) is taken as a reference point, and any change from that baseline is perceived as a loss. 
A stereotype is an over-generalized belief about a particular category of people. It is an expectation that people
might have about every person of a particular group. The type of expectation can vary; it can be, for example,
an expectation about the group’s personality, preferences, or ability.
Stereotypes are generalized because one assumes that the stereotype is true for each individual person in the
category. While such generalizations may be useful when making quick decisions, they may be
erroneous when applied to particular individuals. Stereotypes encourage prejudice and may arise for a number
of reasons.
Risk compensation is a theory which suggests that people typically adjust their behavior in response to the
perceived level of risk, becoming more careful where they sense greater risk and less careful if they feel more
protected. Although usually small in comparison to the fundamental benefits of safety interventions, it may
result in a lower net benefit than expected.
By way of example, it has been observed that motorists drove faster when wearing seatbelts and closer to the
vehicle in front when the vehicles were fitted with anti-lock brakes. There is also evidence that the risk
compensation phenomenon could explain the failure of condom distribution programs to reverse HIV
prevalence and that condoms may foster disinhibition, with people engaging in risky sex both with and
without condoms.

Herd behaviour is a phenomenon in which individuals act collectively as part of a group, often making
decisions as a group that they would not make as an individual.
There are two generally accepted explanations of herd behaviour.
1. Firstly, the social pressure to conform means that individuals want to be accepted – and this means
behaving in the same way as others, even if that behaviour goes against your natural instincts.
2. Secondly, individuals find it hard to believe that a large group could be wrong (“2 heads are better than 1”)
and follow the group’s behaviour in the mistaken belief that the group knows something that the individual
doesn’t.
In short, herd behaviour is about making a decision based in part on the behaviour/choices of others.
