
UNIVERSITY OF SAINT LOUIS

Tuguegarao City

SCHOOL OF EDUCATION, ARTS and SCIENCES


Second Semester
A.Y. 2021-2022

CORRESPONDENCE LEARNING MODULE


PSYC 1103- Social Psychology

Prepared by:

MELANIE T. BUCO
Course Instructor

Reviewed by:

RENZ MARION C. GAVINO, MP


Head, General Education Area

Recommended by:

VENUS I. GUYOS, Ph.D.


Academic Dean

Approved by:

EMMANUEL JAMES P. PATTAGUAN, Ph.D.


Vice President for Academics



UNIVERSITY PRAYER

O God, wellspring of goodness and blessings, we give you thanks and praise as one Louisian community. The
graces You incessantly grant upon us and Your divine providence have sustained our beloved University
throughout the years of mission and excellence.

Having been founded by the Congregation of the Immaculate Heart of Mary, we pray that You keep us
committed and dedicated to our mission and identity to serve the Church and the society as we become living
witnesses to the Gospel values proclaimed by Jesus. For if we are steadfast in our good and beautiful mission,
our works will bring success not only to ourselves but also to those whom we are bound to love and serve.

Inspired by St. Louis our Patron Saint, who was filled with a noble spirit that stirred him to love You above all
things, may we also live believing that we are born for a greater purpose and mission as we dwell in Your
presence all the days of our life.

Grant all these supplications through the intercession of Mother Mary and through Christ our Lord. Amen.



CORRESPONDENCE LEARNING MODULE
PSYC 1103 (Social Psychology)
AY 2021-2022

Module 2: Week 3-4

Topics: Social Cognition

Learning Outcomes: At the end of this module, you are expected to:
• choose one concept among the three sources of intuition and recall one instance in your life where this concept applies to you
• provide ways to combat negativity bias

Date: January 24-28
Topics: How do we think? How do we remember social information? Why do our brains sometimes make mistakes?
Activities or Tasks: Read and understand the lesson. Accomplish and submit the worksheet in the Activities portion of the LMS.

Date: January 31-February 5
Topics: From where does intuition come? Can we trust our intuition?
Activities or Tasks: Read and understand the lesson. Accomplish the worksheet in the Activities portion of the LMS. Submission of learning task.

Lesson Proper

Sgt. First Class Edward Tierney was leading a nine-person patrol in Mosul, Iraq, when they noticed a car
parked on the sidewalk, facing the traffic. The windows were rolled up and the faces of two kindergarten-
aged boys stared out the back window, their faces close together. The nearest soldier said to Tierney,
"Permission to approach, sir, to give them some water." "No," Tierney replied and ordered his men to pull
back. Something just seemed wrong. Then a bomb exploded, killing the two boys and sending shrapnel
across the face of the nearest soldier. Unfortunately, Tierney's intuition could not save the two boys, but it
probably prevented an even greater tragedy by saving the lives of some of the men in his patrol. Sgt.
Tierney later reported experiencing "that danger feeling" and an urge to move back before he logically
knew why.

It doesn't matter whether you call it "going with your gut," "a hunch," or
intuition because it all refers to the same idea: knowing something
without knowing how you know. It is sometimes a wonderful yet
mysterious experience when it works, as it did for Sgt. Tierney. But when
we place too much faith in our intuition, it can lead to disaster. Sometimes we are better off relying on
logic, a way of knowing based on reasoned, thoughtful analysis of the objective situation.

INTUITION & LOGIC. These two types of thinking, intuition and logic, are the basic elements of social
cognition, or how we process social information. We can learn more about social cognition and the
unavoidable trade-offs between intuition and logic as we go through this module.

HOW DO WE THINK?

When we look at ourselves and compare ourselves to animals, there is not much difference in the basic
features we share with these creatures. We are more alike than we are different from each other.

However, there is one thing that we humans seem to do exceptionally well: that is, to think. We are not
unique in our ability to think; instead, our most distinctive ability may be in how we think. Social cognition
is the study of how people combine intuition and logic to process social information. The constant interplay
between these two ways of thinking helps us synthesize enormous amounts of information quickly and
with relative accuracy. Let's look more closely at the two thinking systems of intuition and logic.

Dual Processing: Intuition and Logic

You have just been surprised by a marriage proposal; you need to figure out how to respond, and fast!
Will you use logic, intuition, or both? Responding quickly to complex situations is one reason that humans
developed the skill of dual processing, the ability to process information using both intuition and logic.
These two styles, forms, or systems of thinking have been labeled in many different ways, such as
"System 1 versus System 2," "automatic versus effortful," "implicit versus explicit," and more. For this
module, we will use the terms intuition versus logic—but as we discuss each thinking system, keep in
mind that they are complex and differ in many ways (see Figure 1).

Intuition is the ability to know something quickly and automatically. It could be described as our "gut
feelings." Intuition is extremely important when we need to sense and react to potential threats in the
environment. Intuition is implicit and requires, at that particular moment, only minimal cognitive effort. Sgt.
Tierney, for example, was not aware of the flow of information entering his brain, but his intuition enabled
him to make a lightning-quick decision that saved several lives.

Logic, by comparison, enables humans to reason, think systematically, and carefully consider evidence
about possible futures. Logic requires mental effort and careful, purposeful reasoning. What appears to
separate humans from other species is the fluency with which we combine intuition and logic for decision
making.

For example, notice what you do when you try to push your way through a PULL
door. As you approach the door, your brain is on automatic pilot and relying on
intuition. So, when you get to the door, you use intuition to make the trivial decision
about whether to push or to pull. If you successfully pull open a PULL door (or push
open a PUSH door), then you just keep on going to your destination.

But what if, based on your intuition, you try to push open a PULL door or vice versa?
If you watch people in this situation, many will continue to push, push, push and
rattle the door until their logical thinking breaks through with the logical thought,
"Hey! What you're doing isn't working. Try something else." That's when you finally
change your behavior and pull on the PULL door. We need both intuition and logic to
navigate the hundreds of big and little decisions that we make every day, and both types of thinking
sometimes lead to errors such as trying to push your way through a PULL door.

FIGURE 1

Figure 1 compares the two thinking systems and summarizes their strengths and weaknesses (Alter,
Oppenheimer, Epley, & Eyre, 2007; Gilbert, 1991; Kahneman & Frederick, 2005; Lieberman, 2000). Quick
decision making (intuition) makes our lives much easier and may even save our lives, as it did for Sgt.
Tierney and the soldiers under his command. That's good. But it also risks
sometimes making a hurried, catastrophic decision
without really considering all of the logical choices and
consequences. That's bad and sometimes very bad. On
the other hand, relying on slow, thoughtful logic helps us
understand what is happening from an objective point of view. That's good. But it also risks being attacked
or missing opportunities through indecision because we are so busy analyzing information. That's bad and
sometimes very bad. The constant trade-offs between intuition and logic describe our daily decision
making and its consequences.

Our Two Thinking Systems Interact

When you first learned to drive, you probably had to put in a lot of effort and concentrate on every tiny
movement of the steering wheel and pedals. Over time, we get so used to driving, especially on familiar
routes such as between home and work or school, that driving becomes automatic and easy. This "auto
pilot" mode of driving becomes our default for routine trips. However, if you have to drive in a new,
complicated city with several lanes of traffic, unfamiliar signs, and road construction, your mental effort for
driving suddenly kicks in: you concentrate, you pay attention.

Daniel Kahneman (2011) won the Nobel Prize for his research on intuitive versus logical decision making.
These two systems of thought often interact beautifully, like dance partners. But sometimes one partner,
either intuition or logic, has to take the lead. The example of driving may help you to understand why you
don't want to text and drive. There are limits to your cognitive load, the amount of information that our
thinking systems can process at one time. As the driving example suggests, evolving traffic situations
require that our two thinking systems interact by smoothly switching back and forth between intuition and
logic, a process also called cognitive load-shifting. You may be able to drive almost mindlessly, relying on
intuition, until you notice another car weaving dangerously up ahead of you. Then, logic springs into action
and tells you, "Concentrate! This could be dangerous! Observe, evaluate, and figure out how to avoid that
dangerous driver up ahead."

Daniel Kahneman (right) won the Nobel Prize for his research on how our two thinking systems work together to make everyday decisions.

There are also times when either preference might pay off. For example, a sports psychology
study discovered that some sports tasks are better suited for intuition, while others are more successful
when the athlete switches to logical thinking (Furley, Schweizer, & Bertrams, 2015). In football, certain
physical actions (like kicking the ball for an extra point after a touchdown) might rely more on practiced
intuition. However, running a complicated play among several players probably needs more logical
thinking to be successful.

Social Thinking Is Shaped by Cultural Influences

While the focus of this module is on types of thinking (such as intuition vs. logic), it's important to point out
that social cognition is influenced by culture. What is viewed in one culture as normal and appropriate may
be regarded in another culture as strange or deficient. These differences highlight how our social thinking
is shaped by our cultural influences. The accompanying Spotlight on Research Methods feature, "Culture
Influences How We Think," describes a cultural difference based on individualistic values versus
collectivistic values. Every society is a mixture, of course, but some cultures plainly value one more than
the other.

Question to ponder upon: Think about your own childhood experiences with parents or guardians. Did
they encourage you to follow your logic or your intuition more? If you could analyze your own balance
between the two systems of thinking, which do you tend to favor more now?
Autobiographical memories about our own upbringing suggest how we may have absorbed an
individualistic value or a collectivistic value. For example, even though my [Tom's] athletic abilities peaked
in about the fifth grade of a small school, I can still recall momentarily being lifted on my teammates'
shoulders after winning a baseball game. That stubborn little vanity memory still reinforces my sense that
individual achievement will be rewarded. As you scan your autobiographical memories, can you identify
certain family interactions that encouraged either your individuality or the importance of being part of a
group?

HOW DO WE REMEMBER SOCIAL INFORMATION?

Memory structures have been crucial to the success of the human species. Notice that you are able to
recognize people when you see their faces. At first you need to concentrate on specific features, but after
a while you recognize people automatically; in fact, it would be impossible not to immediately identify
people you have known for a long period of time. So let's learn a little more about your marvelous
memory structures, which, once formed, are firmly in place.



We have three types of memory structures (also called mental structures) that organize and
interpret social information: schemas, scripts, and stereotypes. These memory structures evolved
because they allow us to process large amounts of information quickly, the very thing that we humans
seem to do better than other animals. However, there is a trade-off for being able to process so much
information: we don't like to change our minds, because even a small change in thinking can have far-
reaching implications. It's similar to rearranging a room. If you move a lamp just a few inches, then
suddenly you may also have to move or reorient many other things, like a mental domino effect.

Why are schemas, scripts, and stereotypes so important when processing social information?

A. Schemas Label and Categorize

A schema operates like a spam email filter: It automatically directs and organizes incoming information by
labeling and categorizing. A schema automatically ignores some information and forces us to pay
attention to other information (Bartlett, 1932; Johnston, 2001). Common schemas people might use in the
social world are concepts such as labeling people based on their gender, their religion, or their country of
origin. Simply put, schemas are categories we use to understand the world. You will hear people use
several other words that describe this same general idea. For example, "templates," "worldviews," and
"paradigms" are also attempts to describe mental structures that help us process and remember
information. There are two specific types of schemas: scripts and stereotypes.

B. Scripts Create Expectations About What Happens Next


A script is a memory structure or type of schema that guides common social behaviors and expectations
for particular types of events. We don't have to relearn how to behave every time we go to a sit-down
restaurant, for example, because we have a mental script for the expected order of events: (a) waiting to
be taken to a table, (b) being seated, (c) reviewing the menu, (d) ordering the meal, (e) eating the meal, (f)
paying the bill, and (g) leaving the restaurant.

The script is modified for a fast-food restaurant because you have a slightly different memory
structure for how to behave in that setting. You would be surprised and confused if the order of expected
events somehow changed. For example, you're expected to pay the bill before you get your food at a fast-
food restaurant but after the meal at a sit-down restaurant. If you went to a nice steakhouse and they
made you pay before you got to eat, you would be startled and might not have a good impression of the
restaurant.

So as you can see, scripts do govern a great deal of our lives. For example, your cultural script for
a marriage proposal tells you that the photographs shown below are out of order. Knowing how to propose
properly within a particular culture is an important social script that shapes marital expectations. A survey
of over 2,100 college students at the Universities of Iowa and Alaska found that participants, regardless of
their sex or age, predicted a stronger marriage if the couple had conformed to a more traditional proposal
script (Schweingruber, Cast, & Anahita, 2008). Scripts are comforting to us because they help us feel that
we can safely navigate through a complex social world.



We use scripts because they are efficient ways of automatically cruising through social situations,
but some mental scripts may be harmful. For example, if someone has a script about marriage that
includes sexist ideas about male privilege (e.g., the man is the "king of his castle" and therefore can make
all the important decisions), this particular misogynistic script can lead to abusive relationships (Johnson,
2007).

C. Stereotypes Ignore Individual Differences Within Groups

A second type of schema is the stereotype, which assumes everyone in a certain group has the same
characteristics. In other words, stereotypes minimize individual differences or diversity within any given
group based on the perception that everyone in that group is the same. You may not behave according to
the stereotype, but whatever groups you belong to (fraternity, psychology major, garage band, or football
team) trigger a stereotype that others use to make judgments about you. Some aspects of a stereotype
may be true for some members of a particular group, of course, but it's still a logical mistake to assume
that all members of the group are the same.

This thinking error is called outgroup homogeneity, or assuming that every individual in a group
outside of your own is the same. Do you think that all students at Harvard are the same? Unless you are a
student at Harvard, then Harvard students are your outgroup, and if you believe they all share the same
behaviors or characteristics, then you have a stereotype. Stereotypes and prejudice play major roles in
social psychology's story. Like other mental shortcuts, stereotypes are efficient ways of making decisions
but can also lead to errors in judgment that sometimes have disastrous effects.

WHY DO OUR BRAINS SOMETIMES MAKE MISTAKES?

Our big, busy brains sometimes make mistakes. There seem to be two main sources for our mental
errors. You can think of the first source by the familiar shorthand TMI for "too much information," or
information overload. For example, we are aggressively confronted—usually without our permission—by
advertising when we are on the Web, driving our car, listening to the radio, and even grocery shopping.
These advertisers are all fighting for a spot in our thought life—which already has enough to think about.
So our brains automatically simplify our world by rejecting most of the incoming information, including
some information we actually want or need.

The information overload (TMI) problem is understandable, and so is the second source of mental
mistakes: wishing—especially when we substitute wishes for reality. It is tempting to believe that
wishing hard enough somehow will make our wishes come true. Unfortunately, no matter how many
Disney films take advantage of our fondness for self-deception, the wooden Pinocchio will never turn into
a real boy, and neither beasts nor frogs turn into princes.

The next time you see a police car with its lights flashing, watch your storytelling brain start spitting
out explanations. "Something happened," you tell yourself. "Maybe it's an accident, a robbery, a speeding
driver." And your favorite explanation will probably be the one that you wish to believe. You might
recognize yourself as we explore these two types of mental mistakes.

Information Overload Leads to Mental Errors

In the face of overwhelming amounts of information, the brain evolved an efficient but imperfect solution:
throw most things out and organize what remains. What are some specific ways our brains deal with
information overload?

Our brains don't like to process too much information at any given time.

Cognitive Misers. Let's be blunt about it: We are cognitive misers who
take mental shortcuts whenever possible to minimize the cognitive load
(Hansen, 1980; Taylor, 1981). An economic "miser" hates spending money unless absolutely necessary.
Likewise, our brains are cognitive "misers" that avoid effortful thinking unless we absolutely have to. It
sounds like a lazy way to think, and it is—although economical or efficient might be a more precise word
than lazy. Under most circumstances, the brain searches for the shortest, quickest way to solve a
problem, partly because logical brainwork is slow, hard work!

For example, Daniel Kahneman (2011) noticed that people will stop walking if you ask them to calculate a
difficult arithmetic problem in their heads. Try it for yourself by taking a little stroll and then trying to
calculate 23 x 278 in your head. Even a leisurely stroll feels like effort to a brain that is being asked to do
too much—so we stop and do one thing at a time. As we pointed out earlier in this chapter, we are
generally very bad at "multitasking," and we're also bad at thinking about too much at any given time. We
prefer to stick to decisions once we've made them, because reconsidering other options is effortful. It's
such a relief once you make up your mind that few people will really stay open to other ideas; our miserly
mental habits just say no.

Satisficers Versus Maximizers.

Our practical solution to the problem of information overload is satisficing, decision making based on
criteria that are "good enough" under the circumstances (see Nisbett, Krantz, Jepson, & Kunda, 1983).
When you buy a car, you won't read every review of every system and subsystem about every car you
might purchase; you satisfice. When shopping for shampoo, you won't test every single product available
at the megamart; you satisfice. At least some students study only until they believe their knowledge is
good enough to achieve whatever grade they desire; they satisfice. Satisficing enables you to move on to
the next thing demanding your attention.

We satisfice when making both minor decisions, such as what consumer products to buy (Park &
Hastak, 1994; Simon, 1955), and major decisions, such as the choice of a college major or career goals
(Starbuck, 1963). We make some errors by satisficing, but we do it anyway because "perfection may not
be worth the extra cost" (Haselton & Funder, 2006; Simon, 1956). Of course, some people don't like to
satisfice; they enjoy thinking through all the options, or they worry about making the wrong decision, so
they are more careful. The scale in the Applying Social Psychology to Your Life feature will tell you
whether you generally tend to be a satisficer or the opposite, a maximizer, when making everyday
decisions.
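
To make the contrast concrete, here is a minimal sketch in Python of the two decision styles. The option names, scores, and threshold are invented for illustration; they do not come from the textbook.

```python
# A toy sketch of Herbert Simon's satisficing versus maximizing.
# All option names and scores below are invented for illustration.

def satisfice(options, threshold):
    """Return the first option whose score is 'good enough' (>= threshold)."""
    for name, score in options:
        if score >= threshold:
            return name  # stop searching as soon as one option clears the bar
    return None  # nothing was good enough

def maximize(options):
    """Inspect every option and return the single best one."""
    best_name, _ = max(options, key=lambda option: option[1])
    return best_name

shampoos = [("Brand A", 6), ("Brand B", 8), ("Brand C", 9), ("Brand D", 7)]

print(satisfice(shampoos, threshold=7))  # Brand B: good enough, search stops early
print(maximize(shampoos))                # Brand C: the best, but every label was read
```

Notice the trade-off: the satisficer stopped after two comparisons and accepted a good-enough answer, while the maximizer paid the full search cost for a slightly better one.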

Satisficers, by the way, seem to be happier than maximizers (Snyder & Miene, 1994). Several
studies (Polman, 2010; Schwartz et al., 2002; Sparks, Ehrlinger, & Eibach, 2012) suggest that trying to
gather all available information for every decision is associated with lower levels of happiness, optimism,
life satisfaction, and self-esteem and with higher levels of depression, perfectionism, and regret. Perhaps
that is why another Nobel Prize winner, Herbert Simon, suggested that satisficing directs many, and
perhaps all, of our mental shortcuts (Simon, 1956). We tolerate mental errors because being perfect just
isn't worth the price.

Magical Thinking Encourages Mental Errors

In her book titled The Year of Magical Thinking, Joan Didion (2005) described how her brain refused to
accept the death of her husband of 40 years, fellow writer John Gregory Dunne. Her thoughts replayed
thousands of little things that she might have done to prevent his heart attack. She found herself believing,
against all rationality, that she might somehow find a way to restore him to life. She slowly realized that
she was giving in to magical thinking, beliefs based on assumptions that do not hold up to reality. But in
the throes of her grief, she could not stop thinking, "If only…"

"If Only .. ." Wishes. There are emotional consequences when we try to magically wish bad things away or
good things into existence (Boninger, Gleicher, & Strathman, 1994; Bouts, Spears, & van der Pligt, 1992;
Kahneman & Tversky 1982; Roese, 1997). For example, if you are waiting in line for a free ticket to a
concert, then you probably will be more disappointed if the person just in front of you got the last ticket
than if you were still at the back of the line when tickets ran out. It's easier to imagine that "if only" you had
changed one little thing in your schedule, then you would have been going to the concert.



The same kind of "if only" thinking leads Olympic athletes, like those in the photograph below, who finish
in third place to feel happier than second-place competitors. Why? Second place encourages the magical
thinking, "If only I had done one little thing differently, I could have won gold." On the other hand, the
bronze medalist reasons that if circumstances had been just a little different, he or she might not have
received any medal at all! They can think, "At least I got a medal!" Emotionally, third place might provide a
better experience than second place because it leads to happiness and relief rather than frustration and
regret.

Which Olympic medalist is the happiest, and which is the second happiest?
Counterfactual Thinking: Upward and Downward.

Counterfactual thinking occurs when we imagine what might have been—alternative facts or events in the
past that would have led to a different future (Davis & Lehman, 1995; Davis, Lehman, Silver, Wortman, &
Ellard, 1996; Davis, Lehman, Wortman, Silver, & Thompson, 1995; Dunning & Madey, 1995; Dunning &
Parpal, 1989; Einhorn & Hogarth, 1986). Counterfactual thinking occurred automatically to a group of
tourists who somehow survived the terrible tsunami that took 280,000 lives across Southeast Asia on
December 26, 2004.

When Teigen and Jensen (2011) interviewed 85 surviving tourists, both parents and children, most
survivors comforted themselves, as much as they could, with downward counterfactuals, imagined
outcomes that are even worse than reality. "Only a matter of 1 [minute], one way or the other," reported
one interviewee, "and everything would have been different [we might have died]." Downward
counterfactuals can be comforting, like a "silver lining" to tragedy. By contrast, upward counterfactuals are
imagined outcomes that are better than reality. One interviewee commented, "They could have issued a
warning," when imagining what might have led to a better outcome. Upward counterfactuals lead to anger
or regret because you can see what might have been. Downward counterfactuals can be thought of as
"at least" thoughts, while upward counterfactuals can be thought of as "if only" thoughts.

The 2004 tsunami as it hit Thailand

For example, upward counterfactual thinking was upsetting to some college students (Leach &
Patall, 2013) when they imagined a path that would have led to better test performance or grades.
Although not earning a higher grade can't be compared to a devastating tsunami, the underlying
psychology was the same. These students were dissatisfied when they had thoughts such as, "If only I
had studied more, then my GPA would have improved." On a positive note, however, upward
counterfactuals help us learn from our mistakes. The thought, "If only I had studied harder for the exam"
might motivate us to study more the next time around.

The Optimistic Bias and the Planning Fallacy.

To-do lists suggest another common form of magical thinking. If you keep a to-do list, how often do you
actually check off every single item? I [Wind] keep an electronic sticky note on my computer desktop, on
which I make a list at the beginning of each week of everything I need to get done. Although I have never,
ever completed the entire list in a week, I have been making the list regularly for 7 years. My continued
behavior demonstrates the optimistic bias, an unrealistic expectation that things will turn out well.

One survey suggests that the optimistic bias is a popular way to think. Weinstein (1980) asked
more than 250 college students to estimate their chances of experiencing 42 positive and negative events
compared to the average probability that their fellow classmates would experience those same events.
Positive events included liking your job after graduation, owning your own home, and having a respectable
starting salary.

Negative events included having a drinking problem, attempting suicide, and getting divorced
shortly after marrying. As the optimistic bias suggests, the students rated their chances of experiencing
positive events as above average and their chances of experiencing negative events as below average.



There's only one problem: We can't all be above average. At least some of these students must have
been deceiving themselves about how nicely their lives were going to turn out after graduation.

Students also tend to be overly optimistic about their test performance (Gilovich, Kerr, & Medvec,
1993), and they are not alone in their optimism. Teens underestimate their likelihood of eventually
becoming the victim of dating violence (Chapin & Coleman, 2012) or sexual assault (Untied & Dulaney,
2015). Potential blood donors overestimate their probability of actually making a donation (Koehler &
Poon, 2006). Students who steal music instead of paying for it underestimate their chances of getting
caught (Nandedkar & Midha, 2012). And those who sincerely make extravagant promises in romantic
relationships tend not to follow through as well as those who make more modest commitments (Peetz &
Kammrath, 2011). The people making these romantic commitments may sincerely mean what they say at
the moment, but sometimes it's just the
optimistic bias talking.

The optimistic bias sounds like a good thing, but it can create big problems. The planning fallacy is
unjustified confidence that one's own project, unlike similar projects, will proceed as planned. Thus, the
planning fallacy is one specific type of optimistic bias.

Like Baumeister's (1989) optimal margin theory of self-deception (described in Module 1), a little
optimism can be helpful, but too much can be harmful.

FROM WHERE DOES INTUITION COME?

Research about intuition demonstrates how science sometimes makes exciting explanations boring. It
tickles our vanity, of course, to imagine that our private dreams somehow predict the future or that our
hunches are supernatural insights. Prophecy, psychic gifts, and mental telepathy are much more
entertaining than what science has come up with. Are scientists just out to spoil the fun?

Well, yes, depending on how you define "fun." Scientists find their fun by forming their beliefs
around the principle of parsimony, preferring the simplest answer that explains the most evidence.
Parsimony also means you can think of scientists as intellectual bargain hunters who want a great theory
without having to pay for it with exotic explanations. Relying on the simplest, evidence-based explanation
is also called "Occam's razor" (Wind's favorite) and "the principle of least astonishment" (Tom's favorite).

The science of social cognition explains intuition with an explanation so ordinary that it does not
require any dramatic leaps of faith. Intuition comes from mental accessibility, the ease with which ideas
come to mind. Figure 2 describes how three silent sources of intuition (priming, experience, and
heuristics) influence what comes most easily to mind. "Mental accessibility" may sound like an ordinary
explanation. But intuition grows more mysterious as we follow the trail of evidence deeper into the human
brain.

FIGURE 2: Three frequent sources of intuition (priming, experience, and heuristics) feed mental accessibility, which in turn gives rise to intuition.

Intuition Relies on Mental Accessibility

For Sgt. Tierney (described at the beginning of this module), his life-saving flash of intuition had to
penetrate two distinctive barriers: mental availability and mental accessibility. First, Sgt. Tierney's brain
had to privately reach the conclusion that there was something suspicious about two kindergarten boys
locked inside a car on a hot day (mental availability—the knowledge is present in your mind). Second,
Sgt. Tierney had to be able to access that information (mental accessibility—you can gain access to that
knowledge). As patrol leader, Sgt. Tierney had many other competing claims on his attention.
When it works, intuition is astonishing. If you are 20 years old, then a relatively conservative
estimate is that you know about 40,000 distinct words derived from more than 11,000 word families—and
you will keep on learning new words (Brysbaert, Stevens, Mandera, & Keuleers, 2016). That works out to
about 2,100 new words per year or 40 new words every week. Did you consciously learn 40 new words
every week, every year, from the day you were born? Of course not. You learned those words (and how to
use them) in spectacular bursts during critical periods of brain development (Wasserman, 2007). You
acquired most of your knowledge intuitively, through context and deduction, without knowing that you
were learning. Your intuitive knowing extends to your listening skills; you can recognize words within just
200 milliseconds (a fifth of a second) and understand words tumbling out of other people's mouths at a
rate of six syllables per second (Aitchison, 2003).
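
As a quick back-of-the-envelope check of those numbers (a sketch; the assumption that learning is spread evenly over roughly 19 years is ours, not the textbook's):

```python
# Rough check of the vocabulary estimate above.
# Assumption (ours): 40,000 words learned fairly evenly over about 19 years.
words_known = 40_000
years_of_learning = 19

words_per_year = words_known / years_of_learning  # about 2,105 words per year
words_per_week = words_per_year / 52              # about 40 words per week

print(round(words_per_year), round(words_per_week))  # 2105 40
```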

As Figure 2 indicates, mental accessibility relies on three major sources. This section of the module
reviews the first two (priming and experience) and provides a Spotlight on Research Methods feature from
a famous priming study. The next section will cover the third source, heuristics, in depth and provide
several examples.

Priming Increases Mental Accessibility

If you played a word association game starting with the word dogs, you might associate "dogs" with "cats,"
"barking," or even "pets." The word pets is probably associated with "hamsters," "snakes," "goldfish," and
perhaps even "taking responsibility" (for feeding and caring for pets). Each word branches off into new
sets of words. The entire collection of associations is called a semantic network, mental concepts that
are connected to one another by common characteristics. Priming refers to the initial activation of a
concept (such as "dogs") that subsequently flashes across our semantic network and allows particular
ideas to come more easily to mind (Cameron, Brown-Iannuzzi, & Payne, 2012). Once one category is
primed, other related categories are also primed. The term priming means that after initially thinking
about something, thinking about it again later will be easier and faster. It's like priming an engine to get it
started or priming a wall before you paint it. Priming is preparation for what comes later.
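
Because a semantic network is essentially a graph of linked concepts, a toy sketch can make the idea of spreading activation concrete. This is only an illustration, not a cognitive model; the concepts, links, and the `prime` function are invented for the example.

```python
# A toy semantic network: concepts linked by association (links invented).
# Priming one concept spreads activation to its neighbors, a rough stand-in
# for related ideas becoming more mentally accessible.
from collections import deque

network = {
    "dogs": ["cats", "barking", "pets"],
    "pets": ["hamsters", "snakes", "goldfish", "taking responsibility"],
    "cats": ["pets"],
}

def prime(concept, depth=2):
    """Spread activation outward from the primed concept, up to `depth` links."""
    activated = {concept}
    frontier = deque([(concept, 0)])
    while frontier:
        node, distance = frontier.popleft()
        if distance == depth:
            continue  # activation fades; stop spreading past `depth` links
        for neighbor in network.get(node, []):
            if neighbor not in activated:
                activated.add(neighbor)
                frontier.append((neighbor, distance + 1))
    return activated

print(prime("dogs"))  # "dogs" plus related ideas such as "pets" and "goldfish"
```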

Bargh, Chen, and Burrows (1996) designed an experiment using word puzzles to explore the
effects of priming. They created three experimental conditions by having college students complete word
puzzles featuring (a) words concerned with being rude (e.g., disturb, intrude, obnoxious), (b) words
concerned with being polite (e.g., respect, considerate, cordially), or (c) neutral words (e.g., normally,
send, clears). Depending on the condition, the concept of politeness or rudeness was primed (or nothing
was primed in the neutral control condition). The researchers then sent the students on an errand that
required them to interrupt two people who were talking and ask them for directions. A higher percentage
of participants who had been primed with rude words interrupted the two people talking; a lower
percentage of participants primed to be polite interrupted the conversation (and the difference between
groups was statistically significant).

In a similar experiment, they discovered that people walked more slowly after being primed with
words related to aging (e.g., Florida, gray, careful, bingo), compared to a neutral control condition. The
researchers believed that priming words related to aging led to slower walking because it increased
mental accessibility to participants' semantic network related to what it means to grow old. Even though
the researchers never specifically primed anything about walking or speed, simply priming the mental
category of old age led to priming of all the mental constructs related to that category; all of the relevant
concepts became more accessible.

There has been considerable controversy over this particular study, however. Other researchers
using similar procedures have not been able to replicate the slower movements of students down a
hallway after being exposed to words associated with aging (Doyen, Klein, Pichon, & Cleeremans, 2012).
In science, others repeating your experiment and getting the same results is more than just a good idea;
replication is necessary. Despite the lack of replication, we have chosen to include this second study (the
one done in 2012) in our discussion about priming because the researchers did find the effect when they
expected to see it—in other words, they were priming themselves instead of the participants! Perhaps
that's why they named their study Behavioral Priming: It's All in the Mind, but Whose Mind?

Experience Improves Mental Accessibility



Experience may be intuition's best teacher. For example, college students' brief, intuitive observations of a
teacher (totaling only 30 seconds) at the start of a semester were fairly good predictors of average student
ratings of the teacher at the end of the semester (Ambady & Rosenthal, 1993; Babad, Avni-Babad, &
Rosenthal, 2004). Why are students so good at sizing up teachers? Well, what have you been doing over
the past 15 to 20 years? You have been going to classes with many different teachers. Consequently, you
have gotten very good at intuiting in the first few seconds how you will feel about the teacher and the class
by the end of the year.

We sometimes cannot say what we saw, but we saw it, because we had seen it many times
before. Experience had sharpened our intuition.

____________________________________________________________________________________

CAN WE TRUST OUR INTUITION? THE ROLE OF HEURISTICS AND BIASES

We've seen that both priming and personal experience help particular ideas come more easily to mind.
Now add one more to that list: Heuristics are mental shortcuts that make it easier to solve difficult
problems. Like the shortcuts that Tom likes to take when driving, sometimes heuristics don't work out and
may even get us lost. In other words, they can lead to mistakes.

Heuristics Facilitate Mental Accessibility

A third-year medical student correctly diagnosed that a young patient's strange symptoms were the result
of a skull deformity rather than what everyone else had assumed (some kind of neurochemical brain
disorder). The diagnosis had stumped everyone else. Was she (the medical student) an unusually brilliant,
magically gifted, intuitive diagnostician? No. Over lunch, she had been leafing through a journal that some
previous student had left behind and found an article discussing the effects of particular skull deformities.
The article had become mentally accessible to her over lunch, but her casual sharing of the idea after
lunch seemed to signal genius to her fellow medical students and professors.

In a stress-free, no-hurry world, there is a purely logical approach to such problem solving. An
algorithm is a systematic way of searching for an answer that will eventually lead to a solution if there is
one. Remember from the first section of this module that we are capable of both quick, intuitive thought
and slow, logical thought; using algorithms to solve problems is definitely logic. It is usually a slower but
surer way to solve a problem such as what was troubling the patient with the skull deformity. For
psychology students looking for answers in journals, digital databases such as PsycINFO now make it
more realistic to search through every (available) journal article and conference presentation. But this
takes time.

Algorithmic approaches to everyday problem solving are often impractical. So the human brain,
cognitive miser that it is, decides it is better to tolerate a few errors in the name of efficiency. When faced
with a problem, our preferred or default approach relies on mental shortcuts, particular kinds of heuristics.
Algorithms rely on logic; heuristics rely on intuition (even though we may fool ourselves into thinking that
we are being logical). Basic research has identified several specific types of common heuristics, but you
will be introduced to just three of them: anchoring and adjustment, availability, and representativeness.

The Anchoring and Adjustment Heuristic. The anchoring and adjustment heuristic occurs when we
make decisions by starting with an arbitrary number that unduly influences our final solution. So basically,
it is when a person starts off with an initial idea and adjusts their beliefs based on this starting point. We
seem to like just having an answer, even if it isn't a very good one. For example, answer the following
question by choosing either (a) or (b) below. Then write down your most precise estimate.

"On average, how many full-time college students in the entire United States drop out before graduation?"

____More than 200 students

or

____Fewer than 200 students

Now indicate your own most precise estimate here:______



When making your own estimate of how many students drop out, you probably fell victim to this heuristic.
Why? Because your most honest answer to this question is probably, "I have no idea how many college
students drop out before graduating, but it's probably much higher than 200." You may be able to think of
5 or even 10 at your own school, but your knowledge about this issue is probably uncertain. So, being a
lazy, cognitive miser, you grab at whatever fragmentary information happens to be available. In this case,
we brought to your mind what felt like a hint: 200. It was so low that it wasn't a very helpful hint but our
mentioning it made it mentally accessible. But what if, instead of hearing the number 200, we had hinted
at a different number by asking the question like this:

"On average, how many full-time college students in the entire United States drop out before graduation?"

____More than 25 million students

or

____Fewer than 25 million students

Now indicate your own most precise estimate here:______

Again, your most honest answer to this question probably would be, "I have no idea how many
college students drop out, but it's probably much lower than 25 million students." In the first question, your
estimate would be "anchored" by 200; in the second question, your answer would be anchored by 25
million.

The moral of the story here is that when you were asked to "indicate your own most precise
estimate," your two estimates would not have met in the middle because your thinking was anchored by
the different "hints." If the obviously low number of 200 had been suggested to you, then you might
estimate as high as 30,000 students per year, or even 200,000. However, if the obviously high number of
25 million had been suggested to you, then you might estimate 2 million students per year or even
500,000 (Kahneman & Frederick, 2005; Mussweiler & Strack, 2001; Nisbett & Wilson, 1977; Trope &
Gaunt, 2000; Tversky & Kahneman, 1974). Your estimates would be anchored by what was mentally
accessible (either 200 or 25 million) and then adjusted upward from 200 or downward from 25 million even
though both numbers are arbitrary.

The anchoring and adjustment heuristic is based on the metaphor of a boat in the water. The boat can
float or adjust a certain amount as the waves move it around, but the anchor keeps it within a certain
general area. For the heuristic, the idea is that the initial "hint" or number provided serves as an anchor for
your thought process. Your mind can float above or below the initial starting point, but the anchor will keep
your estimates within a certain range. When we have nothing else to go by, we will use almost any
information to anchor our mental estimates as long as it comes easily to mind (Cervone & Peake, 1986;
Marrow, 2002; Wilson, Houston, Etling, & Brekke, 1996).

The anchoring and adjustment heuristic is another indicator that much of our thinking is guided by
the principle of satisficing: being satisfied with a "good enough" answer to a question.

The Availability Heuristic. Fame is another way that particular ideas come more easily to mind (they are
more accessible), and it has a funny effect on our social thinking: We inflate the frequency and importance
of famous people and events. For example, McKelvie (2000) found that in a memory task for a list of
names people had seen earlier, participants overestimated the frequency of famous names compared to
nonfamous (and made-up) names, probably because they noticed the famous names more.

The mental accessibility of any name, of course, depends on its mental availability in the first
place. Napoleon is not famous to you if you have never heard of him. Fame is thus one example of the
availability heuristic, our tendency to overestimate the frequency or importance of something based on
how easily it comes to mind (Tversky & Kahneman, 1973).



The availability heuristic influences our social perceptions in subtle ways. People who spend a lot
of time on Facebook are more likely to think that their friends are happier than they are themselves and
more likely to think that life is unfair (Chou & Edge, 2012). This effect is probably because people are
more likely to post positive or even boastful status updates on Facebook than negative or embarrassing
updates; thus, we see a constant stream of happy accomplishments from our friends. To sum up, the
availability heuristic is another corner-cutting way we cognitive misers think more efficiently but also make
mental miscalculations (Dougherty, Gettys, & Ogden, 1999; Nisbett & Ross, 1980; Rothman & Hardin,
1997; Schwarz, 1998; Taylor & Fiske, 1978; Travis, Phillippi, & Tonn, 1989).

The Representativeness Heuristic. We use heuristics to solve many everyday problems, and always with
the same trade-off between efficiency and errors. For example, when you need a store clerk, you probably
would not use the algorithmic method of approaching everyone in the store and saying, "Do you work
here?" Instead, you would more likely substitute the easier-to-answer (and much more efficient!) heuristic
of looking for someone wearing a uniform or a nametag (Kahneman & Frederick, 2005; Shepperd & Koch,
2005).

Looking for someone dressed like a typical store clerk is an example of a thinking strategy called the
representativeness heuristic, a way of answering a question by classifying observations according to
how closely they resemble the "typical" case. The representativeness heuristic substitutes an easier
question ("What does a store clerk typically wear?") for the laborious algorithmic task of asking every
person in the store, "Do you work here?" So we look for someone who "represents" our idea of a
stereotypical clerk.

Of course, some shoppers may also be dressed as if they could be store clerks, and some store
clerks may appear to be ordinary customers. You have to balance the small risk of making an error
against the reward of finding a clerk quickly. Like pushing your way through a "push" door, the
representativeness heuristic often works so well that we don't even notice that we are using it.
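
Here is a minimal sketch of the store-clerk example in Python, contrasting the exhaustive algorithm with the representativeness heuristic. The shoppers and their attributes are made up for the illustration.

```python
# The store-clerk example: an exhaustive algorithm versus the
# representativeness heuristic. People and attributes are invented.
people = [
    {"name": "Ana",  "wears_uniform": False, "is_clerk": False},
    {"name": "Ben",  "wears_uniform": False, "is_clerk": False},
    {"name": "Cara", "wears_uniform": True,  "is_clerk": True},
    {"name": "Dan",  "wears_uniform": False, "is_clerk": True},  # clerk in street clothes
]

def find_clerks_algorithm(people):
    """Algorithm: ask every single person 'Do you work here?' Slow but sure."""
    return [p["name"] for p in people if p["is_clerk"]]

def find_clerks_heuristic(people):
    """Heuristic: approach whoever resembles the 'typical' clerk (a uniform).
    One glance instead of four questions, at the cost of missing Dan."""
    return [p["name"] for p in people if p["wears_uniform"]]

print(find_clerks_algorithm(people))  # ['Cara', 'Dan']: complete, but costly
print(find_clerks_heuristic(people))  # ['Cara']: fast, with a small risk of error
```

The heuristic trades completeness for speed, which is exactly the efficiency-versus-error trade-off described above.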

WE CAN RESPECT—BUT NOT TRUST—OUR INTUITIONS

Heuristics are mental shortcuts that sometimes lead to errors. However, it would be just as wrong to
conclude that we should never trust our intuition as it is to believe that we can always trust our intuition.
We cannot avoid using our intuition; we need our intuition, and sometimes our intuition is wonderfully
insightful. So the intelligent middle ground is to respect but not blindly trust our intuitions.

The Confirmation Bias: A Dangerous Way to Think.

Without realizing it, we often think in terms of the confirmation bias, searching for evidence that confirms
what we already believe and ignoring evidence that contradicts our beliefs. For example, let's say your
intuition is flirting with astrology as a way to describe personality. It seems to describe you and others you
know well. As your belief grows a little stronger, you start to "see" more evidence that confirms what you
are starting to believe. Your roommate is a Pisces and they are supposed to be moody—and you start to
notice how moody your roommate is. However, looking at astrology through the lens of science tells a
different story. Dean and Kelly (2003) did not find correlated personalities when they studied 2,100
time-twins (individuals born within minutes of one another), who should have been similar if astrology
were accurate.

Nickerson (1998) believes that the confirmation bias is the leading cause of "disputes, altercations,
and misunderstandings that occur among individuals, groups, and nations". Confirmation bias can work
with any belief system.

Here's a minute-and-a-half video that further explains the concept of confirmation bias.

https://www.youtube.com/watch?v=6xMaR8au-YU

The Hindsight Bias: A Self-Deceiving Way to Think. The hindsight bias occurs when you believe that
you could have predicted an outcome but only after you know what happened. It's the false belief that "I
knew it all along." The hindsight bias may be just as subtle and dangerous as the confirmation bias
because it creates an illusion of understanding that makes it difficult to learn from the past (Fischhoff,
1975, 2002, 2007).

For example, since we can now "connect the dots" that led to the 9/11 terror attacks, we falsely
believe that we should have been able to predict them (see Bernstein, Erdfelder, Meltzoff, Peria, & Loftus,
2011). Authorities conveniently forget about February 26, 1993, when a yellow Ryder van was detonated
in a failed attempt to topple the North Tower into the South Tower. Despite this attack, everyone was
shocked and surprised by the second, successful attack; at the time, people did not predict what was
going to happen in the future. The danger of the hindsight bias is that it gives an illusion of understanding.

Additional information on hindsight bias

https://www.youtube.com/watch?v=Ybj1RQ7lncU

The Negativity Bias: Bad Is More Memorable Than Good. The negativity bias is our automatic
tendency to notice and remember negative information better than positive information. For example,
unpleasant odors are perceived as more intense and evoke stronger emotional reactions than pleasant
odors (Royet, Plailly, Delon-Martin, Kareken, & Segebart, 2003). It's easier and faster for us to find an
angry face hidden among happy faces than a happy face hidden among sad faces (Hansen & Hansen,
1988). "Better safe than sorry" seems to be the guiding motto of the negativity bias (Fiedler, Freytag, &
Meiser, 2009; Öhman & Mineka, 2001)—it's good for us to pay attention to people in our environment who
are angry, to bad smells, and so on.

The automatic negativity bias can be both good and bad. It's good when it keeps us alive but it can
be exhausting! If you live in tornado alley within the United States, then you are keenly aware that dark
clouds and high winds sometimes predict devastating tornados. People with a greater negativity bias will
be more likely to find shelter and survive. You may only get to be wrong once about a tornado. The "key
organizing principle" for managing our negativity bias seems to be to "minimize danger and maximize
reward ... at each point in time" (Williams et al., 2009, p. 804). Or, as social psychologists keep reminding
us, our behavior and thought processes depend on the situation.

Negativity Bias further explained:

https://www.youtube.com/watch?v=E09077HRurg

REFERENCES

Textbooks

Heinzen, T., & Goodfriend, W. (2019). Social psychology.

Brehm, S., & Kassin, S. (1996). Social psychology (3rd ed.).

Online References
https://www.youtube.com/watch?v=E09077HRurg
www.investopedia.com
https://www.youtube.com/watch?v=Ybj1RQ7lncU



https://www.youtube.com/watch?v=6xMaR8au-YU
https://www.google.com/search?q=logic+vs+intuition&sxsrf=ALeKk02gFK_zeeazFqk7d-sxoT33UP2dvg:1613215170995&source
https://www.verywellmind.com/what-is-a-confirmation-bias-2795024

Learning Materials
Worksheets (teacher-made)
