
Cognition and Emotion

ISSN: 0269-9931 (Print) 1464-0600 (Online) Journal homepage: http://www.tandfonline.com/loi/pcem20

Perceiving emotions: Cueing social categorization processes and attentional control through facial expressions

Elena Cañadas, Juan Lupiáñez, Kerry Kawakami, Paula M. Niedenthal & Rosa Rodríguez-Bailón

To cite this article: Elena Cañadas, Juan Lupiáñez, Kerry Kawakami, Paula M. Niedenthal & Rosa Rodríguez-Bailón (2016) Perceiving emotions: Cueing social categorization processes and attentional control through facial expressions, Cognition and Emotion, 30:6, 1149-1163, DOI: 10.1080/02699931.2015.1052781

To link to this article: http://dx.doi.org/10.1080/02699931.2015.1052781

Published online: 21 Jul 2015.


COGNITION AND EMOTION, 2016
Vol. 30, No. 6, 1149-1163, http://dx.doi.org/10.1080/02699931.2015.1052781

Perceiving emotions: Cueing social categorization processes and attentional control through facial expressions
Elena Cañadas1, Juan Lupiáñez1, Kerry Kawakami2, Paula M. Niedenthal3, and Rosa Rodríguez-Bailón4

1 Department of Experimental Psychology, University of Granada, Granada, Spain
2 Department of Psychology, York University, Toronto, ON, Canada
3 Laboratory of Cognitive and Social Psychology, Blaise Pascal University, Clermont-Ferrand, France
4 Department of Social Psychology, University of Granada, Granada, Spain
(Received 10 March 2014; accepted 14 May 2015)
Individuals spontaneously categorise other people on the basis of their gender, ethnicity and age.
But what about the emotions they express? In two studies we tested the hypothesis that facial
expressions are similar to other social categories in that they can function as contextual cues to control
attention. In Experiment 1 we associated expressions of anger and happiness with specific proportions
of congruent/incongruent flanker trials. We also created consistent and inconsistent category members
within each of these two general contexts. The results demonstrated that participants exhibited a
larger congruency effect when presented with faces in the emotional group associated with a high
proportion of congruent trials. Notably, this effect transferred to inconsistent members of the group.
In Experiment 2 we replicated the effects with faces depicting true and false smiles. Together these
findings provide consistent evidence that individuals spontaneously utilise emotions to categorise
others and that such categories determine the allocation of attentional control.

Keywords: Categorization; Attentional control; Emotion; Social-context-specific proportion congruency effect.

Emotional expressions of others play an important role in people's everyday situations, especially in social interactions (Nachson, 1995). Facial expressions communicate information about the individual and the environment that elicits rapid responses in the observer (Niedenthal & Brauer, 2012). Such responses include inferences about the state of mind of the individual who expresses the emotion and changes in the observer's own behaviour to deal with the situation (Adams, Ambady, Macrae, & Kleck, 2006; Anderson & Thompson, 2004; Hareli & Hess, 2012; Keltner & Kring, 1998; Marsh, Kozak, & Ambady, 2007). Although social psychologists have explored the types of

Correspondence should be addressed to: Elena Cañadas, Department of Organizational Behavior, University of Lausanne, Bâtiment Internef, 1015 Lausanne, Switzerland. E-mail: elena.canadas@unil.ch
Elena Cañadas is currently at the University of Lausanne, Switzerland. Paula M. Niedenthal is now at the University of Wisconsin-Madison, USA.

© 2015 Taylor & Francis



information conveyed by facial expressions, less is known about whether and how facial expressions serve to regulate cognitive processing in observers. The present set of experiments was designed to explore whether facial expressions of emotion can cue attentional control similarly to classic social categories such as gender (Fiske & Neuberg, 1990).

ATTENTIONAL CONTROL AND SOCIAL CATEGORIES

Attentional control can be assessed by the extent to which it moderates performance on interference tasks. In a standard flanker task, for example, the speed with which individuals process a central target is affected by the congruence between the target and surrounding distractors (Eriksen & Eriksen, 1974). Responses indicative of the direction of a target arrow are faster when surrounding arrows point in the same (i.e., congruent) direction compared to the opposite (i.e., incongruent) direction. This is known as the congruency effect (e.g., Stroop, 1935). However, the automatic interference caused by incongruent stimulus displays is modulated by the proportion of incongruent trials presented in the task. Specifically, the congruency effect is attenuated when proportionally more trials contain incongruent stimuli, suggesting that distractor processing is the greatest when the likelihood of congruent trials is high (Gratton, Coles, & Donchin, 1992; Lowe & Mitterer, 1982). This effect is known as the proportion congruency effect (Logan & Zbrodoff, 1979).

In order to address the possibility that awareness of the contingencies is the primary mechanism for attentional control cueing, researchers have examined the impact of stimuli (item-specific control; Jacoby, Lindsay, & Hessels, 2003) and contexts (context-specific control; Crump, Gong, & Milliken, 2006) on this effect. Findings from such studies suggest that manipulation of the proportion of congruent trials leads to the development of specific implicit expectancies about the need for control in subsequent trials (Braver, Gray, & Burgess, 2007). However, if the overall proportion of congruency associated with a specific task stimulus is uninformative (i.e., 50% congruent and 50% incongruent), no expectancy is observed.

The role of categorical knowledge in attentional control has been suggested by studies showing that the proportion congruency effect transfers to previously unseen, but categorically related items. Bugg, Jacoby, and Chanani (2011) used a Stroop picture-word task in which participants were instructed to name the word aloud. Four animal words (i.e., bird, cat, dog and fish) and their corresponding pictures were paired and associated with either a high or low proportion of congruency. For example, pictures related to birds and cats were paired with a high proportion of congruent trials (75%), whereas pictures related to dogs and fish were paired with a low proportion of congruent trials (25%). The two sets of stimuli (pictures and words) were allowed to overlap. In an additional block, in which the proportion of congruency was neutral (50%), the same words were paired with pictures of new exemplars from the same (trained) animal categories. Results replicated previous findings by showing a larger congruency effect for items associated with a high than with a low proportion of congruent trials (Crump & Milliken, 2009). This effect occurred not only with the previously presented pictures but also with novel stimuli for which congruency was neutral. These findings indicate that category knowledge allocates attentional control in a flexible and automatic way independently of task demands.

Although categorization processes occur in a range of contexts in which individuals have high levels of perceptual expertise, categorization of social stimuli is ubiquitous (Brewer, 1988; Cuddy, Fiske, & Glick, 2004; Kawakami, Dion, & Dovidio, 1998; Nelson, 2005). Human faces in particular are important cues for attentional control because they provide valuable information about potential interaction partners (Johnson & Morton, 1991). For example, in two separate experiments using a modified flanker task, we demonstrated that participants spontaneously use the gender of faces as a context to flexibly control attention (Canadas, Rodriguez-Bailon, Milliken, & Lupianez, 2013). Specifically, participants in these studies were presented with an image of a face and a stimulus




array consisting of five arrows above or below the face on each trial. The faces served as the context for the stimulus arrays. In the congruent conditions, all five arrows pointed in the same direction. In the incongruent conditions, the central arrow and the four flanking distractors pointed in opposite directions. Participants were instructed to respond as quickly and accurately as possible to the direction of the central arrow by pressing the appropriate key. Critically, a given context (e.g., male faces) was associated with a high proportion of congruent trials, whereas the other context (e.g., female faces) was associated with a low proportion of congruent trials. As expected, reduced interference effects were found when the arrows appeared in a context (i.e., male or female faces) associated with a low proportion of congruent stimulus arrays, indicating that the social category cued the increased allocation of control in this context (Crump et al., 2006).

The consistency of faces within the group was also manipulated in these studies. In particular, one face in each category was associated with the same proportion of congruency as the alternative category. For instance, a single female face was paired with the proportion of congruency associated with male faces or a single male face was paired with the proportion of congruency associated with female faces. Furthermore, in one of the experiments we included an additional block of trials in which novel faces of men and women were paired with an unbiased proportion of congruency (50%). The results provided evidence that the proportion congruency effects related to consistent category faces transferred to both the inconsistent category faces (i.e., the male face associated with the proportion of congruency of the female faces and the female face associated with the proportion of congruency of the male faces) and the unbiased novel faces. These transfer effects indicate that participants deployed more attentional control when presented with faces from the category associated with a low proportion versus high proportion of congruent trials regardless of the proportion of congruency associated with a specific face.

Moreover, subsequent research indicates that participants' motivation to pay attention to individual vs. category-related features of faces (Canadas et al., 2013), expertise (Tanaka, 2001; Tanaka & Taylor, 1991) and familiarity (Canadas, Lupiáñez, Niedenthal, Rychlowska, & Rodríguez-Bailón, in preparation) can moderate the extent to which social categories or individual faces influence the allocation of attentional control. Together these results suggest that although general categorization processes underlie person perception, such processes can be influenced by perceiver-related factors (Fiske & Neuberg, 1990; Hugenberg, Young, Bernstein, & Sacco, 2010).

The present research

The primary goal of the present research was to explore whether other social information conveyed by human faces also serves as contextual cues for attentional control. While previous research has demonstrated that immediately evident visual cues related to socially relevant groups such as race/ethnicity, gender and age spontaneously cue categorization processes (Fiske & Neuberg, 1990; Macrae & Bodenhausen, 2000) and serve as signals for attentional control (Canadas et al., 2013), we investigated whether emotional cues function in the same way. In particular, in two experiments we examined the impact of emotional facial expressions on attentional control.

Recent theorising suggests that emotional expressions are crucial for social interactions because they facilitate and promote communication between peers and signal appropriate behaviour (Harmon-Jones & Sigelman, 2001; Van Kleef, 2009). These theorists propose that because emotional expressions are perceived to be informative in social situations, people rapidly perceive and use them in order to understand and react to others. Just like social categories such as Blacks and Whites can trigger approach and avoidance orientations (Kawakami, Phills, Steele, & Dovidio, 2007; Phills, Kawakami, Tabi, Nadolny, & Inzlicht, 2011), emotional expressions can determine such immediate responses (Harmon-Jones & Sigelman, 2001). Notably, while the majority of research investigating categorical processing by social psychologists has been related to relatively stable characteristics such as race/ethnicity, gender, socio-economic status and




age, emotions may to some extent be assumed to be more context-dependent and variable. Even though these visible cues may be perceived to be less predictive of underlying personality traits, because of their importance in person perception (Hareli & Hess, 2012), they have the potential to spontaneously prompt categorical processing. In the present research we investigated whether facial expressions related to anger, happiness and true and false smiles serve as cues of attentional control and result in categorical responding to even inconsistent members.

Specifically, in this research we associated facial expressions of emotion with a particular proportion of congruency in flanker task stimulus arrays. In Experiment 1, we examined whether two distinct emotions with opposing valence served as contexts for the allocation of attentional control and produced a proportion congruency effect similar to the one associated with more stable social categories such as gender. In Experiment 2, we investigated whether the proportion congruency effect occurs when subtle differences in affective expressions are used as context. Specifically, in Experiment 1 we examined responses to happy versus angry faces and in Experiment 2 we examined responses to Duchenne versus non-Duchenne smiles. Although these latter emotions related to true and false smiles have overlapping and disparate structural features (Ekman & Friesen, 1982; Frank, Ekman, & Friesen, 1993), research has demonstrated that people are adept at identifying and differentiating between these smiles (Ambadar, Cohn, & Reed, 2009; Hess & Kleck, 1994; Krumhuber, Manstead, Cosker, Marshall, & Rosin, 2009; Maringer, Krumhuber, Fischer, & Niedenthal, 2011; Miles & Johnston, 2007), probably because of the divergent social meaning associated with them (Niedenthal, Mermillod, Maringer, & Hess, 2010).

We predicted that both displays of distinct emotions (happy vs. angry) and of more subtle differences in emotions (true vs. false smiles) would cue social categorization processes and thereby determine the allocation of attentional resources. Specifically, for both studies, we predicted higher congruency effects for emotional expressions associated with a high proportion of congruency than for expressions associated with a low proportion of congruency. We also expected to find similar congruency effects for all faces expressing the same emotion regardless of whether a particular face was consistent or inconsistent with the congruency proportion for that emotion. Specifically, we hypothesised that participants' responses would be determined by the congruency proportions associated with the emotional categories (i.e., happy vs. angry or true vs. false smiles) rather than by congruency proportions associated with the face identity. Together the results from Experiments 1 and 2 have the potential to provide novel support for the spontaneous categorization of faces by emotion impacting on the allocation of attentional control.

EXPERIMENT 1

In Experiment 1 we associated different emotional contexts (i.e., happy vs. angry faces) with different proportions of congruent trials in a flanker task. For half of the participants, angry faces were associated with a high proportion of congruent trials and happy faces were associated with a low proportion of congruent trials. For the other half of the participants, these proportions were reversed. We expected that, for both facial expressions, the congruency effect (i.e., the difference in reaction times for incongruent minus congruent faces) would be higher for the emotional context associated with a high than low proportion of congruent trials. In addition, we created consistent and inconsistent category members within each of these two contexts. Specifically, consistent category members were made up of three face identities related to one emotional expression (e.g., angry) and associated with the same proportion of congruent trials (e.g., high). Inconsistent category members, alternatively, were a single face identity related to the same emotional expression (e.g., angry), but associated with the proportion of congruency (e.g., low) that was typical of the opposing emotional category (e.g., happy). In our first experiment, we used angry and happy expressions for several reasons. First, the expressions of happiness and anger involve the contraction of different facial muscles (Ekman, Friesen, & Hager, 2002). Second, these emotions




also differ in valence. Evidence suggests that different brain regions are involved in processing individual emotions and/or facial expressions with different valences. Finally, analyses of self-reports have confirmed that such emotions are negatively correlated (e.g., Bonanno & Keltner, 2004; Ekman, Friesen, & Ancoli, 1980) and generate contradictory impressions (Willis & Todorov, 2006). Happiness and anger expressions are thus associated with two very distinct emotional states and underlying neural bases and are therefore expected to trigger categorical processing of social stimuli and serve as contexts for the allocation of attentional control.

Importantly, we expected the difference in the congruency index between the high vs. low proportion congruent trials to be significant in the same direction for both consistent and inconsistent members, providing further evidence for the use of emotional stimuli as spontaneous social categories.

Figure 1. Panel A shows examples of angry (left) and happy (right) facial stimuli used in Experiment 1. Panel B shows examples of false smile (left) and true smile (right) stimuli used in Experiment 2. Reproduced with permission.

Method

Participants

Thirty-six undergraduate students (six males) at a Canadian university (M = 22.9 years) participated in the experiment for course credit. All participants were naïve to the purpose of the study.

Procedure

Stimulus presentation, timing and data collection were controlled using the E-Prime 2.0 software package, run on a standard Pentium 4 PC. Stimuli were presented on a 17″ computer screen and consisted of 16 different photographs of women from the NimStim Set of Facial Expressions (http://www.macbrain.org/resources.htm). The faces of eight different models each portrayed a happy and an angry expression (see Figure 1, Panel A).

Each participant was randomly assigned to one of four counterbalanced conditions and was presented with eight of the 16 photographs. Four photographs depicted happy expressions and four photographs (from different models) depicted angry expressions. In an initial pilot study, a separate group of participants (N = 20) was instructed to label the emotion of each target person. Over 80% of participants identified each emotion correctly.

In accordance with previous research on social categories (Canadas et al., 2013), the experiment included a modification of the Eriksen flanker task (Eriksen & Eriksen, 1974) in which each trial consisted of an initial 200 ms fixation cross followed by the central presentation of an emotional face representing the context (see Figure 2). After a 400 ms interval, five arrows appeared above or below the face for 2000 ms or until the participant responded. In the congruent condition, all five arrows pointed in the same direction. In the incongruent condition, the central arrow and the four flanking distractors pointed in opposite directions. Participants were instructed to respond as quickly and accurately as possible to the direction of the central arrow by pressing either the Z (left) or M (right) key. Participants were further instructed to carefully attend to all faces because at the end of the experiment they would be asked questions pertaining to these faces. The inter-stimulus interval was 1000 ms and participants were given breaks between each of five blocks.

As previously noted, three of the four face identities in each category were consistent with the




Figure 2. Presentation latencies in Experiment 1. Panel A is an example of a congruent flanker trial with a happy context face. Panel B is an example of an incongruent trial with an angry context face. Reproduced with permission.

proportion of congruency associated with the category, while the fourth face identity was inconsistent (see Figure 3). The group and the specific face identity associated with either a high or low proportion of congruency were randomly selected and counterbalanced across participants. Specifically, we created four different counterbalanced orders so that each group of faces was associated equally often with a high or low proportion of congruency. Furthermore, each specific face identity was associated equally often with the consistent and inconsistent condition. All participants performed one initial practice block consisting of 16 trials followed by five experimental blocks consisting of 128 trials each.

Results

The practice and the first block were considered learning trials and these data were not included in the analysis. Errors (3.1%) and outliers (i.e., latencies 2.5 SD above or below the mean, 1.8%) were excluded from the analyses. Data from one participant with an excessive proportion of errors (i.e., more than 20%) and four participants who experienced technical problems with the computer programme were also excluded from the analysis.

Analysis of the mean response latencies revealed a congruency effect, F(1, 30) = 106.07, p < .001; ηp² = .78. As expected, participants took longer to respond on incongruent (M = 651.03 ms; SD = 90.53; 95% CI [72.53, 121.01]) than on congruent (M = 567.15 ms; SD = 64.89; 95% CI [51.86, 86.74]) trials. To test our specific hypotheses related to the impact of target consistency on this congruency effect we computed an index of congruency (i.e., incongruent minus congruent trials; Funes, Lupianez, & Humphreys, 2010). The index was then submitted to a repeated measures ANOVA that included group congruency (high vs. low) and individual face consistency (category consistent vs. inconsistent) (see Table 1a).

A main effect of group congruency was observed, F(1, 30) = 6.10, p = .019; ηp² = .17, which was not qualified by individual face consistency, F(1, 30) = 1.12, p = .30; ηp² = .04. As expected, larger




Figure 3. Example of a trial when false smile faces were paired with high proportion congruent trials and true smile faces were paired with low proportion congruent trials. Reproduced with permission.

congruency effects were found for the high (M = 88.23 ms; SD = 49.94; 95% CI [39.91, 66.65]) compared to low (M = 79.83 ms; SD = 48.67; 95% CI [38.89, 65.06]) group congruency conditions and this pattern did not differ for consistent and inconsistent faces.

Furthermore, a corresponding analysis of error rates revealed only a significant main effect of congruency, F(1, 30) = 33.50, p < .001; ηp² = .53, such that participants made more mistakes on incongruent trials (M = 3.76%; SD = 3.93; 95% CI [3.14, 5.26]) than on congruent trials (M = 0.94%; SD = 1.86; 95% CI [1.48, 2.48]). No other main effects or interactions were significant, all Fs < 1 (see Table 1b).

Recent evidence suggests that different emotional expressions drive and capture attention differently (Fox et al., 2000; Ohman, Flykt, & Esteves, 2001; Vuilleumier & Schwartz, 2001). To examine this possibility, we explored whether the pattern of results described above was moderated by target emotion. Specifically, we conducted a type of emotion (happy vs. angry for the high proportion of congruency) by individual face consistency (category consistent vs. inconsistent) by group congruency (high vs. low proportion congruent) ANOVA on the index of congruency, with the first factor between subjects and the last two factors within subjects. As expected, while the main effect of group congruency was significant, F(1, 29) = 6.01, p = .02; ηp² = .17, this effect was not qualified by either the consistency of the individual faces, F(1, 29) = 1.13, p = .30; ηp² = .04, or the type of emotion in the high and low congruency trials, F(1, 29) < 1; ηp² = .009.

Table 1a. Congruency effect in ms for Experiments 1 and 2 (standard error in brackets)

                            Consistent faces (CF)   Inconsistent faces (IF)   Social-context-specific PCE (HPC-LPC)
Experiment                  HPC        LPC          HPC         LPC           CF         IF

Experiment 1
  Congruency effect (I-C)   85 (9.0)   79 (8.5)     92 (8.9)    80 (9.0)      5 (4.6)    12 (5.0)
Experiment 2
  Congruency effect (I-C)   102 (7.6)  91 (7.7)     107 (10.7)  93 (7.8)      11 (5.8)   14 (10.3)

HPC, high proportion of congruent trials; LPC, low proportion of congruent trials; I, incongruent; C, congruent.




Table 1b. Congruency effect in error rates for Experiments 1 and 2 (standard error in brackets)

                            Consistent faces (CF)   Inconsistent faces (IF)   Social-context-specific PCE (HPC-LPC)
Experiment                  HPC        LPC          HPC         LPC           CF          IF

Experiment 1
  Congruency effect (I-C)   2.6 (.7)   2.4 (.5)     3.1 (.7)    3.2 (.8)      0.2 (.6)    -0.1 (1.1)
Experiment 2
  Congruency effect (I-C)   4.0 (.9)   5 (1.0)      3.6 (.9)    4.2 (1.0)     -1 (0.6)    -0.5 (.8)

HPC, high proportion of congruent trials; LPC, low proportion of congruent trials; I, incongruent; C, congruent.
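The rightmost columns of Tables 1a and 1b are simply the congruency effect obtained under the high-proportion-congruent context minus the effect obtained under the low-proportion-congruent context, per face-consistency cell. A minimal check in Python (values copied from Table 1a; differences recomputed from the rounded cell values can deviate from the printed PCE column by about 1 ms, since the published column was presumably derived from unrounded means):

```python
# Social-context-specific proportion congruency effect (PCE):
# congruency effect in the HPC context minus the effect in the
# LPC context, for a given face-consistency cell.
def social_context_pce(effect_hpc, effect_lpc):
    return effect_hpc - effect_lpc

# Inconsistent faces, Table 1a:
exp1_if = social_context_pce(92, 80)   # Experiment 1: 12 ms
exp2_if = social_context_pce(107, 93)  # Experiment 2: 14 ms
print(exp1_if, exp2_if)
```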

Discussion

The primary aim of Experiment 1 was to explore whether distinct emotional expressions (i.e., happiness vs. anger) function as contextual cues to allocate attentional control. Our findings provide initial support for the extension of the social-context-specific proportion congruency effect to the domain of emotional expressions. In particular, emotional expressions associated with a high proportion of congruent trials showed a larger congruency effect than those associated with a low proportion of congruent trials. Moreover, in accordance with previous findings (Canadas et al., 2013), we also found that the social-context-specific proportion congruency effect observed in the consistent instances of each emotion group (e.g., three happy faces paired with congruent flanker trials) generalised to inconsistent members (i.e., one happy face paired with incongruent flanker trials); in fact the effect was even larger for inconsistent faces (see Table 1). These results suggest the existence of a fast categorization process related to emotions in which mere exposure to eight unfamiliar faces with divergent emotional expressions is sufficient to use these emotions as a basis for categorization and influence the subsequent allocation of attentional control.

EXPERIMENT 2

In Experiment 2 our goal was to explore whether subtypes of the same facial expression are sufficient to replicate the pattern of findings related to the distinct emotions in Experiment 1. We used facial expressions that are assumed to belong to the same general type of emotional expression, but which seem to communicate different meanings (Niedenthal et al., 2010), from happiness or joy (Frank et al., 1993; Frank & Stennett, 2001) to affiliation or dominance (Abel, 2002; Fogel, Nelson-Goens, Hsu, & Shapiro, 2000; Keltner, Moffitt, & Stouthamer-Loeber, 1995; LaBarre, 1947; Tipples, Atkinson, & Young, 2002). Specifically, in this study we included Duchenne (true smiles) and non-Duchenne (false or social smiles) as our social context. Our aim was to investigate whether the social-context-specific proportion congruency effect observed in Experiment 1 with distinct emotions and in our previous research on gender (Canadas et al., 2013; in preparation) would replicate when more subtly different expressions were used as social contexts. We predicted that because people place importance on learning to detect false or deceitful actions in others, they would spontaneously use true and false smiles to categorise people and process social contexts (Buck, 1988).

Previous work has identified the structural and dynamic features of true vs. false smiles (Bernstein, Sacco, Brown, Young, & Claypool, 2010; Cacioppo, Petty, Losch, & Kim, 1986; Ekman et al., 1980; Frank, 2002; Hess & Kleck, 1994; Maringer et al., 2011). In particular, in still photographs the Duchenne marker distinguishes these expressions. In true smiles, there is a pronounced contraction of the orbicularis oculi, the muscle around the eyes, which causes a lift of the cheeks, a narrowing of the eye opening, and consequently the appearance of




wrinkles around the eyes. In sum, given the social meaning of true vs. false smiles, the personality traits associated with them, and individuals' ability to correctly identify and differentiate between these expressions (Ambadar et al., 2009; Hess & Kleck, 1994; Krumhuber et al., 2007; Maringer et al., 2011; Miles & Johnston, 2007), we expected participants to automatically categorise facial stimuli on the basis of these two types of smiles and consequently to differentially allocate attentional control.

To examine this hypothesis, participants were presented with true or false smiles. While one type of smile (e.g., true smiles) was associated with a high proportion of congruency, the other type (e.g., false smiles) was presented with a low proportion of congruency. If individuals spontaneously use these subtle cues to categorise people according to emotion, they should show a larger congruency effect for the type of smile associated with a high (vs. low) proportion of congruent trials. Furthermore, based on previous findings, we expected this pattern even for inconsistent faces, which are associated with the opposite proportion of congruency as the rest of the category exemplars.

Method

Participants

Thirty-four students (one male) at a French university (M = 20.0 years; SD = 2.06) participated in the experiment for course credit. All participants were naïve to the purpose of the study.

Procedure

The procedure was similar to Experiment 1, except that the emotional stimuli in Experiment 2 were composed of 16 photographs of eight men portraying true and false smiles. These expressions were created using standard instructions (Ekman & Davidson, 1993) to elicit different smiles (see Figure 1, Panel B). Three consistent category members and one inconsistent category member within each of the two groups of true and false smiling facial expressions were created in accordance with the procedures described in Experiment 1. The emotional expression and the specific stimuli associated with a high or low proportion of congruency were randomly selected and counterbalanced across participants.

Participants were presented with eight of the 16 photographs: four depicting true smiles and four (from different models) depicting false smiles. They performed one practice block consisting of 16 trials, followed by five experimental blocks of 128 trials each.

In a pilot study, a separate group of participants (N = 30) was asked to rate each face according to its trustworthiness using a 1 (not at all trustworthy) to 7 (very trustworthy) Likert scale. Participants rated faces with true smiles (M = 4.93, SD = .73) as being significantly more trustworthy, and therefore true, than faces with false smiles (M = 3.39, SD = .89), t(29) = 7.21, p < .001. The faces were, however, equally matched in attractiveness, F(1, 29) = 1.04, p = .42.

Results

Consistent with the analytic strategy used in Experiment 1, the practice block and the first block were considered learning trials and, therefore, data from these blocks were not included in the analyses. Trials with errors (2.7%) and outliers (i.e., response latencies 2.5 SD above or below the mean, 1.8%) as well as data from one participant with excessive error rates (i.e., more than 20%) were also excluded from the analyses.

A preliminary analysis of mean response latencies revealed a significant congruency effect, F(1, 32) = 200.75, p < .001; ηp² = .86. In accordance with previous findings, participants took longer to respond on incongruent (M = 677.75 ms; SD = 82.18; 95% CI [66.09, 108.69]) than congruent (M = 578.75 ms; SD = 71.90; 95% CI [57.82, 95.10]) trials. As in Experiment 1, we computed an index of congruency (incongruent minus congruent trials) and submitted it to a repeated measures ANOVA with group congruency (high vs. low) and individual face consistency (consistent vs. inconsistent with the category) as within-subject factors.

As expected, the main effect of group congruency was significant, F(1, 32) = 5.76, p = .022; ηp² = .15, and the interaction between group congruency

COGNITION AND EMOTION, 2016, 30 (6) 1157


CAÑADAS ET AL.

and individual face consistency was not significant, F(1, 32) < 1; ηp² = .01. In accordance with our predictions, the congruency effect was larger in the high proportion congruent condition (M = 104.71 ms; SD = 52.67; 95% CI [42.36, 69.67]) than in the low proportion congruent condition (M = 92.13 ms; SD = 44.50; 95% CI [35.78, 58.85]), and this pattern did not differ for consistent and inconsistent faces, F(1, 32) < 1; ηp² = .01.

Furthermore, a corresponding analysis of error rates showed only a significant main effect of congruency, F(1, 32) = 26.69, p < .001; ηp² = .45. Specifically, participants made more mistakes on incongruent (M = 4.85%; SD = 5.20; 95% CI [4.18, 6.88]) than congruent (M = 0.66%; SD = 1.44; 95% CI [1.16, 1.91]) trials. Furthermore, the interaction between group congruency and individual face consistency was not significant, F(1, 32) < 1; ηp² = .00.

To further explore whether true and false smiles drive the capture of attention differently (Fox, 2002; Öhman et al., 2001; Vuilleumier & Schwartz, 2001), we conducted a Type of Smile (true vs. false for the high proportion of congruency) by Individual Face Consistency (category consistent vs. category inconsistent) by Group Congruency (high vs. low proportion congruent) ANOVA on the index of congruency, with the first factor as a between-subjects variable and the last two factors as within-subject variables. We found a main effect of group congruency, F(1, 31) = 5.65, p = .024; ηp² = .15, that was not qualified by either face consistency, F(1, 31) < 1; ηp² = .002, or type of smile, F(1, 31) = 1.31, p = .26; ηp² = .04.

Discussion

The results of Experiment 2 demonstrated that the categorization processes and the associated pattern of allocation of attentional control found in Experiment 1 replicated with more subtle emotional expressions. Specifically, we showed that categorization effects related to emotional expressions, as indexed by social-context-specific proportion congruency effects, occur independent of the consistency of the faces. That is, participants applied the same cognitive control to consistent true smiles associated with a high proportion of congruent trials and to inconsistent true smiles associated with a low proportion of congruent trials. The fact that we used the same category of expression (i.e., smiles) with subtle markers that differentiated true from false smiles suggests that individuals attend closely to the emotional expressions of others. Because people are experts at identifying others' expressions, these expressions are used to define social contexts even when emotions are not directly relevant to the task at hand or are associated with very subtle cues.

Together, the results in both Experiments 1 and 2 lead us to conclude that emotional expressions (i.e., angry vs. happy and true vs. false smiles) function as do other important social categories such as sex; that is, they serve as cues for the categorization of faces and the subsequent contextual allocation of attentional control. Given that the methods of the two experiments were nearly identical (with only the type of emotion and related stimuli differing), we performed an overall analysis to increase statistical power. The results of the joint analysis (group congruency × face consistency × experiment) demonstrated a strong main effect of group congruency [F(1, 62) = 11.07, p = .001; ηp² = .15; observed power = .91]. Furthermore, once again and as predicted, the group congruency × face consistency interaction was not significant [F(1, 62) < 1, p = .47, ηp² = .00; observed power = .01].

GENERAL DISCUSSION

The present experiments investigated the possibility that facial expressions of emotion function as contextual cues to allocate attentional control. In Experiment 1, we focused on angry and happy expressions. These expressions are considered to be opposite in conveyed valence, to involve different facial muscles and to produce different emotional responses (Ekman et al., 2002; Russell & Carroll, 1999; Winkielman, Berridge, & Wilbarger, 2005). In Experiment 2, we focused on true and false smiles, which are associated with the



EMOTIONS CAN CUE SOCIAL CATEGORIZATION

same expression but have subtly different facial features that convey distinct social meaning about the trustworthiness of the target person.

Together, the results of Experiments 1 and 2 suggest that participants are sensitive to emotional expression in face perception and categorise faces accordingly; these categories, and the proportion of congruency associated with them, determine the allocation of attentional resources. Specifically, the findings indicate that participants used facial expressions rather than individual identities when performing the attentional task. A clear pattern emerged across the two experiments: happy vs. angry faces and true vs. false smiles served as cues to categorise faces. Together, these findings indicate that participants were able to identify both explicit and more subtle affective expressions and to use these emotions as contexts to cue the allocation of attentional control. These results support the classic models of social perception. According to such models, we process others initially according to social category membership (Brewer, 1988; Fiske & Neuberg, 1990; Hugenberg et al., 2010) and only subsequently according to idiosyncratic characteristics. Furthermore, only when perceivers are motivated to look at perceived individuals in greater detail will this latter stage occur.

An alternative account of the categorization effect demonstrated here is that participants learned the emotional contexts (i.e., 2 emotional expressions) more easily than the individual contexts (i.e., 8 identities). However, in a recent study conducted in our laboratory we again presented 8 stimuli, this time 4 animals and 4 tools instead of two groups of faces, and in this case participants showed attentional control based on the specific proportion of congruency associated with the stimulus and not on the proportion of congruency associated with the group.¹ Although previous studies have also demonstrated that participants can individuate rather than categorise others when they are given explicit instructions to attend to others (Cañadas et al., 2013), we recommend that future research further explore the mechanisms that underlie individuation-categorization processes to better understand when people will and will not apply attentional control.

Even though the present experiments provide convincing evidence that distinct and subtle emotional expressions support the categorization of faces, it is unclear whether the way perceivers react to specific expressions, and their ability to fully discriminate between these emotions, play an important role in these processes (Fiske et al., 1987). In particular, it is not clear what impact the embodiment of these emotions and empathising with the target have on the extent to which these facial features are used in categorization processes. Because emotions and social contexts can facilitate and promote future interactions between people and signal approach or avoidance orientations (Marsh, Ambady, & Kleck, 2005; Harmon-Jones & Sigelman, 2001), it is imperative to better understand these mechanisms.

One strategy to achieve this goal is to manipulate or control the emotional state of participants. Because the feelings of participants can interact with the perception of the faces presented, they can play an important role in evaluating the genuineness of smiles (Forgas & East, 2008). While positive mood can increase the perceived trustworthiness of happy faces, negative mood can decrease such judgements. Accordingly, it would be interesting in future research to examine whether inducing context-congruent emotions, in comparison to a neutral state, would increase categorization and congruency effects. Alternatively, blocking facial expressions and inhibiting mimicry of perceived emotions (Niedenthal et al., 2010) may work to reduce categorization and congruency effects.
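The trial-exclusion and congruency-index steps described in the Results above can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis code; the function names and data layout are our own, and the population SD is used purely for simplicity.

```python
# Illustrative sketch of the analytic steps described in the Results:
# (1) trim response latencies more than 2.5 SD above or below the mean,
# (2) compute the congruency index (incongruent minus congruent RT).
# Not the authors' code; names and data layout are assumptions.
from statistics import mean, pstdev


def trim_outliers(rts, criterion=2.5):
    """Drop latencies more than `criterion` SDs above or below the mean."""
    m, sd = mean(rts), pstdev(rts)
    return [rt for rt in rts if abs(rt - m) <= criterion * sd]


def congruency_index(congruent_rts, incongruent_rts):
    """Congruency effect in ms: mean incongruent minus mean congruent RT."""
    return mean(incongruent_rts) - mean(congruent_rts)
```

For the grand means reported above (578.75 ms congruent, 677.75 ms incongruent), the index works out to 99 ms.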

¹ Categorisation is not always the outcome when using this paradigm. As part of a doctoral dissertation (accessible at: http://digibug.ugr.es/bitstream/10481/25133/1/21595574.pdf) we conducted a study using the same basic procedure as the two experiments presented here, but that included as stimuli four animals and four tools (instead of two groups of faces). The results of this study showed a significant group × stimuli consistency interaction, F(1, 28) = 7.17, p = .012; ηp² = .20, in which the effect of the inconsistent pictures was opposite to that of the consistent pictures. Therefore, in this study we found individuation rather than categorisation. We assume that in this latter case participants learned about eight individual contexts rather than about two categories.
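For a one-degree-of-freedom effect in designs like these, partial eta squared follows directly from the reported F statistic and its degrees of freedom, ηp² = F·df_effect / (F·df_effect + df_error). The short sketch below is ours, not the authors', but it reproduces the effect sizes reported in the Results and in the footnote above.

```python
# Partial eta squared recovered from an F statistic and its degrees of
# freedom: eta_p^2 = (F * df_effect) / (F * df_effect + df_error).
def partial_eta_squared(f, df_effect, df_error):
    return (f * df_effect) / (f * df_effect + df_error)


# E.g., the congruency effect F(1, 32) = 200.75 corresponds to
# eta_p^2 = .86, and the footnote's F(1, 28) = 7.17 to eta_p^2 = .20.
```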


In conclusion, emotional expressions are important in social interaction: they facilitate and promote communication between peers, and signal whether to approach or avoid a person or environment (Harmon-Jones & Sigelman, 2001). From a cognitive approach, emotional expressions have been shown to selectively guide attention. In the present research, we utilised a multi-disciplinary approach by bringing together the literature on social categorization processes, emotion and attentional control. In particular, we replicated the context-specific proportion congruency effect with emotional faces as context cues. These findings provide new evidence that emotional expressions can spontaneously lead to categorization processes underlying the contextual allocation of attentional control and other related attentional processes.

Disclosure statement

No potential conflict of interest was reported by the authors.

Funding

This research was financially supported by the Spanish Ministry of Education, Culture and Sports through a predoctoral grant [FPU: AP2007-04250] to the first author, and research grants [PSI2010-17877, PSI2011-22416, PSI2014-52764-P and PSI2013-45678-P] to the second and last authors.

REFERENCES

Abel, M. H. (2002). The elusive nature of smiling. In M. H. Abel (Ed.), An empirical reflection on the smile (pp. 1–13). New York, NY: Edwin Mellen Press.

Adams, R. B., Ambady, N., Macrae, C. N., & Kleck, R. E. (2006). Emotional expressions forecast approach-avoidance behavior. Motivation and Emotion, 30, 179–188. doi:10.1007/s11031-006-9020-2

Ambadar, Z., Cohn, J. F., & Reed, L. I. (2009). All smiles are not created equal: Morphology and timing of smiles perceived as amused, polite, and embarrassed/nervous. Journal of Nonverbal Behavior, 33, 17–34. doi:10.1007/s10919-008-0059-5

Anderson, C., & Thompson, L. L. (2004). Affect from the top down: How powerful individuals' positive affect shapes negotiations. Organizational Behavior and Human Decision Processes, 95(2), 125–139. doi:10.1016/j.obhdp.2004.05.002

Bernstein, M. J., Sacco, D. F., Brown, C. M., Young, S. G., & Claypool, H. M. (2010). A preference for genuine smiles following social exclusion. Journal of Experimental Social Psychology, 46, 196–199. doi:10.1016/j.jesp.2009.08.010

Bonanno, G. A., & Keltner, D. (2004). The coherence of emotion systems: Comparing on-line measures of appraisal and facial expressions, and self-report. Cognition and Emotion, 18, 431–444.

Braver, T. S., Gray, J. R., & Burgess, G. C. (2007). Explaining the many varieties of working memory variation: Dual mechanisms of cognitive control. In A. R. A. Conway, D. Jarrold, M. J. Kane, A. Miyake, & J. N. Towse (Eds.), Variation in working memory (pp. 76–106). Oxford: Oxford University Press.

Brewer, M. B. (1988). A dual process model of impression formation. In T. K. Srull & J. R. S. Wyer (Eds.), Advances in social cognition (Vol. 1, pp. 1–36). Hillsdale, NJ: Erlbaum.

Buck, R. (1988). Human motivation and emotion. New York, NY: Wiley.

Bugg, J. M., Jacoby, L. L., & Chanani, S. (2011). Why it is too early to lose control in accounts of item-specific proportion congruency effects. Journal of Experimental Psychology: Human Perception and Performance, 37, 844–859. doi:10.1037/A0019957

Cacioppo, J. T., Petty, R. E., Losch, M. E., & Kim, H. S. (1986). Electromyographic activity over facial muscle regions can differentiate the valence and intensity of affective reactions. Journal of Personality and Social Psychology, 50, 260–268. doi:10.1037/0022-3514.50.2.260

Cañadas, E., Rodríguez-Bailón, R., Milliken, B., & Lupiáñez, J. (2013). Social categories as a context for the allocation of attentional control. Journal of Experimental Psychology: General, 142, 934–943. doi:10.1037/A0029794

Crump, M. J., Gong, Z., & Milliken, B. (2006). The context-specific proportion congruent Stroop effect: Location as a contextual cue. Psychonomic Bulletin & Review, 13, 316–321.

Crump, M. J. C., & Milliken, B. (2009). The flexibility of context-specific control: Evidence for context-driven generalization of item-specific control settings. Quarterly Journal of Experimental Psychology, 62, 1523–1532. doi:10.1080/17470210902752096


Cuddy, A. J. C., Fiske, S. T., & Glick, P. (2004). When professionals become mothers, warmth doesn't cut the ice. Journal of Social Issues, 60, 701–718. doi:10.1111/j.0022-4537.2004.00381.x

Ekman, P., & Davidson, R. J. (1993). Voluntary smiling changes regional brain activity. Psychological Science, 4, 342–345. doi:10.1111/j.1467-9280.1993.tb00576.x

Ekman, P., & Friesen, W. V. (1982). Felt, false, and miserable smiles. Journal of Nonverbal Behavior, 6, 238–252. doi:10.1007/Bf00987191

Ekman, P., Friesen, W. V., & Ancoli, S. (1980). Facial signs of emotional experience. Journal of Personality and Social Psychology, 39, 1125–1134. doi:10.1037/h0077722

Ekman, P., Friesen, W. V., & Hager, J. C. (2002). The new Facial Action Coding System (FACS). Salt Lake City: Research Nexus eBook.

Eriksen, B. A., & Eriksen, C. W. (1974). Effects of noise letters upon the identification of a target letter in a nonsearch task. Perception & Psychophysics, 16, 143–149. doi:10.3758/BF03203267

Fiske, S. T., Neuberg, S. L., Beattie, A. E., & Milberg, S. J. (1987). Category-based and attribute-based reactions to others: Some informational conditions of stereotyping and individuating processes. Journal of Experimental Social Psychology, 23, 399–427.

Fiske, S. T., & Neuberg, S. L. (1990). A continuum of impression formation, from category-based to individuating processes: Influences of information and motivation on attention and interpretation. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 23, pp. 1–108). San Diego, CA: Academic Press.

Fogel, A., Nelson-Goens, G. C., Hsu, H., & Shapiro, A. (2000). Do different infant smiles reflect different positive emotions? Social Development, 9, 497–522.

Fox, E. (2002). Processing emotional facial expressions: The role of anxiety and awareness. Cognitive, Affective, & Behavioral Neuroscience, 2(1), 52–63. doi:10.3758/CABN.2.1.52

Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition and Emotion, 14(1), 61–92. doi:10.1080/026999300378996

Forgas, J. P., & East, R. (2008). How real is that smile? Mood effects on accepting or rejecting the veracity of emotional facial expressions. Journal of Nonverbal Behavior, 32, 157–170. doi:10.1007/s10919-008-0050-1

Frank, M. G. (2002). Smiles, lies, and emotion. In M. H. Abel (Ed.), An empirical reflection on the smile (pp. 15–44). New York, NY: Edwin Mellen Press.

Frank, M. G., Ekman, P., & Friesen, W. V. (1993). Behavioral markers and recognizability of the smile of enjoyment. Journal of Personality and Social Psychology, 64, 83–93.

Frank, M. G., & Stennett, J. (2001). The forced-choice paradigm and the perception of facial expressions of emotion. Journal of Personality and Social Psychology, 80(1), 75–85. doi:10.1037/0022-3514.80.1.75

Funes, M. J., Lupiáñez, J., & Humphreys, G. (2010). Analyzing the generality of conflict adaptation effects. Journal of Experimental Psychology: Human Perception and Performance, 36(1), 147–161. doi:10.1037/A0017598

Gratton, G., Coles, M. G., & Donchin, E. (1992). Optimizing the use of information: Strategic control of activation of responses. Journal of Experimental Psychology: General, 121(4), 480–506.

Hareli, S., & Hess, U. (2012). The social signal value of emotions. Cognition & Emotion, 26, 385–389. doi:10.1080/02699931.2012.665029

Harmon-Jones, E., & Sigelman, J. (2001). State anger and prefrontal brain activity: Evidence that insult-related relative left-prefrontal activation is associated with experienced anger and aggression. Journal of Personality and Social Psychology, 80, 797–803. doi:10.1037/0022-3514.80.5.797

Hess, U., & Kleck, R. E. (1994). The cues decoders use in attempting to differentiate emotion-elicited and posed facial expressions. European Journal of Social Psychology, 24, 367–381. doi:10.1002/ejsp.2420240306

Hugenberg, K., Young, S. G., Bernstein, M. J., & Sacco, D. F. (2010). The categorization-individuation model: An integrative account of the other-race recognition deficit. Psychological Review, 117, 1168–1187. doi:10.1037/A0020463

Jacoby, L. L., Lindsay, D. S., & Hessels, S. (2003). Item-specific control of automatic processes: Stroop process dissociations. Psychonomic Bulletin & Review, 10, 638–644. doi:10.3758/BF03196526

Johnson, M., & Morton, J. (1991). Biology and cognitive development: The case of face recognition. Oxford: Blackwell.

Kawakami, K., Dion, K. L., & Dovidio, J. F. (1998). Racial prejudice and stereotype activation. Personality and Social Psychology Bulletin, 24, 407–416. doi:10.1177/0146167298244007


Kawakami, K., Phills, C. E., Steele, J. R., & Dovidio, J. F. (2007). (Close) distance makes the heart grow fonder: Improving implicit racial attitudes and interracial interactions through approach behaviors. Journal of Personality and Social Psychology, 92, 957–971. doi:10.1037/0022-3514.92.6.957

Keltner, D., & Kring, A. M. (1998). Emotion, social function, and psychopathology. Review of General Psychology, 2, 320–342. doi:10.1037/1089-2680.2.3.320

Keltner, D., Moffitt, T. E., & Stouthamer-Loeber, M. (1995). Facial expressions of emotion and psychopathology in adolescent boys. Journal of Abnormal Psychology, 104, 644–652. doi:10.1037/0021-843X.104.4.644

Krumhuber, E., Manstead, A. S. R., Cosker, D., Marshall, D., & Rosin, P. L. (2009). Effects of dynamic attributes of smiles in human and synthetic faces: A simulated job interview setting. Journal of Nonverbal Behavior, 33, 1–15.

Krumhuber, E., Manstead, A. S. R., Cosker, D., Marshall, D., Rosin, P. L., & Kappas, A. (2007). Facial dynamics as indicators of trustworthiness and cooperative behavior. Emotion, 7, 730–735. doi:10.1037/1528-3542.7.4.730

LaBarre, W. (1947). The cultural basis of emotions and gestures. Journal of Personality, 16, 49–69.

Logan, G. D., & Zbrodoff, N. J. (1979). When it helps to be misled: Facilitative effects of increasing the frequency of conflicting stimuli in a Stroop-like task. Memory & Cognition, 7, 166–174. doi:10.3758/Bf03197535

Lowe, D. G., & Mitterer, J. O. (1982). Selective and divided attention in a Stroop task. Canadian Journal of Psychology, 36, 684–700. doi:10.1037/h0080661

Macrae, C. N., & Bodenhausen, G. V. (2000). Social cognition: Thinking categorically about others. Annual Review of Psychology, 51, 93–120. doi:10.1146/annurev.psych.51.1.93

Maringer, M., Krumhuber, E. G., Fischer, A. H., & Niedenthal, P. M. (2011). Beyond smile dynamics: Mimicry and beliefs in judgments of smiles. Emotion, 11, 181–187. doi:10.1037/a0022596

Marsh, A. A., Ambady, N., & Kleck, R. E. (2005). The effects of fear and anger facial expressions on approach- and avoidance-related behaviors. Emotion, 5, 119–124. doi:10.1037/1528-3542.5.1.119

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7, 239–251. doi:10.1037/1528-3542.7.2.239

Miles, L., & Johnston, L. (2007). Detecting happiness: Perceiver sensitivity to enjoyment and non-enjoyment smiles. Journal of Nonverbal Behavior, 31, 259–275. doi:10.1007/s10919-007-0036-4

Nachson, I. (1995). On the modularity of face recognition: The riddle of domain specificity. Journal of Clinical and Experimental Neuropsychology, 17, 256–275. doi:10.1080/01688639508405122

Nelson, T. D. (2005). Ageism: Prejudice against our feared future self. Journal of Social Issues, 61, 207–221. doi:10.1111/j.1540-4560.2005.00402.x

Niedenthal, P. M., & Brauer, M. (2012). Social functionality of human emotion. Annual Review of Psychology, 63, 259–285. doi:10.1146/annurev.psych.121208.131605

Niedenthal, P. M., Mermillod, M., Maringer, M., & Hess, U. (2010). The Simulation of Smiles (SIMS) model: Embodied simulation and the meaning of facial expression. Behavioral and Brain Sciences, 33, 417–433; discussion 433–480. doi:10.1017/s0140525x10000865

Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130, 466–478. doi:10.1037/0096-3445.130.3.466

Phills, C. E., Kawakami, K., Tabi, E., Nadolny, D., & Inzlicht, M. (2011). Mind the gap: Increasing associations between the self and blacks with approach behaviors. Journal of Personality and Social Psychology, 100, 197–210. doi:10.1037/A0022159

Russell, J. A., & Carroll, J. M. (1999). On the bipolarity of positive and negative affect. Psychological Bulletin, 125, 3–30.

Stroop, J. R. (1935). Studies of interference in serial verbal reactions. Journal of Experimental Psychology: General, 18, 643–662. doi:10.1037/0096-3445.121.1.15

Tanaka, J. W. (2001). The entry point of face recognition: Evidence for face expertise. Journal of Experimental Psychology: General, 130, 534–543. doi:10.1037/0096-3445.130.3.534

Tanaka, J. W., & Taylor, M. (1991). Object categories and expertise: Is the basic level in the eye of the beholder? Cognitive Psychology, 23, 457–482. doi:10.1016/0010-0285(91)90016-H

Tipples, J., Atkinson, A. P., & Young, A. W. (2002). The eyebrow frown: A salient social signal. Emotion, 2, 288–296. doi:10.1037/1528-3542.2.3.288

Van Kleef, G. A. (2009). How emotions regulate social life. Current Directions in Psychological Science, 18, 184–188. doi:10.1111/j.1467-8721.2009.01633.x


Vuilleumier, P., & Schwartz, S. (2001). Emotional facial expressions capture attention. Neurology, 56, 153–158. doi:10.1212/WNL.56.2.153

Willis, J., & Todorov, A. (2006). First impressions: Making up your mind after a 100-ms exposure to a face. Psychological Science, 17, 592–598. doi:10.1111/j.1467-9280.2006.01750.x

Winkielman, P., Berridge, K. C., & Wilbarger, J. L. (2005). Unconscious affective reactions to masked happy versus angry faces influence consumption behavior and judgments of value. Personality and Social Psychology Bulletin, 31, 121–135. doi:10.1177/0146167204271309
