
REVIEW ARTICLE

published: 02 November 2012


doi: 10.3389/fpsyg.2012.00471

Faces in context: a review and systematization of contextual influences on affective face processing

Matthias J. Wieser¹* and Tobias Brosch²

¹ Department of Psychology, University of Würzburg, Würzburg, Germany
² Department of Psychology, University of Geneva, Geneva, Switzerland

Edited by: Linda Isaac, Palo Alto VA and Stanford University, USA

Reviewed by: Richard J. Tunney, University of Nottingham, UK; Maria Pia Viggiano, University of Florence, Italy

*Correspondence: Matthias J. Wieser, Department of Psychology, Biological Psychology, Clinical Psychology, and Psychotherapy, University of Würzburg, Marcusstr. 9-11, D-97070 Würzburg, Germany. e-mail: wieser@psychologie.uni-wuerzburg.de

Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant basic emotion approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

Keywords: facial expression, face perception, emotion, context, basic emotion

INTRODUCTION

Dating back to Darwin (1872) it has been proposed that emotions are universal biological states that are accompanied by distinct facial expressions (Ekman, 1992). This basic emotion approach assumes that the expressions of emotion in a face and their perception are unique, natural, and intrinsic phenomena (Smith et al., 2005) that are reliable markers of emotions, co-vary with the subjective experience, belong to a whole set of emotional responses, are readily judged as discrete categories, and as such are essential for successful and efficient social interaction (Matsumoto et al., 2008).

As a consequence, experimental work on the perception of emotional facial expressions has often relied on a set of isolated, de-contextualized, static photographs of actors posing facial expressions that maximize the distinction between categories (Barrett et al., 2011) and give researchers substantial control of the duration, appearance, and physical properties of the stimulus. This work has demonstrated that humans generally are able to identify facial expressions with high accuracy when these are presented as singletons (Matsumoto, 2001; Elfenbein and Ambady, 2002).

With respect to the underlying neural mechanisms, cognitive neuroscience research has revealed that emotional face processing engages a widely distributed network of brain areas (Haxby et al., 2000; Haxby and Gobbini, 2011). Core brain regions of face processing are located in inferior occipital gyrus (occipital face area, OFA, Puce et al., 1996), lateral fusiform gyrus (fusiform face area, FFA, Kanwisher et al., 1997), and posterior superior temporal sulcus (pSTS, Hoffman and Haxby, 2000). According to an influential model of face perception, incoming visual information is first encoded structurally, based on the immediate perceptual input, and then transformed into a more abstract, perspective-independent model of the face that can be compared to other faces in memory (Bruce and Young, 1986). The structural encoding phase has been linked to computations in OFA, whereas the more abstract, identity-based encoding occurs in FFA (Rotshtein et al., 2005). The pSTS has been linked to the processing of dynamic information about faces including social signals such as eye gaze and emotional expression (Haxby and Gobbini, 2011). Other regions involved in face processing are amygdala and insula, implicated in the processing of emotion and facial expressions,
www.frontiersin.org November 2012 | Volume 3 | Article 471 | 1


Wieser and Brosch Faces in context a comprehensive review

regions of the reward circuitry such as caudate and orbitofrontal cortex, implicated in the processing of facial beauty and sexual relevance, inferior frontal gyrus, implicated in semantic aspects of face processing, and medial prefrontal cortex, implicated in theory of mind operations when viewing familiar faces of relatives and friends (Haxby et al., 2000; Ishai, 2008).

As mentioned above, the neurocognitive model of facial expression processing has been developed based on experiments using single, context-less faces, a situation rather artificial and low in ecological validity¹. Outside the laboratory, however, faces rarely are perceived as single entities and most likely appear within a situational context, which may have a strong impact on how they are perceived.

¹ Moreover, the stimuli usually consist of prototypical expressions posed by models, sometimes even actors. As such, they are hardly representative, but may sometimes appear exaggerated and caricatural (Scherer, 1992).

In this article, we review recent studies in which contextual influences on facial expression perception were investigated at the level of subjective perception and at the level of the underlying neural mechanisms. According to Brunswik's functional lens model of perception (Brunswik, 1956; see also Scherer, 2003), the transmission of a percept (such as an emotional expression) can be broken down into different stages, including the encoding of a state into distal cues by the sender, the actual transmission of the cues, and the perception of the proximal cues by the receiver. Accordingly, we will structure our review following this sequence, beginning with contextual effects occurring mainly during the encoding of the expression, followed by effects occurring during the transmission from sender to receiver, and finally looking at effects occurring mainly during the decoding/perception by the receiver. We additionally introduce a distinction between distal context effects coming from the face of the sender and effects transmitted by other channels of the sender, which results in the following four types of contextual influences on face perception: (1) within-face features (i.e., characteristics that occur in the same face as the core facial expression, such as eye gaze and facial dynamics), (2) within-sender features (i.e., concurrent context cues from the sender, such as affective prosody and body posture), (3) external features from the environment surrounding the face (i.e., concurrent multisensory cues originating outside the sender, such as visual scene, other faces, social situations), and (4) within-perceiver features (i.e., affective learning processes and implicit processing biases in the perceiver). Following this systematization of different types of context, we review and summarize experimental findings from studies where these four types of features served as context cues for the perception of facial expressions (see Figure 1).

CONTEXTUAL INFLUENCES ON FACE RECOGNITION AND PERCEPTION

WITHIN-FACE FEATURES

Eye gaze
Within the face, eye gaze is probably the most powerful contextual cue. For example, expressions of joy and anger appear to be considerably more intense when combined with direct than with averted gaze (Adams and Kleck, 2003, 2005). In a series of studies it has been shown that the categorization of happy and sad expressions as well as angry and fearful expressions was impaired when eye gaze was averted, in comparison to direct gaze conditions (Bindemann et al., 2008). However, other studies suggest an interaction of gaze direction and facial expressions, in that angry faces appear more intense and are more easily recognized when paired with a direct gaze, whereas the opposite seems to be true for fearful faces (Adams et al., 2003, 2006; Adams and Kleck, 2005; Sander et al., 2007; Adams and Franklin, 2009; Benton, 2010; Ewbank et al., 2010). This has been explained in the context of appraisal theory of emotion, which assumes that gaze and expression are always integrated by the observer during an appraisal of stimulus relevance (for a recent review, Graham and LaBar, 2012). In line with this assumption, angry faces are evaluated as being angrier when showing direct gaze, as eye contact implies a potential threat in the form of an imminent attack by the sender, while fearful faces are perceived as more fearful when showing averted gaze, as this might indicate potential threat in the environment (Adams and Kleck, 2003; Sander et al., 2007). These behavioral results were replicated by N'Diaye et al. (2009), who additionally observed increased activity in the amygdala as well as in fusiform and medial prefrontal areas to angry faces with direct gaze and fearful faces with averted gaze. This effect was found to be absent or considerably reduced in patients with right and left amygdala damage (Cristinzio et al., 2010). A recent study showed that amygdala responses to rapidly presented fear expressions are preferentially tuned to averted gaze, whereas more sustained presentations lead to preferential responses to fearful expressions with direct gaze (Adams et al., 2012a). This is in line with findings showing that gaze direction modulates expression processing only when the facial emotion is difficult to discriminate (Graham and LaBar, 2007). Moreover, it has been demonstrated that when gaze direction is rendered ambiguous and embedded in explicit, contextual information about the intentions of angry and fearful faces, a similar pattern of amygdala activation is observed as in prior results to non-ambiguous gaze cues: angry faces which were contextualized by explicit information targeting the observer ("The people bumped into you by accident. You are in a bad mood today and so you start to insult them. Thereupon, the people become very angry at/afraid of you") elicited stronger amygdala responses than angry faces targeting another person, whereas the opposite pattern was observed for fearful faces (Boll et al., 2011). By showing that contextual information interacts with facial expression in the same manner as gaze direction, this study underlines the significance and meaning of certain gaze directions for human observers. Altogether, gaze interacts with facial expression, but this also depends on the relative timing and the nature of the stimuli used. The neural substrates seem to mainly involve the STS and the amygdala, suggesting that the amygdala is involved in processes going beyond basic emotion recognition or arousal processing, as an integral part of an appraisal system that is sensitive to expression and gaze direction, among other features (Graham and LaBar, 2012).

Facial dynamics
The temporal dynamics of facial movements are a further contextual cue expressed in the face. Particularly the time course of

Frontiers in Psychology | Cognitive Science November 2012 | Volume 3 | Article 471 | 2



FIGURE 1 | Examples of the research paradigms established to investigate contextual influences on affective face processing along the nomenclature of within-face, within-sender, external, and within-perceiver features: (A) Eye gaze: direct versus averted eye gaze combined with angry, fearful, and neutral facial expressions. Reproduced with permission from Ewbank et al., 2010. (B) Facial dynamics: time course (start and end frame of video clips) of facial expressions (happy versus angry), which were used to investigate onset versus offset of facial expressions. Reproduced with permission from Mühlberger et al., 2011. (C) Compound facial expressions + body postures (congruent and incongruent; anger and fear is expressed in faces and body postures). Reproduced with permission from Meeren et al., 2005. (D) Concurrent affective prosody presented simultaneously with (happy) facial expression (after blurred baseline condition). Adapted with permission from Müller et al., 2011. (E) Visual affective background picture (negative, threat) presented together with facial expression (fear). Adapted with permission from Righart and de Gelder, 2008a. (F) Preceding verbal description (negative) used as situational context for neutral face. Adapted with permission from Schwarz et al., 2012. (G) Task-irrelevant context faces presented together with target face (center of the screen). Adapted with permission from Mumenthaler and Sander, 2012. (H) CS+ trial (with social UCS consisting of verbal insult) used in a social conditioning paradigm. Reproduced with permission from Davis et al., 2010. (I) Self-race (here White) versus other-race (here Black) faces as typically used in studies on implicit racial bias. Adapted with permission from Lucas et al., 2011.

facial movements when expressing an emotion as well as whether an expression starts developing (i.e., a face turns from neutral to angry) or ends (i.e., a face turns neutral from angry) may constitute important non-emotional context cues. It has been demonstrated several times that dynamically evolving facial expressions are much better recognized, rated as more arousing, elicit larger emotion-congruent facial muscle responses (Weyers et al., 2006; Sato and Yoshikawa, 2007a,b), and also elicit stronger amygdala activity compared to static faces (LaBar et al., 2003; Sato et al., 2004). Comparing dynamic to static faces, enhanced emotion-specific brain activation patterns have been found in the parahippocampal gyrus including the amygdala, fusiform gyrus, superior temporal gyrus, inferior frontal gyrus, and occipital and orbitofrontal cortex (Trautmann et al., 2009). Moreover, dynamic compared to static facial expressions are associated with enhanced emotion discrimination at early and late stages of electro-cortical face processing (Recio et al., 2011). Comparing onset and offset of facial expressions, it has been shown that perceived valence and threat of angry and happy facial expressions depend on their dynamics in terms of on- versus offset of the respective facial expression. While the




onset of happy facial expressions was rated as highly positive and not threatening, the onset of angry facial expressions was rated as highly negative and highly threatening. Furthermore, the offset of angry compared to the onset of angry facial expressions was associated with activity in reward-related brain areas, whereas the onset of angry as well as the offset of happy facial expressions were associated with activations in threat-related brain areas (Mühlberger et al., 2011).

WITHIN-SENDER FEATURES

Affective prosody
Besides concurrent visual information, acoustic information from the sender may also serve as context for the perception, recognition, and evaluation of facial expressions. Indeed, the identification of the emotion in a face is biased in the direction of simultaneously presented affective prosody (de Gelder and Vroomen, 2000). Furthermore, it was demonstrated that this effect occurred even under instructions to base the judgments exclusively on information in the face. Increased accuracy rates and shorter response latencies in emotion recognition tasks when facial expressions are combined with affective prosody have been found in several other studies (de Gelder et al., 1999; Dolan et al., 2001). In one of the first neuroimaging studies on face and affective prosody processing, Dolan et al. (2001) contrasted emotionally congruent to emotionally incongruent conditions in an audiovisual paradigm and found greater activation of the left amygdala and right fusiform gyrus in congruent as compared to incongruent conditions. Further fMRI studies on the audiovisual integration of non-verbal emotional information revealed that this perceptual gain during audiovisual compared to visual stimulation alone is accompanied by enhanced BOLD responses in pSTG, left middle temporal gyrus, and thalamus when comparing the bi-modal condition to either unimodal condition (Pourtois et al., 2005; Ethofer et al., 2006; Kreifelts et al., 2007). In a recent neuroimaging study it has been found that subjects rated fearful and neutral faces as being more fearful when accompanied by sounds of human screams as compared to neutral sounds (Müller et al., 2011). Moreover, the imaging data revealed that an incongruence of emotional valence between faces and sounds led to increased activation in areas involved in conflict monitoring such as the middle cingulate cortex and the right superior frontal cortex. Further analyses showed that, independent of emotional valence congruency, the left amygdala was consistently activated when the information from both modalities was emotional. If a neutral stimulus was present in one modality and an emotional one in the other, activation in the left amygdala was significantly attenuated compared to when an emotional stimulus was present in both modalities. This result points at additive effects in the amygdala when emotional information is present in both modalities. This congruency effect was also recently confirmed in a study where laughter played via headphones increased the perceived intensity of a concurrently shown happy facial expression (Sherman et al., 2012). Taken together, results from audiovisual paradigms point at massive influences of auditory context cues in terms of affective prosody on the processing of facial expressions. Again, this is reflected by congruency effects with better recognition rates and larger activations in face processing and emotion processing areas.

Body postures
The most obvious within-sender context for a face is the body it belongs to, which also underlies non-verbal communication via postures and other body language (e.g., gestures). Meeren et al. (2005) found that observers judging a facial expression are strongly influenced by the concomitant emotional body language. In this study, pictures of fearful and angry faces and bodies were used to create face-body compound images, with either congruent or incongruent emotional expressions. When face and body conveyed conflicting emotional information, the recognition of the facial expression was attenuated. Also, the electro-cortical P1 component was enhanced in response to congruent face-body compounds, which points to the existence of a rapid neural mechanism assessing the degree of agreement between simultaneously presented facial and bodily emotional expressions already after 100 ms. Van den Stock et al. (2007) used facial expressions morphed on a continuum between happy and fearful, and combined these with happy or fearful body expressions. Ratings of facial expressions were influenced by body context (congruency), with the highest impact for the most ambiguous facial expressions. In another study, it was shown that identical facial expressions convey strikingly different emotions depending on the affective body postures in which they were embedded (Aviezer et al., 2008). Moreover, it was shown that eye movements, i.e., characteristic fixation patterns previously thought to be determined solely by the facial expression, were systematically modulated by this emotional context. These effects were even observed when participants were told to avoid using the context or were led to believe that the context was irrelevant (Aviezer et al., 2011). In addition, these effects were not influenced by working memory load. Overall, these results suggest that facial expressions and their body contexts are integrated automatically, with modulating effects on perception in both directions.

EXTERNAL FEATURES

Emotion labels
Emotion research frequently uses language in the form of emotion labels, both to instruct participants and to assess recognition performance. As the following examples demonstrate, language can guide and even bias participants on how to read facial expressions in this kind of research. For example, when participants were asked to repeat an emotion word such as "anger" aloud either three times (temporarily increasing its accessibility) or 30 times (temporarily reducing its accessibility), reduced accessibility of the meaning of the word led to slower and less accurate emotion recognition, even when participants were not required to verbally label the target faces (Lindquist et al., 2006). In a similar vein, morphed faces depicting an equal blend of happiness and anger were found to be perceived as angrier when those faces were paired with the word "angry" (Halberstadt and Niedenthal, 2001). It has furthermore been demonstrated that verbalizing words disrupts the ability to make correct perceptual judgments about faces, presumably because it interferes with access to language necessary for judgment (Roberson and Davidoff, 2000). In line with the latter findings, at the neural level it was demonstrated that emotional labeling of negative emotional faces produced a decreased response in limbic brain areas such as the amygdala, which might reflect that




putting feeling into words has regulatory effects at least for negative emotions (Lieberman et al., 2007). Overall, these findings are in line with the language-as-context hypothesis (Barrett et al., 2007), which proposes that language actively influences emotional perception by dynamically guiding the perceiver's processing of structural information from the face.

Verbal descriptions of social situations
Verbal descriptions of social situations also provide strong contextual cues which influence facial expression perception. Carroll and Russell (1996) let participants read six short stories of situations which created a context; afterward, a facial expression was shown and had to be rated. Indeed, in each of the 22 cases that were examined, contextual information overrode the facial information (e.g., a person in a painful situation but displaying fear was judged as being in pain). These results are in line with an earlier study, where verbal descriptions of emotion-eliciting events were used as situational cues, which increased emotion recognition in faces (Trope, 1986). Using neuroimaging it was demonstrated that brain responses to ambiguous emotional faces (surprise) are modified by verbal descriptions of context conditions: stronger amygdala activation for surprised faces embedded in negative compared to positive contexts was found, thus demonstrating context-dependent neural processing of the very same emotional face (Kim et al., 2004). This effect has recently been extended to neutral faces, where context conditions of self-reference modulated the perception and evaluation of neutral faces in terms of larger mPFC and fusiform gyrus activity to self- versus other-related neutral faces and more positive and negative ratings of the neutral faces when put in positive and negative contexts (Schwarz et al., 2012). The latter findings demonstrate that contextual influences might be most powerful when the information about the emotion from facial features is absent or ambiguous at best.

Other faces
Probably the most frequent external context cue for a face is another face, as we often perceive persons surrounded by other persons. Not surprisingly, it was demonstrated that facial expressions have a strong influence on the perception of other facial expressions. Russell and Fehr (1987) observed that the read-out of an emotion from a facial expression clearly depends on previously encountered facial expressions: a first expression (the anchor) displaced the judgment of the subsequent facial expression; for instance, a neutral target face was categorized as sadder after a happy face was seen. In a series of three experiments it was shown that implicit contextual information consisting of other facial expressions modulates valence assessments of surprised faces, such that they were interpreted as congruent with the valence of the contextual expressions (Neta et al., 2011). Recently, it was also demonstrated that congruent but irrelevant faces in the periphery enhance recognition of target faces, whereas incongruent distracter faces reduce recognition of target faces (Mumenthaler and Sander, 2012). This contextual effect of concurrent faces was augmented when the peripheral face gazed at the target face, indicating social appraisal, where the appraisal of another person is integrated into the appraisal of the observer, thus facilitating emotion recognition. Social appraisal was furthermore demonstrated by facilitated recognition of fear in a centrally presented face when an angry peripheral face gazed at the central face. In addition to emotion recognition of the target, emotional expressions of context faces may also be used to inform the explicit social evaluations of the observer: male faces that were looked at by females with smiling faces were rated as more attractive by female participants than males looked at with neutral expressions. Revealing an interesting gender difference, in the same experiment male participants preferred the male faces that were being looked at by female faces with neutral expressions (Jones et al., 2007).

Visual scene
Faces usually are perceived together with other visual stimuli, for example the surrounding visual scene, which might be constituted by non-animated visual objects. Barrett and Kensinger (2010) found that visual context is routinely encoded when facial expressions are observed. Participants remembered the context more often when asked to label an emotion in a facial expression than when asked to judge the expression's simple affective significance (which can be done on the basis of the structural features of the face alone). The authors conclude that the structural features of the face, when viewed in isolation, might be insufficient for perceiving emotion. In a series of studies, Righart and de Gelder (2006, 2008a,b) examined how the surrounding visual context affects facial expression recognition and its neural processing. Using event-related brain potentials to faces (fearful/neutral) embedded in visual scene contexts (fearful/neutral) while participants performed an orientation-decision task (face upright or inverted), they found that the presence of a face in a fearful context was associated with enhanced N170 amplitudes, and this effect was strongest for fearful faces at left occipito-temporal sites (Righart and de Gelder, 2006). Interestingly, faces without any context showed the largest N170 amplitudes, indicating competition effects between scenes and faces. In a follow-up study, participants had to categorize facial expressions (disgust, fear, happiness) embedded in visual scenes with either congruent or incongruent emotional content (Righart and de Gelder, 2008b). A clear congruency effect was found such that categorization of facial expressions was speeded up by congruent scenes, and this advantage was not impaired by increasing task load. In another study, they investigated how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made (Righart and de Gelder, 2008a). Again, emotion effects were found with larger N170 amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes, expressed in left occipito-temporal scalp topography differences. Using videos as visual context, it was also demonstrated that both positive and negative contexts resulted in significantly different ratings of faces compared with those presented in neutral contexts (Mobbs et al., 2006). These effects were accompanied by alterations in several brain regions including the bilateral temporal pole, STS, insula, and ACC. Moreover, an interaction was observed in the right amygdala when subtle happy and fear faces were juxtaposed with positive and negative movies, respectively, which again points at additive effects of




context and facial expression. Together, these series of experiments clearly indicate that the information provided by the facial expression is automatically combined with the scene context during face processing. Mostly, congruency effects were observed such that congruent visual contexts helped identifying facial expressions and led to larger N170 amplitudes² and alterations in face processing areas in the brain.

WITHIN-PERCEIVER FEATURES

Social affective learning mechanisms and recognition memory
When we process a face, we compare it to previously encoded memory representations. Affective information stemming from previous encounters may thus guide our perception and evaluation. Affective or social conditioning studies have investigated this effect by pairing neutral faces with social cues (e.g., affective sounds, sentences) serving as unconditioned stimulus (UCS). For example, in one study, participants learned that faces predicted negative social outcomes (i.e., insults), positive social outcomes (i.e., compliments), or neutral social outcomes (Davis et al., 2010). Afterward, participants reported liking or disliking the faces in accordance with their learned social value. During acquisition, differential activation across the amygdaloid complex was observed. A region of the medial ventral amygdala and a region of the dorsal amygdala/substantia innominata showed signal increases to both negative and positive faces, whereas a lateral ventral region displayed a linear representation of the valence of faces such that activations in response to negatively associated faces were larger than those to positive ones, which in turn were larger compared to those elicited by faces associated with neutral sentences. In another social conditioning paradigm, Iidaka et al. (2010) found that a neutral face could be negatively conditioned by using a voice with negative emotional valence (a male voice loudly saying "Stupid!"). Successful conditioning was indicated by elevated skin conductance responses as well as greater amygdala activation, demonstrat-

or neutral gossip and were then presented alone in a binocular rivalry paradigm (faces were presented to one eye, houses to the other); only the faces previously paired with negative (but not positive or neutral) gossip dominated longer in visual consciousness (Anderson et al., 2011). These findings also demonstrate that social affective learning can influence vision in a completely top-down manner, independent of the basic structural features of a face. It is important to note that the contextual influences described here are based on previous encounters; the contextual information itself is not present at the time the face is seen again. All of these findings, however, were obtained with neutral faces only and have to be extended to emotional facial expressions as well.

In addition to the conditioning literature reviewed above, recognition memory studies employing old/new paradigms have also revealed massive context effects during encoding on recognition memory. It has been shown, for example, that the N170 amplitude during recognition is diminished when the face was presented in a contextual frame compared to no contextual frame during the encoding phase, but heightened when the contextual frame was of positive compared to negative valence (Galli et al., 2006). Similar effects have been observed when fearful faces were shown at encoding, but neutral faces of the same identity at retrieval (Righi et al., 2012), which has also been shown in a recent fMRI study, where several brain regions involved in familiar face recognition, including fusiform gyrus, posterior cingulate gyrus, and amygdala, plus additional areas involved in motivational control such as caudate and anterior cingulate cortex, were differentially modulated as a function of a previous encounter (Vrticka et al., 2009). Also, attractiveness during encoding was found to alter ERP responses during retrieval (Marzi and Viggiano, 2010). Additionally, faces associated earlier with positively or negatively valenced behaviors elicited stronger activity in brain areas associated with social cognition such as paracingulate cortex and STS (Todorov et al., 2007). However, a recent study investigat-
ing that the perceptually neutral face elicited different behavioral ing the effect of concurrent visual context during encoding of faces
and neural responses after affective learning. Moreover, Morel et al. showed that facial identity was less well recognized when the face
(2012) showed in a MEG study that even faces previously paired was seen with an emotional body expression or against an emo-
only once with negative or positive contextual information, are tional background scene, compared to a neutral body or a neutral
rapidly processed differently in the brain, already between 30 and background scene. Most likely, this is due to orienting responses
60 ms post-face onset. More precisely, the faces previously seen in triggered by the visual context, which may lead to a less elaborate
a positive (happy) emotional context evoked a dissociated neural processing and in turn resulting in a decreased facial recognition
response as compared to those previously seen in either a negative memory (Van den Stock and de Gelder, 2012). In general, context
(angry) or a neutral context. Source localization revealed two main cues present at the encoding phase have a great impact on how
brain regions involved in this very early effect: the bilateral ven- faces are remembered.
tral, occipito-temporal, extrastriate regions and the right anterior
medial temporal regions. A recent study showed in two experi- Implicit race bias
ments that neutral faces which were paired with negative, positive, An interesting example for an interaction of within-face and
within-perceiver contextual features is illustrated by research
2
investigating the impact of race bias and stereotypes on the per-
The N170 reflects the structural encoding of faces (Bentin et al., 1996). Results on
the affective modulation of the N170 are mixed so far: several studies exist which
ception of faces. Effects of implicit race bias on face perception
point at no modulation of the N170 components by facial expressions (e.g., Krolak- have been demonstrated both at the neural level and at the per-
Salmon et al., 2001; Eimer and Holmes, 2002; Schupp et al., 2004; Wieser et al., ceptual level. For example, participants with high implicit race bias
2012a), whereas others reported differences in the amplitude of the N170 between (measured using the implicit association test, IAT) showed higher
emotional and neutral faces (Batty and Taylor, 2003; Blau et al., 2007; Mhlberger increases in amygdala activation toward black faces compared to
et al., 2009; Wieser et al., 2010; Wronka and Walentowska, 2011). Most likely, dif-
ferences are due to methodological differences between paradigms. Recently, it has
participants with lower bias (Phelps et al., 2000; Cunningham
been suggested that modulations of the N170 are observed only when participants et al., 2004). Similarly, multi-voxel pattern decoding of the race
attend to the facial expressions (Wronka and Walentowska, 2011). of a perceived face based on BOLD signal in FFA has been shown

Frontiers in Psychology | Cognitive Science November 2012 | Volume 3 | Article 471 | 6


Wieser and Brosch Faces in context a comprehensive review

to be restricted to participants with high race bias (Brosch et al., in press), suggesting that race bias may decrease the similarity of high-level representations of black and white faces. With regard to the face-sensitive N170 component of the ERP, it has been found that pro-white bias was associated with larger N170 responses to black versus white faces, which may indicate that people with stronger in-group preferences see out-group faces as less normative and thus require greater engagement of early facial encoding processes (Ofan et al., 2011). At the behavioral level, the mental templates of out-group faces possess less trustworthy features in participants with high implicit bias (Dotsch et al., 2008). Race bias has furthermore been shown to interact with the perception of emotion in a face: in a change detection task, participants observed a black or white face that slowly morphed from one expression to another (either from anger to happiness or from happiness to anger) and had to indicate the offset of the first expression. Higher implicit bias was associated with a greater readiness to perceive anger in Black as compared with White faces, as indicated by a later perceived offset (or earlier onset) of the anger expressions (Hugenberg and Bodenhausen, 2004). Similarly, positive expressions were recognized faster in White as compared with Black faces (Hugenberg, 2005).

Personality traits
Obviously, the personality traits of the perceiver are important when discussing within-perceiver features. As this vast literature is worth a review in its own right, we only briefly summarize findings on some personality dimensions here (for extensive reviews, see Calder et al., 2011; Fox and Zougkou, 2011). Extraversion, for example, seems to be associated with prioritized processing of positive facial expressions at the neural (higher amygdala activation) and behavioral levels (Canli et al., 2002), whereas higher levels of trait anxiety and neuroticism are associated with stronger reactions to negative facial expressions, as suggested by converging evidence from different paradigms. For example, it has been shown that the left amygdala of participants reporting higher levels of state anxiety reacts more strongly to facial expressions of fear (Bishop et al., 2004b). Moreover, lower recruitment of brain networks involved in attentional control, including the dorsolateral and ventrolateral prefrontal cortex (dlPFC, vlPFC), was observed in participants reporting increased trait anxiety (Bishop et al., 2004a). Another study found strong associations between amygdala activity in response to backward-masked fearful faces and high levels of self-reported trait anxiety (Etkin et al., 2004). Alterations in facial expression processing are, not surprisingly, most evident in traits related to disorders with social dysfunctions, such as social phobia/anxiety (e.g., Wieser et al., 2010, 2011, 2012b; McTeague et al., 2011) and autism-spectrum disorders (for a review, see Harms et al., 2010). Whereas in social anxiety, biases and elevated neural responses to threatening faces (i.e., angry faces) are most commonly observed (Miskovic and Schmidt, 2012), persons with autism-spectrum disorders are often less able to recognize emotion in a face (e.g., Golan et al., 2006). Furthermore, people high on the autistic spectrum seem to process faces in a more feature-based manner when asked to identify facial expressions (Ashwin et al., 2006). In addition, emerging evidence suggests that the processing of facial expressions also depends on genetic factors such as variations in the serotonin receptor gene HTR3A (Iidaka et al., 2005), the neuropeptide B/W receptor-1 gene (Watanabe et al., 2012), and the serotonin transporter (5-HTT) promoter gene (Hariri et al., 2002). Of note, interactions of personality traits and contextual influences are also likely to occur: a recent study demonstrated that the way visual contextual information affected the perception of facial expressions depended on the observer's tendency to process positive or negative information, as assessed with the BIS/BAS questionnaire (Lee et al., 2012).

SUMMARY AND CONCLUSION
The empirical findings reviewed in this paper clearly demonstrate that the perception and evaluation of faces are influenced by the context in which these expressions occur. Basically, affective and social information gained from within-face features (eye gaze, expression dynamics), within-sender features (body posture, prosody, affective learning), environmental features (visual scene, other faces, verbal descriptions of social situations), or within-perceiver features (affective learning, cognitive biases, personality traits) seems to dramatically influence how we perceive a facial expression. Altogether, the studies reviewed above point to the notion that efficient emotion perception is not solely driven by the structural features present in a face. In addition, as the studies on affective learning and verbal descriptions of contexts show, contextual influences seem to be especially strong when the facial expression is either ambiguous (e.g., surprised faces) or when no facial expression is shown (i.e., neutral faces). The latter effects not only highlight the notion of contextual influences but also pinpoint the problems of using so-called neutral faces as a baseline condition in research on facial expression perception. Thus, when using neutral faces as a baseline, researchers have to be aware that these are probably influenced by the preceding faces (Russell and Fehr, 1987), which could be resolved, or at least attenuated, by using longer inter-trial intervals, for example. Moreover, as contextual impact may be particularly high for neutral faces, one has to try to minimize contextual information when mere comparisons of neutral and other expressions are the main interest. In line with this viewpoint, it has recently been demonstrated that emotion-resembling features of neutral faces can drive their evaluation, likely due to overgeneralizations of highly adaptive perceptual processes triggered by structural resemblance to emotional expressions (Adams et al., 2012b). This assumption is supported by a study that used an objective emotion classifier to demonstrate the relationship between a set of trait inferences and subtle resemblance to emotions in neutral faces (Said et al., 2009). Although people can categorize faces as emotionally neutral, they also vary considerably in their evaluations with regard to trait dimensions such as trustworthiness (Engell et al., 2007). A possible explanation is that neutral faces may contain structural properties that cause them to resemble faces with more accurate and ecologically relevant information such as emotional expressions (Montepare and Dobish, 2003). Another finding suggests that prototypical neutral faces may be evaluated as negative in some circumstances, which suggests that the inclusion of neutral faces as a baseline condition might introduce an experimental confound (Lee et al., 2008).

A large amount of the effects of context on facial expression processing seems to rely on congruency effects (i.e., facilitation

of emotion perception when context information is congruent). Whereas some of the effects observed here can be explained by affective priming mechanisms (for example, when previously given verbal context influences the perception of the face presented afterward), other effects (for example, when concurrently available information from the auditory channel influences the perception of facial expressions) point to a supramodal emotion perception that rapidly integrates cues from the facial expression and affective contextual cues. Particularly the influences on surprised (ambiguous) and neutral faces show that contextual effects may play an even more important role when the emotional information is difficult to derive from the facial features alone. The latter is also in line with recent models of impression formation about other people based on minimal information (Todorov, 2011).

Beyond the scope of this paper, cultural influences obviously also define a major context for the perception of facial expressions (Viggiano and Marzi, 2010; Jack et al., 2012). In brief, facial expressions are most reliably identified by persons who share the same cultural background with the senders (Ambady and Weisbuch, 2011), whereas, for instance, Eastern compared to Western observers use a culture-specific decoding strategy that is inadequate to reliably distinguish facial expressions of fear and disgust (Jack et al., 2009). Only recently has it also been shown that concurrent pain alters the processing particularly of happy facial expressions (Gerdes et al., 2012). The latter findings point to the notion that within-perceiver features also include alterations within the (central) nervous system of the perceiver.

Altogether, the studies reviewed above demonstrate that faces do not speak for themselves (Barrett et al., 2011), but are always, known or unbeknown to the perceiver, subject to contextual modulations lying within-face, within-sender, within-perceiver, or in the environment. Thus, the assumption of modular basic emotion recognition might only hold true for very rare cases, particularly when the elicited emotions and resulting facial expressions are intense and exaggerated, and there is no reason to modify and manage the expression. These assumptions are also in line with the notion that even simple visual object perception is highly dependent on context (Bar, 2004).

The importance of context is congruent with several other theoretical approaches, such as Scherer's (1992) assumption that facial expressions do not solely represent emotional, but also cognitive processes (for further discussion, see Brosch et al., 2010). In this view, facial expressions do not categorically represent a few basic emotions, but are the result of sequential and cumulative stimulus evaluations (which also take into account context variables as reviewed above). Thus, facial expressions are supposed to simultaneously and dynamically express cognition and emotion. The categorical approach is further challenged by social-ecological approaches, which assume social cognition and perception to proceed in an ecologically adaptive manner (McArthur and Baron, 1983). Here it is proposed that different modalities are always combined to inform social perception. Hence, identification of facial expressions is especially fast and accurate when other modalities impart the same information. Along these lines, facial expressions are also contingent on the broader context of the face (i.e., what we refer to as within-face features) and the unexpressive features of the face/sender (i.e., within-sender features). This context-dependency is assumed to reflect adaptation processes of humans to their ecological niche (Ambady and Weisbuch, 2011). With regard to current neurocognitive models of face processing, we suggest that the model of distributed neural systems for face perception (Haxby and Gobbini, 2011) offers an avenue for integrating the findings on different sources of context as reviewed above, as it allows for multiple entry points of context into ongoing face processing (see Figure 2).

At different levels of face processing, within-face, within-sender, within-perceiver, and external features have been shown to strongly modulate the neurocognitive mechanisms involved in face perception. The neural substrates of these modulations mainly concern brain areas within the extended network of the model (see Figure 2), but also some additional areas related to self-referential processing, impression formation, and affective learning. Within-perceiver context variables such as learning mechanisms and personality traits influence areas involved in person knowledge and emotion processes (medial PFC, temporo-parietal junction, limbic system), which are also modulated by within-sender features like affective prosody and body posture. External features most likely modulate face processing via emotional circuits (amygdala, insula, striatum), which, through feedforward and feedback loops, especially between the amygdala and primary visual cortex, OFA, FFA, and STS (Carmichael and Price, 1995; Morris et al., 1998; Iidaka et al., 2001; Amaral et al., 2003; Catani et al., 2003; Vuilleumier et al., 2004; Price, 2007), may foster the interaction between facial expressions and affective context and lead to a unified and conscious percept (Tamietto and de Gelder, 2010). Activations in the extended network in turn influence processing in the core face network via top-down modulatory feedback volleys (Haxby and Gobbini, 2011). It is important to note that the influences of context might be seen even in early stages of face processing taking place in the core system (OFA, FFA, STS). Particularly within-face features are key players here, as the STS is clearly involved in the processing of gaze and of facial dynamics.

Besides the involved structures of the brain (where?), it is also essential for our further understanding of contextual influences on face processing to identify at which temporal stage of the processing stream (when?) the integration of context and facial expression takes place. To this end, measures with high temporal resolution like EEG are much better suited than measures with high spatial but poor temporal resolution like fMRI (Brosch and Wieser, 2011). Indeed, several ERP studies point to the notion that the integration of context and facial expression is an automatic and mandatory process, which takes place very early in the processing stream, probably even before structural encoding is completed. Looking at combinations of facial expressions and body postures, the P1 component of the ERP was enhanced when faces were presented together with an incongruent body posture (Meeren et al., 2005), which suggests a rapid neural mechanism of integration, or at least an early sensitivity to non-matching information. This assumption is also supported by EEG studies on face-voice integration (de Gelder et al., 2006), where it has been found, for instance, that the auditory N1 component occurring about 110 ms after presentation of an affective voice is significantly enhanced by a congruent facial emotion (Pourtois et al., 2000). ERP studies investigating how

emotionally positive or negative information at encoding influences later recognition of faces in an old/new task (Galli et al., 2006), or how concurrent visual context affects facial expression processing (Righart and de Gelder, 2006), also show effects at early encoding stages in the form of a contextual modulation of the N170 amplitude. Early modifications in face-related visual cortices (STS, fusiform gyrus) may be due to re-entrant projections from the amygdala (e.g., Vuilleumier et al., 2004), to crosstalk between sensory areas (bi-modal stimulation), or directly to modulations of STS activity as a hub for multisensory integration (Barraclough et al., 2005).

FIGURE 2 | Modified version of the model of distributed neural systems for face processing (blue boxes). The red boxes indicate the contextual factors as outlined in the review and show distal cues (within-face, within-sender features), influences from the transmission phase (environmental features), and proximal perceptions (within-perceiver features), and their supposed target areas within the model. Modified after Haxby et al. (2000), Haxby and Gobbini (2011).

Altogether, the literature reviewed above fits distributed, as opposed to categorical, models of face processing (Haxby et al., 2000; de Gelder et al., 2003), and clearly points to the notion that the face itself might not tell us the full story about the underlying emotions. Moreover, as ERP studies reveal, the integration of context and facial expression may occur at early stages of stimulus processing and in an automatic fashion. This network is the neural substrate of a much more complex and inferential process of everyday face-reading than previously conceptualized by categorical accounts, which includes not only the perceptual processing of facial features but also social knowledge concerning context.

As this review shows, different types of context influence face perception. However, many questions remain unresolved and need to be addressed in future research. It is still unknown, for example, how these various dimensions of context and facial expressions are coded and how they are integrated into a single representation. Moreover, given the complexity of such a process, the available models need to be refined with regard to the interconnections between the neuroanatomical structures underlying face perception as shown in Figure 2. In particular, the exact time course of these processes is unknown so far, and more data are needed to decide whether this is an automatic and mandatory process (de Gelder and Van den Stock, 2011). Furthermore, elucidating the interactions of different types of contexts and their influence on facial expression perception is an important research question for social neuroscience. Especially the interactions of within-perceiver and within-sender features may be of interest here, as they may enhance our understanding of non-verbal social communication. Additionally, interactions of facial expressions and contexts have so

far mainly been investigated in one direction (namely, the effect of the context on the face) at the perception stage. However, it would also be interesting to investigate the impact of facial expressions on how the surrounding environment is affectively toned. Moreover, the reviewed results, and especially the finding of a highly flexible perception of neutral faces, suggest that standard paradigms for studying facial expression perception may need some modifications. As a minimal requirement, experimenters in facial expression research need to consider the intrinsically involved contextual influences. This is all the more important as experiments on language and other faces as contexts show that even within experimentally sound studies contextual influences cannot be precluded.

Taken together, the work reviewed here demonstrates that the impact of contextual cues on facial expression recognition is remarkable. At the neural level, this interconnection is implemented in widespread interactions between distributed core and extended face-processing regions. As outlined earlier, many questions about these interactions are still unresolved. All the more, the time has come for social affective neuroscience research to put faces back in context.

ACKNOWLEDGMENTS
This publication was funded by the German Research Foundation (DFG) and the University of Wuerzburg in the funding programme Open Access Publishing.
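A central claim of this conclusion, that context carries the most weight exactly when the face itself is ambiguous, can be illustrated with a toy cue-integration sketch in Python. This is not a model taken from any of the reviewed studies, and all category labels and evidence values are invented for illustration: face and context each provide graded evidence over the same emotion categories, and the combined percept is the normalized product of the two.

```python
# Toy cue-integration sketch (illustration only, not a model from the
# reviewed studies): face and context each provide evidence over the
# same emotion categories; the combined percept is the normalized
# product of the two evidence distributions.

def integrate(face_evidence, context_evidence):
    """Combine face and context evidence by multiplying the support
    for each emotion category and renormalizing to probabilities."""
    combined = {emotion: face_evidence[emotion] * context_evidence[emotion]
                for emotion in face_evidence}
    total = sum(combined.values())
    return {emotion: value / total for emotion, value in combined.items()}

# Invented evidence values over two candidate emotions.
clear_angry_face = {"anger": 0.9, "fear": 0.1}
ambiguous_face = {"anger": 0.5, "fear": 0.5}   # e.g., a surprised face
fearful_context = {"anger": 0.3, "fear": 0.7}  # e.g., a threatening scene

p_ambiguous = integrate(ambiguous_face, fearful_context)
p_clear = integrate(clear_angry_face, fearful_context)
# The same context shifts the ambiguous face toward "fear"
# (p_ambiguous["fear"] = 0.7) but leaves the unambiguous face
# categorized as "anger" (p_clear["anger"] is roughly 0.79).
```

With these invented numbers, the threatening context flips the ambiguous face to a fear percept while leaving the clearly angry face essentially unchanged, mirroring the congruency and ambiguity effects summarized in the text.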

REFERENCES
Adams, R. B., Ambady, N., Macrae, C. N., and Kleck, R. E. (2006). Emotional expressions forecast approach-avoidance behavior. Motiv. Emot. 30, 177–186.
Adams, R. B., and Franklin, R. G. (2009). Influence of emotional expression on the processing of gaze direction. Motiv. Emot. 33, 106–112.
Adams, R. B., Franklin, R. G., Kveraga, K., Ambady, N., Kleck, R. E., Whalen, P. J., et al. (2012a). Amygdala responses to averted vs direct gaze fear vary as a function of presentation speed. Soc. Cogn. Affect. Neurosci. 7, 568–577.
Adams, R. B., Nelson, A. J., Soto, J. A., Hess, U., and Kleck, R. E. (2012b). Emotion in the neutral face: a mechanism for impression formation? Cogn. Emot. 26, 431–441.
Adams, R. B., Gordon, H. L., Baird, A. A., Ambady, N., and Kleck, R. E. (2003). Effects of gaze on amygdala sensitivity to anger and fear faces. Science 300, 1536.
Adams, R. B., and Kleck, R. E. (2003). Perceived gaze direction and the processing of facial displays of emotion. Psychol. Sci. 14, 644–647.
Adams, R. B., and Kleck, R. E. (2005). Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion 5, 3–11.
Amaral, D. G., Behniea, H., and Kelly, J. L. (2003). Topographic organization of projections from the amygdala to the visual cortex in the macaque monkey. Neuroscience 118, 1099–1120.
Ambady, N., and Weisbuch, M. (2011). On perceiving facial expressions: the role of culture and context, in Oxford Handbook of Face Perception, eds A. J. Calder, G. Rhodes, M. H. Johnson, and J. V. Haxby (New York: Oxford University Press), 479–488.
Anderson, E., Siegel, E. H., Bliss-Moreau, E., and Barrett, L. F. (2011). The visual impact of gossip. Science 332, 1446–1448.
Ashwin, C., Wheelwright, S., and Baron-Cohen, S. (2006). Finding a face in the crowd: testing the anger superiority effect in Asperger syndrome. Brain Cogn. 61, 78–95.
Aviezer, H., Bentin, S., Dudarev, V., and Hassin, R. R. (2011). The automaticity of emotional face-context integration. Emotion 11, 1406–1414.
Aviezer, H., Hassin, R. R., Ryan, J., Grady, C. L., Susskind, J., Anderson, A., et al. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychol. Sci. 19, 724–732.
Bar, M. (2004). Visual objects in context. Nat. Rev. Neurosci. 5, 617–629.
Barraclough, N. E., Xiao, D. K., Baker, C. I., Oram, M. W., and Perrett, D. I. (2005). Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions. J. Cogn. Neurosci. 17, 377–391.
Barrett, L. F., and Kensinger, E. A. (2010). Context is routinely encoded during emotion perception. Psychol. Sci. 21, 595–599.
Barrett, L. F., Lindquist, K. A., and Gendron, M. (2007). Language as context for the perception of emotion. Trends Cogn. Sci. (Regul. Ed.) 11, 327–332.
Barrett, L. F., Mesquita, B., and Gendron, M. (2011). Context in emotion perception. Curr. Dir. Psychol. Sci. 20, 286–290.
Batty, M., and Taylor, M. J. (2003). Early processing of the six basic facial emotional expressions. Brain Res. Cogn. Brain Res. 17, 613–620.
Bentin, S., Allison, T., Puce, A., Perez, E., and McCarthy, G. (1996). Electrophysiological studies of face perception in humans. J. Cogn. Neurosci. 8, 551–565.
Benton, C. P. (2010). Rapid reactions to direct and averted facial expressions of fear and anger. Vis. Cogn. 18, 1298–1319.
Bindemann, M., Burton, A. M., and Langton, S. R. H. (2008). How do eye gaze and facial expression interact? Vis. Cogn. 16, 708–733.
Bishop, S. J., Duncan, J., Brett, M., and Lawrence, A. D. (2004a). Prefrontal cortical function and anxiety: controlling attention to threat-related stimuli. Nat. Neurosci. 7, 184–188.
Bishop, S. J., Duncan, J., and Lawrence, A. D. (2004b). State anxiety modulation of the amygdala response to unattended threat-related stimuli. J. Neurosci. 24, 10364–10368.
Blau, V., Maurer, U., Tottenham, N., and McCandliss, B. (2007). The face-specific N170 component is modulated by emotional facial expression. Behav. Brain Funct. 3, 7.
Boll, S., Gamer, M., Kalisch, R., and Büchel, C. (2011). Processing of facial expressions and their significance for the observer in subregions of the human amygdala. Neuroimage 56, 299–306.
Brosch, T., Bar-David, E., and Phelps, E. A. (in press). Implicit race bias decreases the similarity of neural representations of black and white faces. Psychol. Sci.
Brosch, T., Pourtois, G., and Sander, D. (2010). The perception and categorisation of emotional stimuli: a review. Cogn. Emot. 24, 377–400.
Brosch, T., and Wieser, M. J. (2011). The (non) automaticity of amygdala responses to threat: on the issue of fast signals and slow measures. J. Neurosci. 31, 14451.
Bruce, V., and Young, A. (1986). Understanding face recognition. Br. J. Psychol. 77, 305–327.
Brunswik, E. (1956). Perception and the Representative Design of Psychological Experiments, 2nd Edn. Berkeley: University of California Press.
Calder, A. J., Ewbank, M., and Passamonti, L. (2011). Personality influences the neural responses to viewing facial expressions of emotion. Philos. Trans. R. Soc. Lond. B Biol. Sci. 366, 1684–1701.
Canli, T., Sivers, H., Whitfield, S. L., Gotlib, I. H., and Gabrieli, J. D. (2002). Amygdala response to happy faces as a function of extraversion. Science 296, 2191.
Carmichael, S. T., and Price, J. L. (1995). Limbic connections of the orbital and medial prefrontal cortex in macaque monkeys. J. Comp. Neurol. 363, 615–641.
Carroll, J. M., and Russell, J. A. (1996). Do facial expressions signal specific emotions? Judging emotion from the face in context. J. Pers. Soc. Psychol. 70, 205–218.
Catani, M., Jones, D. K., Donato, R., and Ffytche, D. H. (2003). Occipito-temporal connections in the human brain. Brain 126, 2093–2107.
Cristinzio, C., N'Diaye, K., Seeck, M., Vuilleumier, P., and Sander, D. (2010). Integration of gaze direction and facial expression in patients with unilateral amygdala damage. Brain 133, 248–261.
Cunningham, W. A., Johnson, M. K., Raye, C. L., Gatenby, J. C., Gore, J. C., and Banaji, M. R. (2004). Separable neural components in the processing of black and white faces. Psychol. Sci. 15, 806–813.
Darwin, C. (1872). The Expression of the Emotions in Man and Animals, 3rd Edn. New York, NY: Oxford University Press.
Davis, F. C., Johnstone, T., Mazzulla, E. C., Oler, J. A., and Whalen, P. J. (2010). Regional response differences across the human amygdaloid complex during social conditioning. Cereb. Cortex 20, 612.
de Gelder, B., Böcker, K. B. E., Tuomainen, J., Hensen, M., and Vroomen, J. (1999). The combined perception of emotion from voice and face: early interaction revealed by human electric brain responses. Neurosci. Lett. 260, 133–136.
de Gelder, B., Meeren, H. K., Righart, R., Van den Stock, J., van de Riet, W. A., and Tamietto, M. (2006). Beyond the face: exploring rapid influences of context on face processing. Prog. Brain Res. 155, 37–48.


Citation: Wieser MJ and Brosch T (2012) Faces in context: a review and systematization of contextual influences on affective face processing. Front. Psychology 3:471. doi: 10.3389/fpsyg.2012.00471

This article was submitted to Frontiers in Cognitive Science, a specialty of Frontiers in Psychology.

Received: 30 August 2012; paper pending published: 26 September 2012; accepted: 15 October 2012; published online: 02 November 2012.

Conflict of Interest Statement: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Copyright © 2012 Wieser and Brosch. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.
