Emotion Review
http://emr.sagepub.com/

Effects of Dynamic Aspects of Facial Expressions: A Review

Eva G. Krumhuber, Arvid Kappas and Antony S. R. Manstead
Emotion Review 2013, 5: 41
DOI: 10.1177/1754073912451349

The online version of this article can be found at:
http://emr.sagepub.com/content/5/1/41

Published by SAGE Publications (http://www.sagepublications.com)
on behalf of the International Society for Research on Emotion

Version of Record: January 2, 2013

Downloaded from emr.sagepub.com by ancuta anca on October 25, 2014

Effects of Dynamic Aspects of Facial Expressions: A Review

Emotion Review
Vol. 5, No. 1 (January 2013), 41–46
© The Author(s) 2013
ISSN 1754-0739
DOI: 10.1177/1754073912451349
er.sagepub.com

Eva G. Krumhuber
Arvid Kappas

School of Humanities and Social Sciences, Jacobs University Bremen, Germany

Antony S. R. Manstead

School of Psychology, Cardiff University, UK

Abstract
A key feature of facial behavior is its dynamic quality. However, most previous research has been limited to the use of static images
of prototypical expressive patterns. This article explores the role of facial dynamics in the perception of emotions, reviewing
relevant empirical evidence demonstrating that dynamic information improves coherence in the identification of affect (particularly
for degraded and subtle stimuli), leads to higher emotion judgments (i.e., intensity and arousal), and helps to differentiate between
genuine and fake expressions. The findings underline that using static expressions not only poses problems of ecological validity,
but also limits our understanding of what facial activity does. Implications for future research on facial activity, particularly for
social neuroscience and affective computing, are discussed.

Keywords
dynamics, facial expression, motion, temporal

Facial behavior consists of dynamically changing configurations of morphological features as a function of unfolding patterns of underlying muscle activation. Since the publication of
Darwin's The Expression of the Emotions in Man and Animals
(1872) there has been scientific interest in facial configurations
that are usually referred to as emotional expressions. Much of
this research has focused on particular patterns at (or very near)
the peak intensity of facial movement. There are two types of
questions about these patterns that are particularly relevant for
emotion research: (a) Do such peak expressions correlate with
specific feeling states, physiological responses? (b) Are such
peak expressions seen by observers as typical for the presence
of a particular emotion?
The former question is relatively easy to answer. There is
only a loose coupling between different components of emotions:
facial behavior is not a reliable indicator of feeling states, or vice
versa (e.g., Kappas, 2003; Mauss & Robinson, 2009). However,
it is quite likely that the prediction of subjective experience can
be improved when taking into account indicators of regulation,
low-intensity behavior, and particularly dynamic measures.
Answering the second question is more complicated. It relates to
inferences that are drawn in interaction about the affective states
of others. These inferences need not be correlated with the self-report of an interaction partner. Here the question is rather
whether particular morphological features are associated with a
particular perception or interpretation (see also Russell,
Bachorowski, & Fernández-Dols, 2003). In this line of research,
faces are typically presented in a visually isolated fashion (i.e.,
without a visual context) and without information about who is
depicted and what the situation was at the time the picture was
taken. To ensure that impressions or judgments are based on a
particular facial pattern, researchers often employ actors to portray stereotypes of emotions, or specific predefined patterns of
facial actions. In this case it is obvious that the encoders typically

Corresponding author: Eva Krumhuber, Research IV, Campus Ring 1, Jacobs University Bremen, 28759 Bremen, Germany. Email: e.krumhuber@jacobs-university.de


do not feel the emotions they are asked to portray. Judges are
therefore asked to decode what the actor is supposed to be
expressing. Overall, this research has been very successful in
achieving its goals (see also Russell et al., 2003). Thus, it could
be shown that particular static patterns are associated with the
attribution of specific affective states, even across cultural
boundaries. Despite certain criticisms of the methodology used
in this research (e.g., number of response alternatives provided;
see Russell, 1994), there is little doubt that there are stereotypical
patterns of facial activation that are interpreted as representing
particular states, for example, happiness, anger, or fear (see
Ekman, Friesen, & Ellsworth, 1972).
In everyday language, as well as in the scientific discourse,
the expression "to recognize an emotion" has been used confusingly but nevertheless consistently to mean attributing the label
that the researcher intended, based on previous research and/or
theory, rather than what the encoder actually felt at the moment.
Of course, this also applies to research using synthetic or artificial stimuli, such as line drawings, which cannot logically refer
to an underlying affective state.
The present article focuses on the question of whether and
how the comparatively neglected dynamic aspect of facial
behavior influences the perception of facial patterns and the attribution
of emotion category labels, along with other aspects of emotion,
such as intensity or authenticity. Facial motion may convey
information not only about the presence of an emotional state,
but also its unfolding and ending, which can provide strong signals of actions and intent for researchers interested in a nonintrusive measure of emotion, or for interaction partners who are
using verbal and nonverbal streams of information in a communication process (Kappas & Descôteaux, 2003). Given that
the visual system evolved under dynamic conditions (see
Gibson, 1966), it seems reasonable to assume that we are highly
attuned to motion signals.
The primary objective of the present contribution is to
review existing evidence on the role played by facial dynamics
in the attribution of affective states. First, the effects of
dynamic information are examined with respect to the recognition of emotions as belonging to the intended emotion category. This is followed by an overview of the effects of
dynamics on emotion judgments more generally, and on
behavioral responses and intentions. Finally, some conclusions concerning the role of dynamic aspects are drawn, finishing with a discussion of the long-term theoretical benefits
of studying dynamic expressions.

Effects on Emotion Recognition Accuracy


Research using point-light or biological motion displays, line
drawings, schematic and computer-animated faces suggests
that movement enhances the accuracy of attribution of facial
affect (Bassili, 1978, 1979; Bruce & Valentine, 1988; Wallraven,
Breidt, Cunningham, & Bülthoff, 2008; Wehrle, Kaiser,
Schmidt, & Scherer, 2000). These benefits of dynamic information are most evident when static information is limited
(e.g., through degradation in geometry, shape, or texture). The
beneficial effects of movement are somewhat weaker or redundant in natural or unmodified faces when complete spatial and
textural information is available (e.g., Fiorentini & Viviani,
2011; Kamachi et al., 2001, Experiment 2). For example, comparing the identification of basic emotions from two stimulus
types, Kätsyri and Sams (2008) and Ehrlich, Schiano, and
Sheridan (2000) found a recognition advantage for motion in
synthetic/schematic faces. However, no difference between
static and dynamic displays was observed for natural faces.
Similarly, in a study by Cunningham and Wallraven (2009a)
using stimuli of varying resolution (i.e., animated full-surface
faces, wireframe faces, and point-light faces), dynamic information generally led to higher recognition performance than
static displays, but this difference tended to be larger for point-light faces with low spatial resolution. Motion therefore confers particular benefits when static information is inefficient or
unavailable, thereby mitigating the negative consequences of
degradation.
In addition to compensating for compromised viewing conditions, dynamic information also benefits
people with neurological or developmental disorders (e.g., brain damage or autism). In neuropsychological studies, dynamic presentation significantly facilitated
emotion identification in adults and children who were unable
to identify or impaired in identifying intended emotional expressions from static displays (Back, Ropar, & Mitchell, 2007;
Harwood, Hall, & Shinkfield, 1999), supporting the assumption
that different neural pathways underpin responses to moving and
static stimuli (Adolphs, Tranel, & Damasio, 2003; Humphreys,
Donnelly, & Riddoch, 1993).
One possible explanation for the benefit conferred by
dynamic displays is that a moving sequence consists of multiple static images. Thus the effect of dynamic displays could
be attributed to an increase in the amount of static information. However, this is not the case. Using normal human
faces, Ambadar, Schooler, and Cohn (2005) showed that identification of subtle expressions was significantly better for
moving sequences compared to multistatic images that contained the same number of frames, but with a mask interspersed between each frame in order to disrupt the apparent
motion. Thus a dynamic sequence seems to provide a functionally distinct type of information that is not attributable to
additional static cues.
However, the intensity of the facial expression moderates the
extent to which moving displays result in emotion recognition
benefits. Using both subtle and intense expressions, Bould and
Morris (2008) demonstrated that the motion advantage for
dynamic as opposed to multistatic displays was reduced for
expressions of higher intensity (see also Kamachi et al., 2001,
Experiment 2; Wehrle et al., 2000). When expressions are
intense, it therefore seems that static faces are already strong
carriers of emotional signals because they correspond to shared stereotypes, leaving little scope for improvement through the provision of dynamic information.


In the case of lower intensity expressions it is worth considering how the provision of dynamic information helps perceivers to identify the emotion in question. Clearly, dynamics
should enable perceivers to observe how expressions change
over time. However, the role of motion extends beyond the
mere detection of what has changed in the face. As demonstrated by Bould, Morris, and Wink (2008, Experiment 1),
greater recognition benefits are afforded by dynamic moving
sequences than by showing only the first (neutral) and final
(peak) frame of an expression (but see Ambadar et al., 2005,
Experiment 2, for contrasting results). The critical advantage
seems to lie in the perception of the direction in which facial
expressions change. This is supported by evidence showing
that people are sensitive (even haptically; see Lederman et al.,
2007) to temporal development and can accurately reproduce
the temporal progression of a target person's expression from
a scrambled set of photographs (Edwards, 1998). Such adherence to temporal characteristics was found to be most apparent
in the early stages of the expression (see also Leonard, Voeller,
& Kuldau, 1991). By distorting the temporal direction,
Cunningham and Wallraven (2009b, Experiments 3 & 4) demonstrated that the recognition of dynamic expressions significantly decreased when the order of frames was scrambled or
reversed (played backwards). Thus, the dynamic advantage
does not seem to be solely due to the presence of motion signals, but also arises from diagnostic information embedded in
the temporal sequence of the expression.
Moreover, the quality of this embedded information plays an
important role in the visual processing of facial expressions.
Recent evidence suggests that linear motion animation, in which
facial changes occur in a linear manner (as in morphing), results
in slower and less accurate emotion recognition, as well as
lower judgments of intensity, sincerity, naturalness, and typicality, by comparison with nonlinear (i.e., naturally deforming)
facial motions of the same expressions (Cosker, Krumhuber, &
Hilton, 2010; Wallraven et al., 2008, Experiment 1). Other studies have revealed that the speed with which the face moves significantly affects emotion identification. When speeding up or
slowing down the velocity of dynamic change, observers' performance and naturalness ratings varied in accordance with the type
of emotion displayed (Bould et al., 2008, Experiment 2; Hill,
Troje, & Johnston, 2005; Kamachi et al., 2001, Experiment 1;
Sato & Yoshikawa, 2004). These findings therefore suggest that
characteristics such as the direction, quality, and speed of
motion are distinctive features of dynamic information that
influence perceivers' identification and discrimination of
emotional expressions.
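The contrast drawn here between linear, morph-like motion and nonlinear, naturally deforming motion can be made concrete with a small sketch. This is an illustrative toy example, not the stimulus-generation method used in the studies reviewed; the function names and frame counts are hypothetical, and a smoothstep curve merely stands in for a "naturally deforming" trajectory:

```python
# Illustrative sketch (not from the reviewed studies): a linear morph
# interpolates expression intensity at a constant rate, whereas natural
# facial motion accelerates and decelerates. All names are hypothetical.

def linear_trajectory(t: float) -> float:
    """Intensity at normalized time t in [0, 1] for a constant-rate morph."""
    return t

def nonlinear_trajectory(t: float) -> float:
    """Smoothstep ease-in/ease-out: slow onset, faster middle, slow settle."""
    return t * t * (3.0 - 2.0 * t)

def sample(trajectory, n_frames: int) -> list:
    """Sample an n_frames intensity trace from neutral (0.0) to apex (1.0)."""
    return [trajectory(i / (n_frames - 1)) for i in range(n_frames)]

linear = sample(linear_trajectory, 5)      # constant frame-to-frame change
natural = sample(nonlinear_trajectory, 5)  # frame-to-frame velocity varies
```

Both traces share the same endpoints and frame count; they differ only in frame-to-frame velocity, which is the kind of temporal property the reviewed findings suggest observers are sensitive to.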

Effects on Emotion Judgments and Behavioral Responses

Apart from their beneficial role in emotion recognition, facial
dynamics have been shown to contribute to various aspects of
emotion judgments. For example, there is consistent evidence
that dynamic expressions are perceived as more intense and
realistic than static expressions (Biele & Grabowska, 2006;
Cunningham & Wallraven, 2009a; Weyers, Mühlberger, Hefele,
& Pauli, 2006). This perception of greater emotional intensity
might reflect the fact that dynamic change implies a forward
shift in the direction of the observed motion (commonly known
as representational momentum). In a study by Yoshikawa and
Sato (2008), participants perceived the final image of a dynamic
sequence as being more intense than it objectively was.
Moreover, as the velocity of facial movement increased, the
perceptual image of the facial expression intensified. Facial
dynamics may therefore lead to stronger emotional perceptions
by inducing larger forward displacements in the apparent
motion. Similar effects of dynamic expressions have been
demonstrated through ratings of experienced and recognized
emotional arousal (Sato, Fujimura, & Suzuki, 2008; Sato &
Yoshikawa, 2007a).
In addition to enhanced judgments of intensity and arousal,
observers are sensitive to dynamic information when judging
the authenticity of an expression. For example, shorter durations (i.e., onset, offset) and more irregular onset actions have
been found to be associated with judgments of politeness (rather
than amusement), and lower genuineness and spontaneity in the
case of smile expressions (Ambadar, Cohn, & Reed, 2009; Hess
& Kleck, 1994; Krumhuber & Kappas, 2005). There is also supportive evidence for the impact of temporal dynamics on person
ratings (Krumhuber, Manstead, & Kappas, 2007) and behavioral intentions and decisions of the observer. Specifically, children showed increased verbal responsiveness, and adults made
more favorable employment decisions and more cooperative
choices in response to smiles that had longer (compared to
shorter) onset and offset durations (Bugental, 1986; Krumhuber,
Manstead, Cosker, Marshall, & Rosin, 2009; Krumhuber,
Manstead, Cosker et al., 2007).
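The onset and offset durations at issue in these studies can be made concrete with a short sketch. This is a hypothetical illustration, not the procedure used in the cited work (which manipulated durations experimentally): given a frame-wise smile-intensity trace, onset duration runs from movement start to the apex and offset duration from the end of the apex back to neutral. The function, thresholds, and frame rate are all assumptions for illustration:

```python
# Illustrative sketch of extracting onset and offset durations from a
# frame-wise smile-intensity trace (e.g., per-frame action-unit intensity).
# The threshold and frame rate are hypothetical choices, not taken from
# the reviewed studies.

def phase_durations(trace, fps=25, apex_fraction=0.9):
    """Return (onset, offset) durations in seconds.

    Onset: first frame above baseline until apex level is first reached.
    Offset: last frame at apex level until the trace returns to baseline.
    """
    apex = max(trace)
    threshold = apex * apex_fraction
    start = next(i for i, v in enumerate(trace) if v > 0)       # movement begins
    apex_on = next(i for i, v in enumerate(trace) if v >= threshold)
    apex_off = max(i for i, v in enumerate(trace) if v >= threshold)
    end = max(i for i, v in enumerate(trace) if v > 0)          # back to neutral
    return (apex_on - start) / fps, (end - apex_off) / fps

# A short smile: two onset frames, an apex plateau, three offset frames.
trace = [0.0, 0.2, 0.6, 1.0, 1.0, 1.0, 0.7, 0.4, 0.1, 0.0]
onset_s, offset_s = phase_durations(trace, fps=25)
```

On such a trace, a longer onset corresponds to a slower, more gradual rise toward the apex, which is the temporal quality associated above with more favorable judgments of smiles.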
Several studies have reported stronger and more frequent
emotion-specific reactions to dynamic as opposed to static
expressions (Sato et al., 2008; Sato & Yoshikawa, 2007b;
Weyers etal., 2006). These imitative responses, interpretable as
facial mimicry, occurred spontaneously and rapidly (Vinter,
1986) and were found to play a significant role in detecting the
dynamic course of emotional facial expressions. For example, in
a study by Niedenthal, Brauer, Halberstadt, and Innes-Ker
(2001), participants who were prevented from mimicking took
significantly longer to detect the point at which an emotional
expression changed to a categorically different emotion (e.g.,
happiness changing into sadness, or vice versa), by comparison
with when they were allowed to mimic. Other work has shown
that spontaneous and deliberate smiles could be distinguished
from each other on the basis of dynamic displays, but not static
ones (Krumhuber & Manstead, 2009), and when participants
could freely mimic the expressions (Maringer, Krumhuber,
Fischer, & Niedenthal, 2011); when facial mimicry was blocked,
perceivers' ratings of the genuineness of smiles did not distinguish between those that were more or less authentic in their
dynamic qualities. Facial mimicry may therefore help perceivers to detect the trajectory of dynamic displays and thereby
facilitate the perception of the emotion in question. Together,
these findings suggest that temporal dynamics convey unique
information that is not only used for judging emotional expressions, but also drives behavior-specific responses and intentions
in the perceiver.

Summary and Outlook


A key feature of facial behavior is its dynamic nature.
Historically, facial expressions have been studied as slices of
behavior, frozen in time, presented out of context, portrayed by
encoders who did not actually experience the emotions they were
depicting. We have reviewed empirical evidence concerning the
role played by dynamic features in the perception of facial emotional behavior. The beneficial effect of dynamic information on
recognition accuracy, in the sense that decoders identified specific emotional states with greater coherence, was shown to be
particularly apparent for degraded and subtle expressive patterns, and occurred over and above the additional static information contained in moving displays. The direction, quality,
and speed of motion emerged as important components of this
dynamic information, with significant advantages afforded by
expressions which retained their original temporal sequences.
Dynamic displays were also shown to enhance emotional judgments (i.e., intensity and arousal), as well as influencing inferences about emotion authenticity, such as whether an expression
appears to be genuine or fake. Finally, these responses were
found to be facilitated by spontaneous facial mimicry by those
perceiving dynamic expressions.
Together, the findings provide strong support for the influential value of facial movements in emotional expressions. As
emotions unfold over time, dynamic displays of the temporal
sequence are not only of higher ecological validity, but also
evoke differential neural activation. In several neuroimaging
studies higher brain activity occurred in regions associated with
the processing of social- (superior temporal sulci) and emotion-relevant information (amygdalae) when viewing dynamic rather
than static expressive faces. Moreover, enhanced activation has
been observed in areas linked to perceiving motion (middle temporal gyri) and form-related aspects of faces (fusiform gyri), as
well as cognitive processes in general (inferior frontal gyri;
Kessler et al., 2011; Kilts, Egan, Gideon, Ely, & Hoffman, 2003;
LaBar, Crupain, Voyvodic, & McCarthy, 2003; Sato, Kochiyama,
Yoshikawa, Naito, & Matsumura, 2004; Schultz & Pilz, 2009;
for a review see Arsalidou, Morris, & Taylor, 2011). This neuroscientific evidence points to a neural network that facilitates
social interaction by helping us to understand others, identify
their needs, and predict their actions. It is worth noting that the
function of such a system may go well beyond a simple mirroring
system that has empathy as its key concept, blending
mind reading, simulation, and empathy with projected behaviors that
are also based on knowledge of others and of the situational and
social context. Here, much research is needed that addresses
the complex reality of nonverbal behavior in context, rather than
the identification of stereotypical static patterns of facial actions.
To advance scientific knowledge of the communicative and
relational processes engaged by facial expressions, the systematic

study of the role of dynamic information should be a focus in
future research. This requires sophisticated approaches to the
measurement and analysis of temporal aspects of facial displays,
together with continuous measurements of self-report (e.g., Affect
Rating Dial by Ruef & Levenson, 2007; Dynamic Decoding
Device by Tcherkassof, Bollon, Dubois, Pansu, & Adam, 2007)
that allow subjective ratings to be made over time, as well as
physiological responses (see also Mauss & Robinson, 2009). As
facial expressions commonly appear alongside other verbal and
nonverbal cues (e.g., gaze, head orientation, gestures, speech),
emotion perception needs to be considered as a process in which
several dynamic acts are temporally integrated to produce meaning. It is these dynamic patterns of facial expressions that demand
future attention and multilevel analysis (see Krumhuber &
Scherer, 2011; With & Kaiser, 2011). The broader issue is what
nonverbal behavior does in interaction (Kappas & Descôteaux,
2003). Arguably, facial behavior can be understood as serving a
variety of functions that include intra- and interpersonal emotional regulation (Butler, 2011; Kappas, 2011). From this perspective, the cohesion of subjective feeling state and expression
may often be low because facial behavior is not only a running commentary on what we feel, but also relates to conscious and unconscious attempts to influence what we feel, what others think we
want, what we want others to feel, et cetera. In colloquial terms:
facial behaviors do things to us and to others. We will fail to
arrive at a proper understanding of what faces do if we continue
to use static snapshots of faces as a paradigm for researching
facial expressions.
With the emergence of technology in the field of affective
computing, dynamic properties promise to be key factors in the
automatic extraction and resynthesis of realistic human behavior. The first advances have already been made by incorporating
dynamic data into the measurement and modeling of facial
actions (e.g., Cosker, Krumhuber, & Hilton, 2011; Pantic &
Patras, 2006; see also Calvo & D'Mello, 2010; Kappas, 2010).
It falls to future research to make use of these techniques and to
treat dynamic information as an integral part of facial behavior.
Once the tradition of employing highly intense and prototypical
static expressions has been overcome, we will be able to grasp
the true nature and function of facial actions as they take place
in everyday interactions.

References
Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural
systems for recognizing emotions. Brain and Cognition, 52, 61–69.
Ambadar, Z., Cohn, J. F., & Reed, L. I. (2009). All smiles are not created
equal: Morphology and timing of smiles perceived as amused, polite,
and embarrassed/nervous. Journal of Nonverbal Behavior, 33, 17–34.
Ambadar, Z., Schooler, J., & Cohn, J. (2005). Deciphering the enigmatic
face: The importance of facial dynamics in interpreting subtle facial
expressions. Psychological Science, 16, 403–410.
Arsalidou, M., Morris, D., & Taylor, M. J. (2011). Converging evidence
for the advantage of dynamic facial expressions. Brain Topography,
24, 149–163.
Back, E., Ropar, D., & Mitchell, P. (2007). Do the eyes have it? Inferring
mental states from animated faces in autism. Child Development, 78,
397–411.

Bassili, J. N. (1978). Facial motion in the perception of faces and of
emotional expression. Journal of Experimental Psychology: Human
Perception and Performance, 4, 373–379.
Bassili, J. N. (1979). Emotion recognition: The role of facial movement and
the relative importance of upper and lower areas of the face. Journal of
Personality and Social Psychology, 37, 2049–2058.
Biele, C., & Grabowska, A. (2006). Sex differences in perception of emotion intensity in dynamic and static facial expressions. Experimental
Brain Research, 26, 1–6.
Bould, E., & Morris, N. (2008). Role of motion signals in recognizing subtle facial expressions of emotion. British Journal of Psychology, 99,
167–189.
Bould, E., Morris, N., & Wink, B. (2008). Recognising subtle emotional
expressions: The role of facial movements. Cognition & Emotion, 22,
1569–1587.
Bruce, V., & Valentine, T. (1988). When a nod's as good as a wink: The
role of dynamic information in facial recognition. In M. M. Gruneberg,
P. E. Morris, & R. N. Sykes (Eds.), Practical aspects of memory: Current research and issues (Vol. 1, pp. 169–174). New York, NY: John
Wiley & Sons.
Bugental, D. B. (1986). Unmasking the polite smile: Situational and
personal determinants of managed affect in adult–child interaction.
Personality and Social Psychology Bulletin, 12, 7–16.
Butler, E. A. (2011). Temporal interpersonal emotion systems: The TIES
that form relationships. Personality and Social Psychology Review, 15,
367–393.
Calvo, R. A., & D'Mello, S. K. (2010). Affect detection: An interdisciplinary
review of models, methods, and their applications. IEEE Transactions
on Affective Computing, 1, 18–37.
Cosker, D., Krumhuber, E., & Hilton, A. (2010). Perception of linear and
nonlinear motion properties using a FACS validated 3D facial model.
In D. Gutierrez, J. Kearney, M. S. Banks, & K. Mania (Eds.), Proceedings
of the Symposium on Applied Perception in Graphics and Visualization
(APGV) (pp. 101–108). New York, NY: ACM.
Cosker, D., Krumhuber, E., & Hilton, A. (2011). A FACS valid 3D dynamic
action unit database with applications to 3D dynamic morphable facial
modeling. In D. Metaxas, L. Quan, A. Sanfeliu, & L. van Gool (Eds.),
Proceedings of the 13th International Conference on Computer Vision
(ICCV) (pp. 2296–2303). Retrieved from http://www.computer.org/
portal/web/csdl/doi/10.1109/ICCV.2011.6126510
Cunningham, D. W., & Wallraven, C. (2009a). The interaction between
motion and form in expression recognition. In B. Bodenheimer &
C. O'Sullivan (Eds.), Proceedings of the 6th Symposium on Applied
Perception in Graphics and Visualization (APGV2009) (pp. 41–44).
New York, NY: ACM.
Cunningham, D. W., & Wallraven, C. (2009b). Dynamic information for the
recognition of conversational expressions. Journal of Vision, 9, 1–17.
Darwin, C. (1872). The expression of the emotions in man and animals.
London, UK: John Murray.
Edwards, K. (1998). The face of time: Temporal cues in facial expressions
of emotion. Psychological Science, 9, 270–276.
Ehrlich, S. M., Schiano, D. J., & Sheridan, K. (2000). Communicating facial
affect: It's not the realism, it's the motion. In G. Szwillus & T. Turner
(Eds.), Proceedings of the ACM CHI 2000 Conference on Human Factors
in Computing Systems (pp. 252–253). New York, NY: ACM.
Ekman, P., Friesen, W. V., & Ellsworth, P. (1972). Emotion in the human
face: Guidelines for research and an integration of findings. New
York, NY: Pergamon Press.
Fiorentini, C., & Viviani, P. (2011). Is there a dynamic advantage for facial
expressions? Journal of Vision, 11, 1–15.
Gibson, J. J. (1966). The senses considered as perceptual systems. Boston,
MA: Houghton Mifflin.
Harwood, N. K., Hall, L. J., & Shinkfield, A. J. (1999). Recognition of facial
emotional expressions from moving and static displays by individuals
with mental retardation. American Journal of Mental Retardation, 104,
270–278.
Hess, U., & Kleck, R. E. (1994). The cues decoders use in attempting to
differentiate emotion-elicited and posed facial expressions. European
Journal of Social Psychology, 24, 367–381.
Hill, H. C. H., Troje, N. F., & Johnston, A. (2005). Range- and domain-specific exaggeration of facial speech. Journal of Vision, 5, 793–807.
Humphreys, G. W., Donnelly, N., & Riddoch, M. J. (1993). Expression is
computed separately from facial identity, and it is computed separately
for moving and static faces: Neuropsychological evidence. Neuropsychologia, 31, 173–181.
Kamachi, M., Bruce, V., Mukaida, S., Gyoba, J., Yoshikawa, S., &
Akamatsu, S. (2001). Dynamic properties influence the perception of
facial expressions. Perception, 30, 875–887.
Kappas, A. (2003). What facial activity can and cannot tell us about emotions. In M. Katsikitis (Ed.), The human face: Measurement and meaning (pp. 215–234). Dordrecht, The Netherlands: Kluwer Academic
Publishers.
Kappas, A. (2010). Smile when you read this, whether you like it or not:
Conceptual challenges to affect detection. IEEE Transactions on Affective
Computing, 1, 38–41.
Kappas, A. (2011). Emotion and regulation are one! Emotion Review, 3,
17–25.
Kappas, A., & Descôteaux, J. (2003). Of butterflies and roaring thunder:
Nonverbal communication in interaction and regulation of emotion. In
P. Philippot, E. J. Coats, & R. S. Feldman (Eds.), Nonverbal behavior in
clinical settings (pp. 45–74). New York, NY: Oxford University Press.
Kätsyri, J., & Sams, M. (2008). The effect of dynamics on identifying basic
emotions from synthetic and natural faces. International Journal of
Human–Computer Studies, 66, 233–242.
Kessler, H., Doyen-Waldecker, C., Hofer, C., Hoffmann, H., Traue, H. C.,
& Abler, B. (2011). Neural correlates of the perception of dynamic versus static facial expressions of emotion. Psychosocial Medicine, 8, 1–8.
Kilts, C. D., Egan, G., Gideon, D. A., Ely, T. D., & Hoffman, J. M. (2003).
Dissociable neural pathways are involved in the recognition of emotion
in static and dynamic facial expressions. Neuroimage, 18, 158–168.
Krumhuber, E., & Kappas, A. (2005). Moving smiles: The role of dynamic
components for the perception of the genuineness of smiles. Journal of
Nonverbal Behavior, 29, 3–24.
Krumhuber, E., & Manstead, A. S. R. (2009). Can Duchenne smiles be
feigned? New evidence on felt and false smiles. Emotion, 9, 807–820.
Krumhuber, E., Manstead, A. S. R., Cosker, D., Marshall, D., & Rosin, P. L.
(2009). Effects of dynamic attributes of smiles in human and synthetic
faces: A simulated job interview setting. Journal of Nonverbal Behavior,
33, 1–15.
Krumhuber, E., Manstead, A. S. R., Cosker, D., Marshall, D., Rosin, P. L.,
& Kappas, A. (2007). Facial dynamics as indicators of trustworthiness
and cooperative behavior. Emotion, 7, 730–735.
Krumhuber, E., Manstead, A. S. R., & Kappas, A. (2007). Temporal aspects
of facial displays in person and expression perception: The effects of
smile dynamics, head-tilt and gender. Journal of Nonverbal Behavior,
31, 39–56.
Krumhuber, E., & Scherer, K. R. (2011). Affect bursts: Dynamic patterns of
facial expression. Emotion, 11, 825–841.
LaBar, K. S., Crupain, M. J., Voyvodic, J. T., & McCarthy, G. (2003).
Dynamic perception of facial affect and identity in the human brain.
Cerebral Cortex, 13, 1023–1033.
Lederman, S. J., Klatzky, R. L., Abramowicz, A., Salsman, K., Kitada,
R., & Hamilton, C. (2007). Haptic recognition of static and dynamic
expressions of emotion in the live face. Psychological Science, 18,
158–164.
Leonard, C. M., Voeller, K. K., & Kuldau, J. M. (1991). When's a smile a
smile? Or how to detect a message by digitizing the signal. Psychological Science, 2, 166–172.


Maringer, M., Krumhuber, E., Fischer, A. H., & Niedenthal, P. M. (2011).
Beyond smile dynamics: Mimicry and beliefs in judgments of smiles.
Emotion, 11, 181–187.
Mauss, I. B., & Robinson, M. D. (2009). Measures of emotion: A review.
Cognition & Emotion, 23, 209–237.
Niedenthal, P. M., Brauer, M., Halberstadt, J. B., & Innes-Ker, Å. H.
(2001). When did her smile drop? Facial mimicry and the influence
of emotional state on the detection of change in emotional expression.
Cognition & Emotion, 15, 853–864.
Pantic, M., & Patras, I. (2006). Dynamics of facial expression: Recognition
of facial actions and their temporal segments from face profile image
sequences. IEEE Transactions on Systems, Man, and Cybernetics,
Part B: Cybernetics, 36, 433–449.
Ruef, A. M., & Levenson, R. W. (2007). Continuous measurement of emotion. In J. A. Coan & J. B. Allen (Eds.), Handbook of emotion elicitation
and assessment (pp. 286–297). New York, NY: Oxford University Press.
Russell, J. A. (1994). Is there universal recognition of emotion from facial
expression? A review of the cross-cultural studies. Psychological Bulletin,
115, 102–141.
Russell, J. A., Bachorowski, J. A., & Fernández-Dols, J. M. (2003). Facial and
vocal expression of emotion. Annual Review of Psychology, 54, 329–349.
Sato, W., Fujimura, T., & Suzuki, N. (2008). Enhanced facial EMG activity in response to dynamic facial expressions. International Journal of
Psychophysiology, 70, 70–74.
Sato, W., Kochiyama, T., Yoshikawa, S., Naito, E., & Matsumura, M.
(2004). Enhanced neural activity in response to dynamic facial expressions of emotion: An fMRI study. Cognitive Brain Research, 20, 81–91.
Sato, W., & Yoshikawa, S. (2004). The dynamic aspects of emotional facial
expressions. Cognition & Emotion, 18, 701–710.
Sato, W., & Yoshikawa, S. (2007a). Enhanced experience of emotional
arousal in response to dynamic facial expressions. Journal of Nonverbal Behavior, 31, 119–135.
Sato, W., & Yoshikawa, S. (2007b). Spontaneous facial mimicry in response
to dynamic facial expressions. Cognition, 104, 1–18.
Schultz, J., & Pilz, K. S. (2009). Natural facial motion enhances cortical
responses to faces. Experimental Brain Research, 194, 465–475.
Tcherkassof, A., Bollon, T., Dubois, M., Pansu, P., & Adam, J. M. (2007).
Facial expressions of emotions: A methodological contribution to the
study of spontaneous and dynamic emotional faces. European Journal
of Social Psychology, 37, 1325–1345.
Vinter, A. (1986). The role of movement in eliciting early imitations. Child
Development, 57, 66–71.
Wallraven, C., Breidt, M., Cunningham, D. W., & Bülthoff, H. H. (2008).
Evaluating the perceptual realism of animated facial expressions. ACM
Transactions on Applied Perception, 4, 1–20.
Wehrle, T., Kaiser, S., Schmidt, S., & Scherer, K. R. (2000). Studying
the dynamics of emotional expression using synthesized facial muscle movements. Journal of Personality and Social Psychology, 78,
105–119.
Weyers, P., Mühlberger, A., Hefele, C., & Pauli, P. (2006). Electromyographic
responses to static and dynamic avatar emotional facial expressions.
Psychophysiology, 43, 1–4.
With, S., & Kaiser, S. (2011). Sequential patterning of facial actions in the
production and perception of emotional expressions. Swiss Journal of
Psychology, 70, 241–252.
Yoshikawa, S., & Sato, W. (2008). Dynamic facial expressions of emotion
induce representational momentum. Cognitive, Affective, & Behavioral
Neuroscience, 8, 25–31.
