Swiss Journal of Psychology, 70 (4), 2011, 223–231


Original Communication

How Mood States Affect Information Processing During Facial Emotion Recognition: An Eye Tracking Study
Petra C. Schmid¹, Marianne Schmid Mast¹, Dario Bombari², Fred W. Mast², and Janek S. Lobmaier²

¹ Department of Work and Organizational Psychology, University of Neuchatel, Switzerland
² Department of Psychology, University of Bern, Switzerland

Abstract. Existing research shows that a sad mood hinders emotion recognition. More generally, it has been shown that mood affects
information processing. A happy mood facilitates global processing and a sad mood boosts local processing. Global processing has been
described as the Gestalt-like integration of details; local processing is understood as the detailed processing of the parts. The present
study investigated how mood affects the use of information processing styles in an emotion recognition task. Thirty-three participants
were primed with happy or sad moods in a within-subjects design. They performed an emotion recognition task during which eye
movements were registered. Eye movements served to provide information about participants’ global or local information processing
style. Our results suggest that when participants were in a happy mood, they processed information more globally compared to when
they were in a sad mood. However, global processing was only positively and local processing only negatively related to emotion
recognition when participants were in a sad mood. When they were in a happy mood, processing style was not related to emotion
recognition performance. Our findings clarify the mechanism that underlies accurate emotion recognition, which is important when one
is aiming to improve this ability (e.g., via training).

Keywords: emotion recognition, information processing, mood, gender

Introduction

Facial emotional expressions are important cues in social communication. Facial expressions coordinate social interactions by providing information about the sender's emotional and intentional states and/or the nature of the relationship between the two interaction partners. Although the ability to correctly recognize other people's emotions is crucial to experiencing positive social interactions (Keltner & Kring, 1998), the mechanism underlying accurate emotion recognition is still unclear. We know that, in general, people reliably recognize emotions from faces, yet this ability seems to vary between individuals (Herba & Phillips, 2004; Moore, 2001).

A large body of research reports that people suffering from affective disorders such as depression or mania show decreased emotion recognition accuracy (e.g., Bouhuys, Geerts, & Gordijn, 1999; Heimberg, Gur, Erwin, Shtasel, & Gur, 1992; Lembke & Ketter, 2002; Surguladze et al., 2004; Zuroff & Colussy, 1986). Not only affective disorders, but also normal variations in mood states in healthy people, have consequences for the perception of emotional expressions. Sad, compared to happy, people tend to perceive more sadness and less happiness in faces (Bouhuys, Bloem, & Groothuis, 1995; Niedenthal, Halberstadt, Margolin, & Innes-Ker, 2000). Such mood-congruity effects have also been reported for emotion recognition accuracy: Sad people showed a decrease in recognition accuracy for happy emotions, whereas happy people showed a decrease in recognition accuracy for sad emotions (Schmid & Schmid Mast, 2010). Schmid and Schmid Mast's stimulus material only included happy and sad facial expressions. Chepenik and colleagues (Chepenik, Cornew, & Farah, 2007) used stimulus material consisting of neutral, happy, sad, angry, and fearful facial expressions. They found that a sad mood, compared to a neutral mood, had a detrimental effect on overall emotion recognition accuracy (no mood-congruity effects). Because there was no happy mood condition in the Chepenik et al. (2007) study, we do not know how a happy mood influences emotion recognition accuracy when more than two emotional expressions are included. Moreover, the mechanism underlying how mood affects emotion recognition accuracy remains poorly understood. Our goal was to obtain a more refined understanding of how correct emotion recognition works.


In the present paper, we test the assumption that emotion recognition accuracy depends on the use of global versus local information processing styles and that people in a happy or a sad mood use different information processing styles.

Previous studies have reported that women usually outperform men in emotion recognition tasks (McClure, 2000, for a meta-analysis). Here, we test the assumption that this gender difference may be explained by different processing styles. These predictions are outlined in the next sections.

Information Processing During Emotion Recognition

Previous research suggests that when people are recognizing emotions, they process faces globally rather than locally, meaning that the face is more likely to be processed as a whole than in a piecemeal fashion. Calder, Young, Keane, and Dean (2000) cut faces expressing different emotions into two halves. The cut was made horizontally below the eyes, dividing the face into an upper and a lower part. The upper part of one face was then combined with the lower part of another face in such a manner that the upper and the lower face part contained different emotional expressions. The upper and lower parts were either aligned so that the whole represented a new, intact face (composite), or the two face parts were presented shifted apart on the horizontal axis (noncomposite). Participants were instructed to focus on either the upper or the lower part of the composite or noncomposite faces and to name the emotion expressed in the respective face part (the other face part was to be ignored). Participants took longer to recognize the emotions when the faces were composite than when they were noncomposite. The authors concluded that participants were slower in the composite condition because they processed the relevant and irrelevant parts as a whole, and the irrelevant part thus interfered with the recognition of the emotion in the relevant half. This finding can be seen as evidence of a global information processing style when recognizing emotions.

Further support for the assumption that a global information processing style is important for emotion recognition comes from studies on the face inversion effect. Prkachin (2003) showed that emotional expressions were more difficult to recognize when the faces were inverted than when they were upright. Given that several studies have shown that inverting a face hampers configural processing (e.g., Freire, Lee, & Symons, 2000; Searcy & Bartlett, 1996; Yin, 1969), inverted facial expressions of emotion might have been recognized less easily because configural – or global – processing was not possible. Searcy and Bartlett (1996) argued that configural information is comparable to global information, as configural information refers to spatiorelational information in faces, such as the intereye distance or the distance between the nose and the mouth. Similar to global processing, perceiving configural information allows one to recognize large coherent units or global forms (Brosnan, Scott, Fox, & Pye, 2004). The face inversion effect found in emotion recognition therefore supports the idea that a global processing style aids emotion recognition.

It is possible that one processing style may be more appropriate than the other for one emotion, but not for another. McKelvie (1995) found that the recognition of anger, disgust, surprise, sadness, and fear was hindered when the faces were inverted; only the recognition of happiness was not affected by face inversion. Prkachin (2003) showed that the recognition of anger, disgust, and fear was more affected by face inversion than the recognition of happiness, sadness, and surprise. Configural information might therefore play an important role in the recognition of most emotions. Happiness seems to be an exception, as it was still recognized accurately after configural or global information had been reduced. An explanation for this might be that happy faces are generally easier to recognize than other emotions and that very little information is needed for recognition (happy face advantage; e.g., Leppänen & Hietanen, 2004). Indeed, detecting a smiling mouth may suffice to recognize happiness.

Mood Effects on Information Processing

The affect-as-information theory (Clore et al., 2001; Schwarz, 1990) suggests that people in a sad mood process information more deliberately and search for specific information before making a judgment. People in a happy mood use a more automatic or heuristic information processing style, and judgments are made on the basis of an overall impression. This theory is supported by empirical findings (De Vries, Holland, & Witteman, 2008; Forgas, 1992). Gasper and Clore (2002) assumed that a differential use of information processing styles has consequences for global versus local processing of visual stimuli. The authors asked happy and sad participants to perform the global-local focus test introduced by Kimchi and Palmer (1982). This test contains hierarchical figures consisting of small triangles or squares (local attributes) forming a larger triangle or square (global attribute). A standard figure was presented simultaneously with two other figures. In one of these figures, the global attribute was the same as the standard figure and, in the other figure, the local attribute was the same as the standard figure. The task was to indicate which of the two figures was more similar to the standard figure. Results showed that sad people were more likely to rely on the local attributes (and, consequently, less likely to rely on the global attributes) than happy people. This finding is consistent with the idea that a happy mood triggers global information processing whereas a sad mood triggers local information processing.


Gender Effects

Women typically outperform men in emotion recognition accuracy (McClure, 2000, for a meta-analysis). Moreover, there is evidence that men and women process information differently when recognizing emotions. Hall, Witelson, Szechtman, and Nahmias (2003) investigated the brain activation of men and women during emotion recognition. They found less limbic activity and greater left-hemispheric frontal cortical activity in men than in women. According to Damasio (1994) and LeDoux (2000), the limbic system is associated with reactions to primary emotions (fast and innate reactions to threatening stimuli), while prefrontal activation is associated with reactions to secondary emotions (learned and controlled reactions). Referring to these findings, Hall et al. argued that men use a less automatic and more analytic information processing style than women when recognizing emotions. Because automatic processing is triggered by global processing and analytic processing by local processing (Förster & Dannenberg, 2010), one might conclude that men generally tend to engage more in item-specific (local) processing and women more in relational (global) processing (e.g., Gilligan, 1982; Putrevu, 2001). Hence, women might outperform men in emotion recognition because they use a (global) processing style that is beneficial for correct emotion recognition, whereas men use a less advantageous (local) processing style.

The Present Study

In the present paper, we examine whether the use of global versus local information processing styles influences the accuracy of emotion recognition and whether the relationship between processing style and accurate emotion recognition depends on the perceiver's mood and gender. Eye tracking measures were used to assess participants' information processing styles. We analyzed whether a global eye scan pattern (enabling an integrative, holistic impression of the face) or a local eye scan pattern (allowing a deeper analysis of the detailed facial information, such as the eyes, the mouth, and the nose) is more beneficial for accurate emotion recognition. The following predictions were tested: Consistent with the literature stressing the role of global processing in emotion recognition (Calder et al., 2000; Prkachin, 2003), we predicted that the more global the processing style, the better the emotion recognition performance (Hypothesis 1a). Conversely, the more local the processing style, the worse the performance on the emotion recognition task would be (Hypothesis 1b). Further, we expected participants in a happy mood to use a more global processing style (Hypothesis 2a) and a less local processing style (Hypothesis 2b) than participants in a sad mood (e.g., De Vries et al., 2008; Gasper & Clore, 2002; Schwarz, 1990). Because we hypothesized that participants in a happy mood use the more beneficial, global processing style (e.g., Gasper & Clore, 2002), we predicted that participants in a happy mood would be more accurate than those in a sad mood (Hypothesis 3). Based on the meta-analysis by McClure (2000), we hypothesized that women would recognize emotions more accurately than men (Hypothesis 4). If the mechanism explaining the gender difference in emotion recognition indeed lies in the different information processing styles of women and men (Hall et al., 2003), we would expect women to process information more globally (Hypothesis 5a) and less locally (Hypothesis 5b) than men. Finally, we tested whether there were differences between the different emotional expressions without putting forward hypotheses, because of the lack of consistency in empirical findings (McKelvie, 1995; Prkachin, 2003).

Method

Participants

Participants were 34 university students (15 male, 19 female). One male participant did not come back for the second testing session and was excluded from the analyses. Because of calibration problems, we had to remove four female participants from the analyses of the eye tracking measures, resulting in N = 29 (14 male, 15 female) participants. All participants were native French speakers recruited at the University of Lausanne (mean age = 23.18 years, SD = 3.43). They were remunerated with 30 Swiss francs. All participants reported normal or corrected-to-normal vision and were right-handed.

Stimuli and Materials

Mood Priming

Happy and sad moods were primed with short film scenes. A meta-analysis showed that film scenes are the most effective and ethically sound way of priming mood (Westermann, Spies, Stahl, & Hesse, 1996). Happy mood was primed with a scene from the movie "When Harry Met Sally" in which the actress simulates an orgasm. Sad mood was primed with a scene from the movie "The Champ" in which a little boy cries at his father's deathbed. As the participants were French speaking, we presented the French versions of these film scenes, which have also been shown to effectively prime mood states (Schaefer, Nils, Sanchez, & Philippot, 2010). The film scenes were chosen based on the validation and recommendation of Rottenberg, Ray, and Gross (2007), and both scenes were 2.46 min long.

Emotional Faces

We used stimuli from the DANVA 2-AF (Nowicki & Duke, 1994) and the Ekman and Friesen (1976) series.


The DANVA consists of 24 pictures of emotional facial expressions: happiness, fear, anger, and sadness. These 24 stimuli were combined with 40 stimuli from the Ekman and Friesen series of basic emotions expressing the same four emotions. The stimulus sets were divided into two subsets (one for each of the two testing sessions). Each subset consisted of 12 stimuli from the DANVA and 20 stimuli from the Ekman and Friesen series, containing equal numbers of pictures of each emotion. The pictures were 14 × 21 cm in size, so the visual angle subtended approximately 13° × 20°. Because we were only interested in participants' eye scan patterns while they were looking at a face, we removed irrelevant features such as hair and clothing from the stimulus faces. The modified pictures consisted of a standardized oval shape containing only the face (see Figure 1).

Figure 1. A selection of the modified stimuli used in the emotion recognition task (left: fearful expression; right: happy expression).
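As a plausibility note on the reported geometry: the stated visual angles are consistent with a viewing distance of roughly 60 cm, a value we infer for illustration; the paper does not state the distance. A minimal sketch:

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle (degrees) subtended by a stimulus at a given viewing distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

# At an assumed viewing distance of ~60 cm (not stated in the paper), a
# 14 x 21 cm picture subtends about 13.3 x 19.9 degrees, matching the
# approximately 13 x 20 degrees reported above.
print(round(visual_angle_deg(14, 60), 1), round(visual_angle_deg(21, 60), 1))
```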
press which emotion the presented face expressed. Each
Eye Tracking

To register eye movements, we used a Hi-Speed 1250 tracking system (SMI SensoMotoric Instruments, Teltow, Germany). Only the left eye was recorded, with a sampling rate of 1250 Hz and a spatial resolution of 0.01°. We only analyzed eye movements that were performed before the response to the facial expression was given, and only eye data of correctly recognized emotions were analyzed. We defined regions of interest for four face features: the left eye, the right eye, the nose, and the mouth. We used two complementary eye tracking measures – interfeatural saccade ratio and feature gaze duration – that have previously been used to identify global and local scan patterns during face processing (Bombari, Mast, & Lobmaier, 2009). Saccades that were performed between two regions of interest are referred to as interfeatural saccades; the durations of fixations performed within the regions of interest were added up and are referred to as feature gaze duration.

Task and Procedure

Participants were tested in two sessions on different days (1–2 days apart). On arrival at the first session, participants gave their informed consent. At the beginning of the first session, participants were familiarized with the four response keys (happy, sad, angry, and fearful) for the emotion recognition task. The participants learned the sequence of the keys by heart so that they would not have to look down at the keys during eye tracking. They were then assigned to one of two mood primings, happy or sad. Happy and sad mood priming were both achieved by showing the participants fragments of films. To check whether mood priming was successful, we asked the participants "How do you feel at this moment?" (on a 6-point Likert scale from 1 = extremely sad to 6 = extremely happy) after they had watched the films. Participants then performed Part 1 of the emotion recognition task, in which they indicated by button press which emotion the presented face expressed. Each face was presented in the center of the screen for 4 s, followed by a 5 s mask (a face shape). Participants were allowed to respond during stimulus exposure or during the mask period (the answer time frame thus totaled 9 s). Part 2 was presented in the second session, in which participants were primed with the alternative mood (sad, if they saw a happy film in Session 1, and happy, if they saw a sad film in Session 1). Participants' key presses and eye movements were recorded in both sessions.

Statistical Analyses

Accuracy

Participants' emotion recognition accuracy for the two test sets was assessed using a nonparametric measure of discriminability (A'). We used the formula suggested by Snodgrass, Levy-Berger, and Haydon (1985): A' = 1/2 + [(pHit − pFA)(1 + pHit − pFA)] / [4 · pHit · (1 − pFA)], in which pHit refers to the proportion of hits and pFA to the proportion of false alarms. Values range from 0 to 1, with a value of 0.5 indicating chance performance.
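To make the measure concrete, here is a minimal sketch of the Snodgrass, Levy-Berger, and Haydon (1985) formula in Python; the function name and example values are ours, for illustration only:

```python
def a_prime(p_hit: float, p_fa: float) -> float:
    """Nonparametric discriminability A' (Snodgrass, Levy-Berger, & Haydon, 1985).

    p_hit: proportion of hits; p_fa: proportion of false alarms.
    Returns a value between 0 and 1, where 0.5 indicates chance performance.
    The published formula assumes p_hit >= p_fa, 0 < p_hit, and p_fa < 1.
    """
    return 0.5 + ((p_hit - p_fa) * (1 + p_hit - p_fa)) / (4 * p_hit * (1 - p_fa))

# Illustrative values only (not data from the study):
print(round(a_prime(0.80, 0.10), 3))  # -> 0.913
```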
Eye Movements

Interfeatural Saccade Ratio

We calculated the ratio of the number of interfeatural saccades to the total number of saccades performed (in order to correct for the varying exposure times that result from the different reaction times). The higher the score on this ratio, the more global the information processing style (i.e., the more information is integrated to form an overall impression).
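A minimal sketch of this ratio, assuming each saccade is labeled with the region of interest (ROI) it starts from and lands on; the data structure is our illustration, not the authors' code:

```python
def interfeatural_saccade_ratio(saccades: list[tuple[str, str]]) -> float:
    """Proportion of saccades that move between two different ROIs.

    Each saccade is a (start_roi, end_roi) pair, e.g., ("left_eye", "mouth").
    Dividing by the total saccade count corrects for varying exposure times.
    """
    if not saccades:
        return 0.0
    interfeatural = sum(1 for start, end in saccades if start != end)
    return interfeatural / len(saccades)

# Illustrative trial: 2 of 3 saccades cross between features -> 0.67
trial = [("left_eye", "right_eye"), ("right_eye", "right_eye"), ("nose", "mouth")]
print(round(interfeatural_saccade_ratio(trial), 2))
```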

Feature Gaze Duration

Feature gaze duration refers to the mean time spent looking at the features. Consecutive fixations that were performed within the same feature were first summed up, and then the mean was calculated over all features. Fixations shorter than 100 ms were not taken into account. Longer feature gaze duration indicates more local processing.
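A sketch of this measure under our reading of the description – short fixations are discarded, consecutive fixations on the same feature are merged into one gaze episode, and the mean is taken over the merged episodes; variable names and the example scan path are ours:

```python
def feature_gaze_duration(fixations: list[tuple[str, float]]) -> float:
    """Mean gaze duration per feature visit.

    Each fixation is a (roi, duration_ms) pair in temporal order. Fixations
    shorter than 100 ms are discarded; consecutive fixations within the same
    ROI are summed into one gaze episode; the mean is taken over episodes.
    Longer values indicate more local processing.
    """
    kept = [(roi, dur) for roi, dur in fixations if dur >= 100]
    episodes: list[float] = []
    last_roi = None
    for roi, dur in kept:
        if roi == last_roi:
            episodes[-1] += dur  # extend the current gaze episode
        else:
            episodes.append(dur)  # start a new gaze episode
            last_roi = roi
    return sum(episodes) / len(episodes) if episodes else 0.0

# Illustrative scan path: two fixations on the nose merge into one episode,
# and the 90 ms fixation is dropped -> (330 + 220) / 2 = 275.0
scan = [("nose", 180.0), ("nose", 150.0), ("mouth", 220.0), ("left_eye", 90.0)]
print(feature_gaze_duration(scan))
```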


To test for relations between information processing and emotion recognition accuracy (Hypothesis 1), we computed correlations between the interfeatural saccade ratio, feature gaze duration, and sensitivity A'. We further calculated mixed model ANCOVAs with gender as the between-subjects factor and mood priming (happy vs. sad mood) as the within-subjects factor, and with either the interfeatural saccade ratio (global processing), feature gaze duration (local processing), or sensitivity A' (emotion recognition accuracy) as the dependent variable (Hypotheses 2–5). Mood priming order and test part order were entered as covariates in the analyses.

To test for differential effects on the four facial expressions, we separately calculated three 2 (Mood Priming: happy vs. sad) × 2 (Gender) × 4 (Facial Expression: happiness, sadness, anger, fear) mixed model ANCOVAs for sensitivity (A'), feature gaze duration, and the interfeatural saccade ratio as the dependent variables. Mood priming order and test part order were again entered as covariates in the analyses. Correlations of sensitivity (A') with feature gaze duration and the interfeatural saccade ratio were also calculated separately for each of the four stimulus emotions for participants in a happy and a sad mood.

Results

Table 1 presents the means and standard deviations of emotion recognition accuracy, interfeatural saccade ratio, and feature gaze duration for men and women separately and for happy and sad moods separately. The overall emotion recognition accuracy A' was .90 (SD = .04).

Table 1
Means (standard deviations) for men's and women's emotion recognition sensitivity A', interfeatural saccade ratio, and feature gaze duration in happy and sad moods

                                 Happy mood                      Sad mood
                                 Men              Women          Men              Women
Sensitivity A'                   .894 (.053)      .907 (.049)    .872 (.047)      .914 (.032)
Interfeatural saccade ratio      .339 (.214)      .458 (.143)    .364 (.190)      .417 (.202)
Feature gaze duration (ms)       602.32 (444.45)  357.05 (62.21) 621.86 (482.63)  367.88 (68.74)

Manipulation Checks

One-sample t-tests against 3.5 (the neutral midpoint of the scale) showed that we successfully manipulated participants' mood. Participants felt happy after happy mood priming (M = 4.35), t(30) = 8.64, p < .001, and sad after sad mood priming (M = 3.06), t(32) = 3.83, p = .001.
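Such a manipulation check can be reproduced with a standard one-sample t-test against the scale midpoint; a minimal sketch using SciPy, with invented ratings rather than the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical mood ratings on the 6-point scale (not the study's data).
ratings_after_happy_film = np.array([5, 4, 5, 4, 4, 5, 3, 5, 4, 4])

# One-sample t-test against the scale midpoint of 3.5 (neutral mood).
t_stat, p_value = stats.ttest_1samp(ratings_after_happy_film, popmean=3.5)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```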

Relation Between Information Processing Styles and Emotion Recognition Accuracy

The interfeatural saccade ratio (global processing) correlated positively with sensitivity A' (emotion recognition accuracy) when participants were primed with a sad mood, r(29) = .41, p = .03, but not when they were primed with a happy mood, r(29) = .17, p = .39. Longer feature gaze duration (local processing) was related to decreased sensitivity A', r(29) = –.44, p = .02, when participants were primed with a sad mood. However, no relation was found for participants primed with a happy mood, r(29) = .09, p = .66. Note that there were negative correlations between the interfeatural saccade ratio and feature gaze duration for the happy mood, r(29) = –.50, p < .01, and for the sad mood, r(29) = –.57, p < .01.

Mood Effects

Mood priming had no effect on sensitivity A', F(1, 29) = 0.99, p = .32, but there was a significant main effect of mood priming on the interfeatural saccade ratio (global processing), F(1, 25) = 6.41, p = .02, indicating that happy mood priming resulted in more global processing (M = .40) than sad mood priming (M = .39). Mood priming had no effect on feature gaze duration (local processing), F(1, 25) = 0.02, p = .89.

Gender Effects

The main effect of gender for sensitivity A' showed that women performed better than men (M = .91 for women and M = .88 for men), F(1, 29) = 3.90, p = .05. A trend was found for the interfeatural saccade ratio, F(1, 25) = 3.09, p = .09, showing that women tended to process information more globally (M = .45) than men (M = .34). Moreover, gender significantly affected feature gaze duration, F(1, 25) = 7.99, p < .01; women processed information less locally (M = 320.40 ms) than men (M = 657.20 ms).

Mood × Gender Interactions

We found a significant mood priming by gender interaction for sensitivity A', F(1, 29) = 3.85, p = .05. Men recognized emotions better when they were in a happy mood (M = .90) than when in a sad mood (M = .87), t(13) = 1.82, p = .09. Women's mood did not affect their emotion recognition accuracy (M = .91 when they were in a happy mood and M = .91 when in a sad mood), t(18) = 0.624, p = .54. The mood priming by gender interactions for the interfeatural saccade ratio and feature gaze duration were not significant, Fs < 0.11, ps > .74.


Facial Expression-Specific Analyses

No significant facial expression-specific mood priming or gender effects were found on the three dependent variables (sensitivity A', interfeatural saccade ratio, feature gaze duration), all Fs < 2.54, all ps > .06. Feature gaze duration correlated negatively with sensitivity A' for fearful faces when participants were in a sad mood, r(29) = –.48, p = .01. Shorter feature gaze duration was, therefore, related to better recognition accuracy for fearful faces when participants were primed with a sad mood. The interfeatural saccade ratio was negatively related to sensitivity A' for happy faces when participants were in a happy mood, r(29) = –.36, p = .05, meaning that the fewer interfeatural saccades participants in a happy mood performed, the better they recognized the happy emotions. None of the other correlations were significant (all ps > .13).

Effects of Covariates

The only significant effects for the covariates were an interaction of mood priming by mood priming order for the interfeatural saccade ratio, F(1, 25) = 10.53, p = .003, and a mood priming order main effect for feature gaze duration, F(1, 25) = 4.62, p = .041.

Discussion

In this study, we investigated whether mood states, gender, and information processing styles influence emotion recognition accuracy. We expected a positive correlation between global information processing and emotion recognition accuracy and a negative correlation between local processing and emotion recognition accuracy (Hypotheses 1a and 1b). These hypotheses were confirmed only for the sad, but not for the happy, mood priming condition. As expected, a happy mood resulted in more global information processing than a sad mood (Hypothesis 2a). However, contrary to our expectation, a happy mood did not entail less local processing than a sad mood (Hypothesis 2b). We had predicted that participants would recognize emotions more accurately when in a happy mood than in a sad mood (Hypothesis 3), and our results confirmed this hypothesis for men but not for women. As expected, we found that women outperformed men on the emotion recognition task (Hypothesis 4). In line with our expectations, women tended to process information more globally (Hypothesis 5a) and processed information significantly less locally than men (Hypothesis 5b).

We showed that a global information processing style is more favorable for emotion recognition than a local information processing style. However, this link could only be found when participants were primed with a sad mood, not when they were primed with a happy mood. Existing research on the composite effect in emotion recognition has shown that global face information is difficult to ignore (Calder et al., 2000). Research on the face inversion effect suggests that hindering global information processing decreases emotion recognition accuracy (Prkachin, 2003). Our study further supports the assumption that global information processing is important for emotion recognition by using an eye tracking method. Unlike Calder et al. (2000) and Prkachin (2003), we did not manipulate the information in the faces, but looked at whether participants' eye scan patterns were related to accuracy. Using upright, nonmanipulated faces as stimuli is a strength of our study, as it provides more ecological validity.

We believe that global processing was beneficial for emotion recognition only when participants were sad, and not when they were happy, because happy people, in general, engage in more global processing (Gasper & Clore, 2002). Therefore, if our happy mood priming led participants to use a global processing style, then we might have produced a ceiling effect, explaining the lack of a correlation between global processing and accuracy when participants were happy. Note that we did, indeed, find that participants processed more globally after being primed with a happy mood than after being primed with a sad mood. It is already well established that a happy mood triggers global and automatic processing styles and a sad mood local and analytic processing styles in different domains (Bless et al., 1996; De Vries et al., 2008; Gasper & Clore, 2002). Moreover, it has often been argued that information processing style is responsible for the effects of sad or depressed mood on emotion recognition and interpersonal judgments in general (Ambady & Gray, 2002; Chepenik et al., 2007; Forgas, 1992). An alternative explanation for the lack of significant findings in the happy mood condition might be that the happy mood manipulation was weaker than the sad mood manipulation. Some mood induction procedures are known to produce mood effects that differ in strength; however, this is usually not the case for the priming method we used, which employs happy and sad mood film scenes (for a review, see Gerrards-Hesse, Spies, & Hesse, 1994; see also Gross & Levenson, 1995).

We failed to show a direct link between mood and emotion recognition accuracy in the present study. Previous studies either found mood-congruity effects or decreased emotion recognition accuracy for people in a sad mood. What might account for our nonsignificant finding? Because we used a wider range of facial expressions (happy, sad, fearful, and angry expressions) than previous studies, we may have washed out specific effects.


For example, Schmid and Schmid Mast (2010), who found mood-congruity effects, only included sad and happy facial expressions as stimuli. Chepenik et al. (2007) used a stimulus set similar to the one used in the present study and found that a sad mood decreased emotion recognition compared to a neutral mood condition. However, the happy mood was not manipulated in the Chepenik et al. study, and it is possible that both happy and sad moods have detrimental effects on emotion recognition accuracy.

One limitation of the present study is the lack of a control group in a neutral mood; we cannot, therefore, determine whether both happy and sad moods affected performance negatively. But we can speculate about why participants did not perform better when in a happy mood (when they processed more globally, as we showed) than when in a sad mood. Detrimental effects of mood on task performance have often been explained with cognitive load theory (Sweller, 1988): The experience of emotions (positive and negative) might impose cognitive load and affect working memory (Um, Song, & Plass, 2007). For example, Wegner, Erber, and Zanakos (1993) showed that the regulation of mood states after positive and negative mood priming requires cognitive resources that cannot be invested in other tasks. Cognitive load is known to decrease emotion recognition accuracy (Phillips, Channon, Tunstall, Hedenstrom, & Lyons, 2008). In our experiment, happy participants performed more global eye movements and, therefore, gained more reliable information than participants in a sad mood, but if cognitive load was high (because participants were regulating their mood), perhaps they were not able to take advantage of this information.

In addition to the mood effects on information processing, we predicted that gender would affect information processing. In line with the literature (McClure, 2000), women outperformed men in our task. Previous research has shown that men and women differ in brain activation during emotion recognition, suggesting that they use different processing styles (Hall et al., 2003). However, no study had yet examined whether men and women differ in the way they look at the stimulus. We showed that women look at faces in a different way, using a more global pattern of scanning. This suggests that gender differences in information processing appear not only in brain data, but also on a behavioral level when participants are scanning faces.

The present study helps us to better understand the mechanisms underlying accurate emotion recognition. The ability to recognize other people's emotions is important for personal and professional success and well-being (Byron, Terranova, & Nowicki, 2007; Carton, Kessler, & Pape, 1999). Thus, the current findings have important applied implications: Understanding the mechanism and moderating factors of correct emotion recognition can help us design effective training techniques aimed at improving emotion recognition skills.

References

Ambady, N., & Gray, H. M. (2002). On being sad and mistaken: Mood effects on the accuracy of thin-slice judgments. Journal of Personality and Social Psychology, 83, 947–961. doi 10.1037/0022-3514.83.4.947
Bless, H., Clore, G. L., Schwarz, N., Golisano, V., Rabe, C., & Wölk, M. (1996). Mood and the use of scripts: Does a happy mood really lead to mindlessness? Journal of Personality and Social Psychology, 71, 665–679. doi 10.1037/0022-3514.71.4.665
Bombari, D., Mast, F. W., & Lobmaier, J. S. (2009). Featural, configural, and holistic face processing strategies evoke different scan patterns. Perception, 38, 1508–1521. doi 10.1068/p6117
Bouhuys, A. L., Bloem, G. M., & Groothuis, T. G. G. (1995). Induction of depressed and elated mood by music influences the perception of facial emotional expressions in healthy subjects. Journal of Affective Disorders, 33, 215–226. doi 10.1016/0165-0327(94)00092-N
Bouhuys, A. L., Geerts, E., & Gordijn, M. C. M. (1999). Depressed patients' perception of facial emotions in depressed and remitted states are associated with relapse: A longitudinal study. The Journal of Nervous and Mental Disease, 187, 595–602.
Brosnan, M. J., Scott, F. J., Fox, S., & Pye, J. (2004). Gestalt processing in autism: Failure to process perceptual relationships and the implications for contextual understanding. Journal of Child Psychology and Psychiatry, 45, 459–469. doi 10.1111/j.1469-7610.2004.00237.x
Byron, K., Terranova, S., & Nowicki, S. J. (2007). Nonverbal emotion recognition and salespersons: Linking ability to perceived and actual success. Journal of Applied Social Psychology, 37, 2600–2619. doi 10.1111/j.1559-1816.2007.00272.x
Calder, A. J., Young, A. W., Keane, J., & Dean, M. (2000). Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance, 26, 527–551. doi 10.1037/0096-1523.26.2.527
Carton, J. S., Kessler, E. A., & Pape, C. L. (1999). Nonverbal decoding skills and relationship well-being in adults. Journal of Nonverbal Behavior, 23, 91–100. doi 10.1023/A:1021339410262
Chepenik, L. G., Cornew, L. A., & Farah, M. J. (2007). The influence of sad mood on cognition. Emotion, 7, 802–811. doi 10.1037/1528-3542.7.4.802
Clore, G. L., Wyer, R. S., Dienes, B., Gasper, K., Gohm, C., & Isbell, L. (2001). Affective feelings as feedback: Some cognitive consequences. In L. L. Martin & G. L. Clore (Eds.), Theories of mood and cognition: A user's guide (pp. 27–62). Mahwah, NJ: Erlbaum.
Damasio, A. R. (1994). Descartes' error: Emotion, reason, and the human brain. New York: HarperCollins.
De Vries, M., Holland, R. W., & Witteman, C. L. (2008). Fitting decisions: Mood and intuitive versus deliberative decision strategies. Cognition and Emotion, 22, 931–943. doi 10.1080/02699930701552580
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Forgas, J. P. (1992). On mood and peculiar people: Affect and person typicality in impression formation. Journal of Personality and Social Psychology, 62, 863–875. doi 10.1037/0022-3514.62.5.863


Förster, J., & Dannenberg, L. (2010). GLOMOsys: A systems account of global versus local processing. Psychological Inquiry, 21, 175–197. doi 10.1080/1047840X.2010.487849
Freire, A., Lee, K., & Symons, L. A. (2000). The face-inversion effect as a deficit in the encoding of configural information: Direct evidence. Perception, 29, 159–170. doi 10.1068/p3012
Gasper, K., & Clore, G. L. (2002). Attending to the big picture: Mood and global versus local processing of visual information. Psychological Science, 13, 34–40. doi 10.1111/1467-9280.00406
Gerrards-Hesse, A., Spies, K., & Hesse, F. W. (1994). Experimental inductions of emotional states and their effectiveness: A review. British Journal of Psychology, 85, 55–78. doi 10.1111/j.2044-8295.1994.tb02508.x
Gilligan, C. (1982). In a different voice. Cambridge, MA: Harvard University Press.
Gross, J. J., & Levenson, R. W. (1995). Emotion elicitation using films. Cognition and Emotion, 9, 87–108.
Hall, G. B. C., Witelson, S. F., Szechtman, H., & Nahmias, C. (2003). Sex differences in functional activation patterns revealed by increased emotion processing demands. NeuroReport, 15, 219–223. doi 10.1097/01.wnr.0000101310.64109.94
Heimberg, C., Gur, R. E., Erwin, R. J., Shtasel, D. L., & Gur, R. C. (1992). Facial emotion discrimination: III. Behavioral findings in schizophrenia. Psychiatry Research, 42, 253–265. doi 10.1016/0165-1781(92)90117-L
Herba, C., & Phillips, M. (2004). Annotation: Development of facial expression recognition from childhood to adolescence: Behavioural and neurological perspectives. Journal of Child Psychology and Psychiatry, 45, 1185–1198. doi 10.1111/j.1469-7610.2004.00316.x
Keltner, D., & Kring, A. (1998). Emotion, social function, and psychopathology. Review of General Psychology, 2, 320–342. doi 10.1037/1089-2680.2.3.320
Kimchi, R., & Palmer, S. E. (1982). Form and texture in hierarchically constructed patterns. Journal of Experimental Psychology: Human Perception and Performance, 8, 521–535. doi 10.1037/0096-1523.8.4.521
LeDoux, J. E. (2000). Emotion circuits in the brain. Annual Review of Neuroscience, 23, 155–184. doi 10.1146/annurev.neuro.23.1.155
Lembke, A., & Ketter, T. A. (2002). Impaired recognition of facial emotion in mania. American Journal of Psychiatry, 159, 302–304. doi 10.1176/appi.ajp.159.2.302
Leppänen, J. M., & Hietanen, J. (2004). Positive facial expressions are recognized faster than negative facial expressions, but why? Psychological Research, 69, 22–29. doi 10.1007/s00426-003-0157-2
McClure, E. B. (2000). A meta-analytic review of sex differences in facial expression processing and their development in infants, children, and adolescents. Psychological Bulletin, 126, 424–453. doi 10.1037/0033-2909.126.3.424
McKelvie, S. J. (1995). Emotional expression in upside-down faces: Evidence for configural and componential processing. British Journal of Social Psychology, 34, 325–334. doi 10.1111/j.2044-8309.1995.tb01067.x
Moore, D. G. (2001). Reassessing emotion recognition performance in people with mental retardation: A review. American Journal of Mental Retardation, 106, 481–502. doi 10.1352/0895-8017(2001)106
Niedenthal, P. M., Halberstadt, J. B., Margolin, J., & Innes-Ker, A. H. (2000). Emotional state and the detection of change in facial expression of emotion. European Journal of Social Psychology, 30, 211–222. doi 10.1002/(SICI)1099-0992(200003/04)30:2<211::AID-EJSP988>3.0.CO;2-3
Nowicki, S. J., & Duke, M. P. (1994). Individual differences in the nonverbal communication of affect: The Diagnostic Analysis of Nonverbal Accuracy Scale. Journal of Nonverbal Behavior, 18, 9–35. doi 10.1007/BF02169077
Phillips, L. H., Channon, S., Tunstall, M., Hedenstrom, A., & Lyons, K. (2008). The role of working memory in decoding emotions. Emotion, 8, 184–191. doi 10.1037/1528-3542.8.2.184
Prkachin, G. C. (2003). The effect of orientation on detection and identification of facial expressions of emotion. British Journal of Psychology, 94, 45–62. doi 10.1348/000712603762842093
Putrevu, S. (2001). Exploring the origins and information processing differences between men and women: Implications for advertisers. Journal of Marketing Management, 10, 47–66.
Rottenberg, J., Ray, R. R., & Gross, J. J. (2007). Emotion elicitation using films. In J. A. Coan & J. J. B. Allen (Eds.), The handbook of emotion elicitation and assessment. New York: Oxford University Press.
Schaefer, A., Nils, F., Sanchez, X., & Philippot, P. (2010). Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers. Cognition and Emotion, 24, 1153–1172. doi 10.1080/02699930903274322
Schmid, P. C., & Schmid Mast, M. (2010). Mood effects on emotion recognition. Motivation and Emotion, 34, 288–292. doi 10.1007/s11031-010-9170-0
Schwarz, N. (1990). Feelings as information: Informational and motivational functions of affective states. In E. T. Higgins & R. Sorrentino (Eds.), Handbook of motivation and cognition: Foundations of social behavior (Vol. 2, pp. 527–561). New York: Guilford.
Searcy, J. H., & Bartlett, J. C. (1996). Inversion and processing of component and spatial-relational information in faces. Journal of Experimental Psychology: Human Perception and Performance, 22, 904–915. doi 10.1037/0096-1523.22.4.904
Snodgrass, J. G., Levy-Berger, G., & Haydon, M. (1985). Human experimental psychology. New York: Oxford University Press.
Surguladze, S. A., Young, A. W., Senior, C., Brébion, G., Travis, M. J., & Phillips, M. L. (2004). Recognition accuracy and response bias to happy and sad facial expressions in patients with major depression. Neuropsychology, 18, 212–218. doi 10.1037/0894-4105.18.2.212
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12, 257–285. doi 10.1207/s15516709cog1202_4
Um, E. R., Song, H., & Plass, J. (2007). The effect of positive emotions on multimedia learning. In C. Montgomerie & J. Seale (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2007 (pp. 4176–4185). Chesapeake, VA: AACE. Retrieved from http://www.editlib.org/p/25979
Wegner, D. M., Erber, R., & Zanakos, S. (1993). Ironic processes in the mental control of mood and mood-related thought. Journal of Personality and Social Psychology, 65, 1093–1104. doi 10.1037/0022-3514.65.6.1093
Westermann, R., Spies, K., Stahl, G., & Hesse, F. W. (1996). Relative effectiveness and validity of mood induction procedures: A meta-analysis. European Journal of Social Psychology, 26, 557–580. doi 10.1002/(SICI)1099-0992(199607)26:4<557::AID-EJSP769>3.0.CO;2-4


Yin, R. K. (1969). Looking at upside-down faces. Journal of Experimental Psychology, 81, 141–145. doi 10.1037/h0027474
Zuroff, D. C., & Colussy, S. A. (1986). Emotional recognition in schizophrenia and depressed inpatients. Journal of Clinical Psychology, 42, 411–416. doi 10.1002/1097-4679(198605)42:3<411::AID-JCLP2270420302>3.0.CO;2-T

Petra Claudia Schmid
Department of Work and Organizational Psychology
University of Neuchatel
Rue Emile-Argand 11
CH-2000 Neuchatel
Switzerland
petra.schmid@unine.ch
