
Cortex 60 (2014) 82–93

Available online at www.sciencedirect.com

ScienceDirect
Journal homepage: www.elsevier.com/locate/cortex

Special issue: Research report

How functional coupling between the auditory cortex and the amygdala induces musical emotion: A single case study

Catherine Liégeois-Chauvel a,b,*, Christian Bénar a,b, Julien Krieg a,b, Charles Delbé c, Patrick Chauvel a,b,d, Bernard Giusiano a,b,d and Emmanuel Bigand c

a INS INSERM, UMR U 1106, 13005 Marseille, France
b Aix-Marseille Université, 13005 Marseille, France
c LEAD UMR 5022 CNRS, Université de Bourgogne, 21065 Dijon, France
d Hôpitaux de Marseille, Hôpital de la Timone, 13005 Marseille, France

Article info

Article history:
Received 30 May 2013
Reviewed 8 December 2013
Revised 3 June 2014
Accepted 4 June 2014
Published online 16 June 2014

Keywords:
Human
Auditory cortex
Amygdala
Emotion
Music

Abstract

Music is a sound structure of remarkable acoustical and temporal complexity. Although it cannot denote specific meaning, it is one of the most potent and universal stimuli for inducing mood. How the auditory and limbic systems interact, and whether this interaction is lateralized when feeling emotions related to music, remains unclear. We studied the functional correlation between the auditory cortex (AC) and amygdala (AMY) through intracerebral recordings from both hemispheres in a single patient while she listened attentively to musical excerpts, which we compared to passive listening of a sequence of pure tones. While the left primary and secondary auditory cortices (PAC and SAC) showed larger increases in gamma-band responses than the right side, only the right side showed emotion-modulated gamma oscillatory activity. An intra- and inter-hemisphere correlation was observed between the auditory areas and AMY during the delivery of a sequence of pure tones. In contrast, a strikingly right-lateralized functional network between the AC and the AMY was observed to be related to the musical excerpts the patient experienced as happy, sad and peaceful. Interestingly, excerpts experienced as angry, which the patient disliked, were associated with widespread de-correlation between all the structures. These results suggest that the right auditory–limbic interactions result from the formation of oscillatory networks that bind the activities of the network nodes into coherence patterns, resulting in the emergence of a feeling.
© 2014 Elsevier Ltd. All rights reserved.

* Corresponding author. INSERM, UMR U 1106, Faculté de Médecine, 27, Bd Jean Moulin, 13005 Marseille, France.
E-mail address: catherine.liegeois-chauvel@univ-amu.fr (C. Liégeois-Chauvel).
http://dx.doi.org/10.1016/j.cortex.2014.06.002
0010-9452/© 2014 Elsevier Ltd. All rights reserved.

1. Introduction

Music is a sound structure of remarkable acoustical and temporal complexity and, although it cannot denote specific meaning, it is one of the most potent and universal stimuli for inducing mood, evoking comparable emotional responses across different musical categories and cultures (Peretz & Hebert, 2000; Fritz et al., 2009). The induced affective states can be strong and long-lasting (Krumhansl, 1997), and very short stimuli may be sufficient to trigger emotional responses (stimuli as short as 250 msec; Bigand, Filipic, & Lalitte, 2005). Understanding how emotion is induced by listening to music remains a matter of ongoing study. Juslin and collaborators (Juslin, 2013; Juslin & Västfjäll, 2008) have proposed a comprehensive framework taking into account several physiological mechanisms underlying emotions induced by music, the BRECVEMA model (Brainstem reflexes, Rhythmic entrainment, Evaluative conditioning, emotional Contagion, Visual imagery, Episodic memory, Musical expectancy, Aesthetic judgment). In addition, the phenomenon of entrainment should be mentioned as an important candidate for emotion induction through music, with recent empirical evidence showing a strong link between the feeling of entrainment and emotion (Labbé & Grandjean, in press). From another angle, it is largely acknowledged that the acoustic feature content of a musical excerpt covaries with its emotional valence. Converging evidence shows that musical features such as tempo (the speed of the beat) and mode (major or minor) are relevant in evoking a sense of happiness or sadness (Dalla Bella et al., 2001; Khalfa, Schön, Anton, & Liégeois-Chauvel, 2005) and are associated with specific brain signatures (Trochidis & Bigand, 2013). Numerous studies have contributed to the identification of brain areas in the right and left hemispheres that seem likely to support some stage of the perceptive or emotional processing of music (for reviews see Peretz & Zatorre, 2005; Stewart, von Kriegstein, Warren, & Griffiths, 2006).

With respect to auditory processing, it has been argued that the auditory cortices in the two hemispheres are relatively specialized: the right auditory cortex (AC) underlies fine-grained pitch (frequency) processing, whereas the left AC processes rapidly changing broadband stimuli (Liégeois-Chauvel, de Graaf, Laguitton, & Chauvel, 1999; Liégeois-Chauvel, Giraud, Badier, Marquis, & Chauvel, 2001; Warrier & Zatorre, 2004; Zatorre & Belin, 2001; Zatorre, Belin, & Penhune, 2002). In addition, brain imaging, electrophysiological and neuropsychological studies have established that secondary AC (particularly in the right hemisphere) is associated with the processing of contour properties of unfamiliar melodies (Behne, Scheich, & Brechmann, 2005; Brechmann & Scheich, 2005; Brattico, Tervaniemi, Näätänen, & Peretz, 2006). In contrast, primary AC (PAC) only processes features of isolated sounds (Samson & Zatorre, 1988; Johnsrude, Penhune, & Zatorre, 2000; Patterson, Uppenkamp, Johnsrude, & Griffiths, 2002).

Music-evoked emotional activations involve brain regions known to also be activated by emotional pictures and reward/motivation situations. These regions include limbic [amygdala (AMY)] and paralimbic structures (orbitofrontal cortex, parahippocampal gyrus), as well as the medioventral prefrontal cortex, anterior cingulate cortex and the ventral striatum (Blood & Zatorre, 2001; Levitin & Menon, 2003; Menon & Levitin, 2005; Gosselin et al., 2006; Pereira et al., 2011; Schmithorst & Holland, 2003). Strong AMY activation is reported for negative emotions (Koelsch, Fritz, von Cramon, Müller, & Friederici, 2006; Mitterschiffthaler, Fu, Dalton, Andrew, & Williams, 2007) and for both positive and negative stimuli (Eldar, Ganor, Admon, Bleich, & Hendler, 2007). Yet, the modulation of AMY activity is more likely to be observed during a subject's emotional experiences, or when emotional music is combined with either movies or affective pictures, suggesting that the involvement of this region depends on the context of the subjective emotional experience associated with the stimuli (Blood & Zatorre, 1999; Eldar et al., 2007; Baumgartner, Esslen, & Jäncke, 2006).

Up to now, few studies have investigated the neural mechanisms underlying how acoustical features of stimuli modulate emotional feeling by looking at the functional coupling between the AC and AMY. Recently, Kumar, Stephan, Warren, Friston, and Griffiths (2012) showed that aversive sounds are first processed in the AC, which can foresee the assignment of valence in the AMY. The AMY, in turn, can modulate the AC in accordance with the valence of the sounds. The AMY has extensive connections with many cortical and subcortical areas, including back projections to the AC, that may modulate sensory processing in these areas based on emotional signals (Amaral & Price, 1985; Price, 2003; Hoistad & Barbas, 2008). This is in line with studies showing an increase of AC activation in response to pleasant and/or unpleasant sounds compared to neutral sounds, which would suggest this modulation could be primed by the AMY through re-entrant projections (Grandjean et al., 2005; Plichta et al., 2011).

Emotion induced by music is a dynamic process; therefore, its evolution over time in relation to listening is more precisely tracked by electrophysiological recordings than by functional imaging. Simultaneous and bilateral depth electrode recordings from both AC and AMY in a single epileptic patient during presurgical evaluation allowed us to study the neural activity underlying this process with great spatio-temporal precision. We investigated the functional coupling between these structures by applying nonlinear correlations of intracerebral EEG while the patient listened either to musical excerpts conveying different emotions (happiness, sadness, peacefulness and anger) or to a sequence of pure tones (neutral stimuli). Although emotion evoked by music may be analyzed in numerous ways (see Zentner, Grandjean, & Scherer, 2008 for a full account), in the present study we used a more convenient and traditional way to evaluate the feelings experienced by the patient, which was more appropriate for the clinical setting.

The rationale behind this approach was to examine whether the AC–AMY coupling: (i) is influenced by the emotional valence judgments or by the complexity of the stimuli (music vs pure tones), and (ii) is modulated as a function of hemisphere.

2. Experimental procedures

2.1. Subject

CV, a 45-year-old French-speaking woman with a draughtsmanship's degree (10 years of education), participated in this study. She suffered from drug-resistant partial epilepsy and was implanted for pre-surgical investigation with chronic depth electrodes (stereotactic EEG, SEEG) in the right and left AC [i.e., Heschl's gyrus (primary and secondary AC (PAC and SAC)) and BA 22], as well as in the right and left AMY (Fig. 1; see Supplemental Material for details about the anatomical reconstruction of electrodes). These structures were not involved in the epileptogenic network.

The choice of the anatomical location of electrodes was based on clinical and video-EEG recordings, as well as MRI, and was made independently of the present study. The patient provided informed consent to the protocol, approved by the Institutional Review Board of the French Institute of Health. A standard Oldfield test indicated that she was ambidextrous. She displayed a left hemisphere specialization for language and normal memory scores (Wechsler Memory Scale III, QM = 100). Recordings of brainstem evoked potentials and pure-tone audiograms carried out before SEEG indicated intact cochlear and brainstem auditory functions.

2.2. Stimuli

2.2.1. Pure tones
Auditory evoked potential recordings are part of the functional mapping procedure that aims to characterize the different auditory areas (Liégeois-Chauvel, Musolino, & Chauvel, 1991; Liégeois-Chauvel, Musolino, Badier, Marquis, & Chauvel, 1994); they were recorded before the musical procedure in this study. 420 tone bursts of four tonal frequencies (500 Hz, 1 kHz, 2 kHz, 3 kHz) were equally and randomly delivered at a rate of one every 800 msec to 1.2 sec. Each tone burst had a 0.3 msec rise and decay time and a total duration of 30 msec.

2.2.2. Musical excerpts
Forty musical excerpts of 20 sec duration were selected by music theorists and psychologists in order to elicit a homogeneous emotional experience. The excerpts (some examples are available online: http://leadserv.u-bourgogne.fr/wave/) pertain to the four main periods of Western classical music (i.e., baroque, classical, romantic and modern) and involve varied instrumental groups (solo, chamber music, orchestra). Most of them were shown to convey a clear emotion belonging to one of the four main categories: happy, sad, angry or peaceful (Bigand et al., 2005). Furthermore, their emotional qualities have been confirmed by behavioral measures in numerous studies, such as Khalfa et al. (2008a, 2008b) or Filipic, Tillmann, and Bigand (2010).

2.3. Experimental procedure

During the recording session, the patient, comfortably seated in an armchair, was instructed to listen and to concentrate on her own feelings without reporting her emotional response. A follow-up behavioral study was conducted later in order to assess the patient's own classification of each excerpt among the four given emotions and her feelings (liking or disliking) for each excerpt on a scale from 1 to 10. We chose this strategy because explicit instructions to monitor and report feelings could interfere with the emotional reactions one is attempting to assess.

The stimuli were presented in pseudo-randomized order using E-Prime 1.1 (Psychology Software Tools Inc., Pittsburgh, PA, USA) through two free-field Yamaha NS10M loudspeakers. The melodies were delivered for 20 sec with an inter-stimulus interval of 12 sec. Data acquisition started 3 min before the presentation of the first melody in order to have an EEG resting-reference period.

2.4. Electrophysiological recordings

The recordings of intracerebral EEG were monopolar, with each contact of a given depth electrode referenced to an extradural lead, using acquisition software and a 256-channel BrainAmps EEG amplification system (Brain Products GmbH, Munich, Germany). During the acquisition, the EEG signal was amplified with a digital band-pass filter (.5–200 Hz) and digitized at a rate of 1 kHz per channel (resolution: 1 msec per sample).

2.5. Data analysis

2.5.1. Time-frequency analysis
In order to analyze the reactivity of cortical structures to pure tones and to music in relation to emotion, we performed time-frequency analysis on data in a bipolar montage using Matlab software (MathWorks, Natick, MA). A wavelet transform (Gabor, oscillation parameter x = 7) was applied to the whole temporal window (30 sec), from 10 sec before the onset of the melody through the 20 sec duration of the excerpt, for each trial. We then averaged the time-frequency energy maps across trials. We performed statistical quantification of the changes between stimulus and baseline periods, or between stimulus periods across emotions, using a non-parametric Mann–Whitney–Wilcoxon test with the significance threshold set to .05. In the case of pure tones, we used the same pool of data (20 trials of 10 sec of silence) for the baseline (resting state).

To compare the four emotional categories, we used two models: 'power ~ cortical area * emotion * frequency band' and, for each frequency band, 'power ~ cortical area * emotion' (cortical areas recorded from 14 contacts). The power data were normalized by a logarithmic and arc-tangent transformation. Each group of post hoc contrasts was corrected by the Holm-Bonferroni method. We kept only the most significant differences to account for the overall correction of these multiple comparisons.
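For illustration, here is a minimal Python sketch of a Gabor/Morlet-style wavelet time-frequency analysis of the kind described above (the authors used Matlab; the function names, the mapping of the 7-oscillation parameter and the placeholder data are assumptions, not the published pipeline):

```python
import numpy as np

def gabor_wavelet(freq, fs, n_osc=7.0):
    """Complex Gabor/Morlet wavelet with ~n_osc oscillations under its Gaussian envelope."""
    sigma_t = n_osc / (2.0 * np.pi * freq)              # temporal width scales with 1/freq
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
    wavelet = np.exp(-t**2 / (2 * sigma_t**2)) * np.exp(2j * np.pi * freq * t)
    return wavelet / np.sqrt(np.sum(np.abs(wavelet)**2))  # unit-energy normalization

def tf_energy(trials, freqs, fs):
    """Trial-averaged time-frequency energy map.
    trials: array (n_trials, n_samples) of bipolar SEEG; freqs: frequencies in Hz."""
    n_trials, n_samples = trials.shape
    tf = np.zeros((len(freqs), n_samples))
    for i, f in enumerate(freqs):
        w = gabor_wavelet(f, fs)
        for trial in trials:
            analytic = np.convolve(trial, w, mode="same")
            tf[i] += np.abs(analytic)**2                 # energy at this frequency
    return tf / n_trials

# Example: 20 trials of a 30 sec window sampled at 1 kHz, 1-100 Hz
fs = 1000.0
trials = np.random.randn(20, int(30 * fs))               # placeholder for real SEEG data
tfmap = tf_energy(trials, np.arange(1, 101), fs)         # shape (100, 30000)
```

Per-pixel contrasts between stimulus and baseline maps would then be assessed with a rank test, as in the paragraph above.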

Fig. 1 – Anatomical reconstruction of the intracerebral electrodes localized in the right amygdala (A) and AC (H and T) and homologous regions in the left hemisphere (A′, H′ and T′). The first three leads of A and A′ recorded activity from the AMY; H and H′ recorded activity from the primary AC (PAC) and the secondary AC (SAC) according to the following leads: H1, 2, 3 and H′5, 6, 7 recorded activity from the right and left PAC, respectively; leads H4 to H9 and H′8 to H′13 were in the right and left SAC; and T and T′ recorded activity from the associative auditory areas (BA 22): T1 to T4 and T′10 to T′14.

2.5.2. Estimation of nonlinear correlations between SEEG signals
The functional couplings between SEEG signals recorded from distinct cortical structures (auditory areas and AMY) were estimated using nonlinear regression analysis (Lopes da Silva, Pijn, & Boeijinga, 1989; Wendling, Bartolomei, Bellanger, & Chauvel, 2001; Wendling, Bartolomei, Bellanger, Bourien, & Chauvel, 2003).

Nonlinear regression analysis is aimed at quantifying the relationship (or association) between two SEEG signals X(t) and Y(t) recorded from two neuronal populations Px and Py, without making any assumptions regarding the nature (linear or nonlinear) of this relationship (for details, see Wendling et al., 2001). For the sake of simplicity, the nonlinear correlation coefficient h2xy will simply be denoted h2. Here, we consider the maximum value between h2(X→Y) and h2(Y→X).

In order to limit the influence of the common reference, bipolar signals (obtained from the subtraction of monopolar signals recorded on two adjacent contacts in the structure) were preferred to monopolar signals. h2 values were computed over a 5 sec sliding window (1 sec step) for all possible pairs in both hemispheres (200 sec of recordings in each auditory condition). We computed a diagram of connectivity differences between the mean h2 coefficients from both conditions for all combinations of cortical structures. Both h2(X→Y) and h2(Y→X) have associated time delays τ(X→Y) and τ(Y→X) between the two signals of interest. Combining the information from the h2 and τ measures, the directionality index D_xy is expressed as

$$D_{xy} = \frac{1}{2}\left[\operatorname{sgn}\!\left(h^2(X \to Y) - h^2(Y \to X)\right) + \operatorname{sgn}\!\left(\tau(X \to Y) - \tau(Y \to X)\right)\right]$$

D_xy lies between −1 and 1, with positive D_xy indicating that X(t) influences Y(t).
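For concreteness, a minimal sketch of how h2 and D_xy can be computed, assuming the standard bin-based piecewise-linear implementation of nonlinear regression (Lopes da Silva et al., 1989; Wendling et al., 2001); the bin count, delay range and function names are illustrative choices, not the authors' code:

```python
import numpy as np

def h2_coefficient(x, y, n_bins=10):
    """h2 of y on x: variance of y explained by a piecewise-linear
    regression of y on amplitude bins of x."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    bins = np.array_split(np.arange(len(xs)), n_bins)
    bx = np.array([xs[b].mean() for b in bins])      # bin centers (increasing)
    by = np.array([ys[b].mean() for b in bins])      # conditional means of y
    y_pred = np.interp(x, bx, by)                    # piecewise-linear predictor
    return max(0.0, 1.0 - (y - y_pred).var() / y.var())

def directed_h2(x, y, max_delay=50):
    """Scan delays (in samples) and keep the one maximizing h2(X -> Y)."""
    best_h2, best_tau = -1.0, 0
    for tau in range(max_delay + 1):
        n = len(x) - tau
        h2 = h2_coefficient(x[:n], y[tau:tau + n])
        if h2 > best_h2:
            best_h2, best_tau = h2, tau
    return best_h2, best_tau

def directionality_index(x, y, max_delay=50):
    """D_xy = 1/2 [sgn(h2(X->Y) - h2(Y->X)) + sgn(tau(X->Y) - tau(Y->X))]."""
    h2_xy, tau_xy = directed_h2(x, y, max_delay)
    h2_yx, tau_yx = directed_h2(y, x, max_delay)
    return 0.5 * (np.sign(h2_xy - h2_yx) + np.sign(tau_xy - tau_yx))
```

In practice these functions would be applied to each 5 sec window of each bipolar signal pair, yielding the windowed h2 and D_xy series analyzed below.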

To compare directionality across conditions, we determined the absolute value of the difference in directionality between conditions. Significant differences in directionality were plotted as arrows on the diagram of h2 connectivity. The strength of the difference was shown as a percentage of the distance between nodes in the diagram (e.g., for D_xy = .1, D_xy is presented as 10% of the distance between X and Y, starting from X). In the case of a directionality reversal between conditions, a pair of arrows was plotted starting from the middle of the distance between nodes, representing the direction and strength in each condition.

To determine the statistical threshold for both quantities (the difference in h2 and D_xy) between the two conditions, we used a permutation test (1000 permutations) based on the non-parametric Wilcoxon rank sum test, with a Monte Carlo p-value of p = .01, two-sided, corrected for multiple comparisons using the maximum cluster statistics described by Maris and Oostenveld (2007).
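For illustration, a minimal version of such a permutation scheme, here assuming the simpler maximum-statistic variant of the Maris and Oostenveld (2007) correction applied to windowed h2 values (array layout and names are assumptions, and the published analysis used cluster statistics rather than this plain maximum):

```python
import numpy as np
from scipy.stats import ranksums

def max_stat_permutation(h2_a, h2_b, n_perm=1000, alpha=0.01, seed=0):
    """Two-sided permutation test, corrected over connections via the max statistic.
    h2_a, h2_b: arrays (n_windows, n_connections) of h2 values per condition."""
    rng = np.random.default_rng(seed)
    data = np.vstack([h2_a, h2_b])
    labels = np.array([0] * len(h2_a) + [1] * len(h2_b))
    n_conn = data.shape[1]

    def stats(lbl):
        # Wilcoxon rank-sum statistic for each structure pair (connection)
        return np.array([ranksums(data[lbl == 0, c], data[lbl == 1, c]).statistic
                         for c in range(n_conn)])

    observed = stats(labels)
    max_null = np.empty(n_perm)
    for p in range(n_perm):
        # shuffle condition labels and keep the largest |statistic| over connections
        max_null[p] = np.abs(stats(rng.permutation(labels))).max()
    threshold = np.quantile(max_null, 1 - alpha)     # corrected two-sided threshold
    return observed, np.abs(observed) > threshold
```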
3. Results

CV is a non-musician, but she used to listen to music, preferring popular music to classical. She was not familiar with the excerpts. She classified 37 out of 40 excerpts into the same four categories as the healthy subjects in the study by Bigand et al. (2005). She liked the excerpts, with mean ratings of 8 for happy, 7.3 for sad, and 6.5 for peaceful excerpts. The "angry" excerpts were unpleasant for the patient, as she gave them a mean rating of only 3.2. She classified these latter excerpts in the right category, although she did not recognize them as angry; rather, she described them as aggressive and said that most of them evoked war for her.

3.1. Brain activation patterns

Cortical activation patterns can be observed on averaged time-frequency energy maps computed for musical excerpts as a function of emotion, and for pure tones. A significant difference in frequency band activity from the different areas of AC in both hemispheres is observed between pure tone and music perception (Figs. 2 and 3).

The sequence of pure tones elicited a rhythmic theta activity associated with an increase in low gamma band activity (30–60 Hz), reflecting sharp synchronous evoked activity rather than oscillatory processes (Fig. 2). This theta activation was more reliably recorded from the right PAC (p < .0001) than from the left PAC (p < .01) and was not observed in the SAC, BA 22 or AMY.

Listening to music entailed a significant enhancement in the gamma band (40–100 Hz) compared to the silent periods (p < .001) in PAC, SAC and BA 22, with a greater increase in gamma (60–80 Hz) on the left side compared to the right side, regardless of auditory area or emotion. A marked decrease of energy compared to the baseline was observed in the frequency bands below 20 Hz. In BA 22, the oscillatory pattern is less visible, but the gamma activity was higher in the left than the right hemisphere (Fig. 3).

We conducted a 14 (contacts) × 4 (emotion categories) × 4 (frequency bands) ANOVA with power (energy per frequency) as the dependent variable. We found that power differed significantly according to emotion [F(3,4123) = 17.5; MSE = 2.19, p < .0001], cortical area [F(13,4123) = 20.98; MSE = 2.62, p < .0001] and frequency band [F(3,4123) = 88.04; MSE = 11.04, p < .0001], with significant interactions between emotion × cortical area [F(39,4123) = 1.56, p < .01], emotion × frequency band [F(9,4123) = 4.6, p < .0001] and cortical lead × frequency band [F(30,4123) = 15.17, p < .0001]. Fig. 4 (top) illustrates the interaction between cortical areas and emotion. Angry excerpts stand out from all other emotions in the left and right AMY (p < .01 and p < .001, respectively) and in the right SAC and BA 22 (p < .05). In addition, arousal (the contrast between angry-happy vs sad-peaceful) significantly modulated activity in the right AMY (p < .0001) and in the right SAC and BA 22 (p < .05).

The contrasts between all the cortical areas and the frequency bands show that the main distinction between the emotions is found in the beta and gamma bands. An increase in gamma band power and a decrease in beta band power occurred more prominently in the left hemisphere than in the right, regardless of cortical area (PAC and SAC, p < .001, and p < .05 for BA 22). Post hoc analyses showed that a significant difference between the emotions was observed in the right AC. No significant difference in the gamma band was found in the left AC regardless of the emotion conveyed by the musical excerpt (Fig. 4, bottom). Note that the left AMY was removed from this latter analysis because of the recurrence of epileptic spikes, observed in the musical period as well as in the silent period, which interfered with the gamma band analysis.

A psychoacoustical model was used to test whether some relevant features of the auditory signal may have been related to the SEEG signal. The basic psychoacoustic features of music chosen for analysis were the dBA level (loudness), zero-crossing rate (ZCR), spectral centroid, roughness, pitch periodicity flux (PP flux) and spectral roll-off. These features were computed from the output of an auditory model by Leman (2000). Although these features captured some of the behavioral measures of arousal and valence in music (Lalitte & Bigand, 2006), no significant contribution of these features to the SEEG signal was found (p > .1). In addition, in order to ascertain that the modulation was due to the emotion per se and not to the different acoustical properties of each category of musical emotion, we estimated the general rhythmic pattern of both the musical pieces and the SEEG signals, calculating the periodicities of these signals by means of the autocorrelation function (see Fig. 2 and text in Supplemental Material).

A 4 (emotion categories) × 4 (areas) × 2 (hemispheres) RM-ANOVA, with pieces as the random variable, was conducted on these correlation values. This analysis revealed a strong main effect of hemisphere [F(1,36) = 24.63; MSE = .1417; p < .0005], indicating that the SEEG signals of the right hemisphere were more related to the rhythmic pattern of the musical signal than those of the left. The areas also had an effect on the correlation values [F(3,108) = 7.2146; MSE = .0182; p < .0005]. More importantly, there was a main effect of the emotional category of the pieces [F(3,36) = 5.28; MSE = .0750; p < .005], with sad pieces being associated with a very low correlation value (p > .10) compared to the other three emotions.
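For illustration, a minimal Python sketch of some of the simpler descriptors and of the autocorrelation-based periodicity estimate mentioned above (the published analysis used the auditory model of Leman (2000); roughness and pitch-periodicity flux are omitted here, and all names are assumptions):

```python
import numpy as np

def zero_crossing_rate(x):
    """Fraction of consecutive samples showing a sign change."""
    return np.mean(np.abs(np.diff(np.signbit(x).astype(int))))

def spectral_centroid_rolloff(x, fs, roll=0.85):
    """Spectral centroid (Hz) and the roll-off frequency below which
    `roll` of the spectral energy is contained."""
    spec = np.abs(np.fft.rfft(x))**2
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    centroid = np.sum(freqs * spec) / np.sum(spec)
    cumulative = np.cumsum(spec) / np.sum(spec)
    rolloff = freqs[np.searchsorted(cumulative, roll)]
    return centroid, rolloff

def dominant_period(x, fs, min_lag_s=0.2):
    """Dominant periodicity (sec) from the peak of the autocorrelation
    function, skipping the trivial zero-lag region."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0]                                    # normalize so acf(0) = 1
    min_lag = int(min_lag_s * fs)
    return (min_lag + np.argmax(acf[min_lag:])) / fs
```

The same periodicity estimate can be applied to both the audio waveform and the SEEG envelope, and the two sets of periods correlated, in the spirit of the rhythmic-pattern analysis above.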

Fig. 2 – Mean time-frequency maps (10 trials, 0–10 sec) during pure tone sequences. The pure tone bursts lasted 50 msec (5 msec rise/fall time) and were delivered monaurally at four frequencies (.5, 1, 3 and 4 kHz) with an intensity set at 65 dB HL, at a rate of one per second. The color code represents energy from −5 μV (blue) to +5 μV (red). The maps have been normalized by periods of silence (10 trials, 0–10 sec) during the resting state. The evoked activity (increase in theta-alpha and gamma power) is confined to PAC and SAC, whereas BA 22 and AMY do not show a specific response to pure tones. Color scale: ±5 μV.

Fig. 3 – Mean time-frequency maps (10 trials, −12 to 20 sec) during music listening (H: happy, S: sad, P: peaceful and A: angry excerpts). The evoked and induced activities spread over PAC, SAC, BA 22 and AMY. In PAC and SAC, an increase in gamma power and a strong decrease in beta power are observed. Color scale: ±5 μV.

Fig. 4 – Interactions between emotions and regions. Top: a two-way ANOVA (energy = region * emotion) shows significant differences between emotions in the following regions: L. AMY [F(3,4315) = 4.21, p < .05], L. BA 22 [F(3,4315) = 3.73, p < .05], R. AMY [F(3,4315) = 6.21, p < .01], R. SAC [F(3,4315) = 4.13, p < .05] and R. BA 22 [F(3,4315) = 6.32, p < .01]. Bottom: the same ANOVA model on the gamma frequency band data shows significant differences between emotions in L. PAC [F(3,986) = 5.12, p < .01], R. PAC [F(3,986) = 6.16, p < .01] and R. SAC [F(3,986) = 8.44, p < .001]. In purple, Tukey post hoc tests after a one-way ANOVA (energy = region) show significant differences between sides for L. PAC versus R. PAC (p < .001), L. SAC versus R. SAC (p < .001) and L. BA 22 versus R. BA 22 (p < .001).

3.2. Functional coupling between auditory areas and AMY as a function of emotion

To probe interactions between AC and AMY in auditory processing, we measured the functional coupling between contacts of each structure, computing the non-linear correlation coefficient (h2) on intracerebral EEG signals. Fig. 5 shows a comparison of the nonlinear correlations in periods with pure tones or music versus the resting state on the one hand, and according to each emotion conveyed by the excerpts on the other hand (Mann–Whitney–Wilcoxon rank sum test thresholded at p < .01, two-sided).

Passive listening to pure tone sequences, as opposed to attentive listening to music, revealed a striking difference in the pattern of non-linear correlations between the auditory areas and AMY. Pure tones elicited a significant widespread increase of intra- and inter-hemispheric correlations between PAC and SAC, as well as between the left AMY and all left auditory areas and the right SAC. Interestingly, the right AMY was only correlated with the left PAC. A decrease of correlation could also be observed between BA 22 and PAC and SAC. Significant directionality, compared to the silent period, was observed from the right and left PAC towards SAC and BA 22, and from the right to the left PAC.

Music elicited a strongly right-lateralized network involving increased correlation between the right AMY and PAC. Interestingly, the right hemisphere auditory network (PAC–SAC and BA 22) was interconnected, whereas the left hemisphere auditory network showed a lower correlation compared to the silent period. There was also no longer any involvement of the left AMY. Analysis of the correlation according to the emotion revealed a modulation of the functional connectivity between the AC and AMY. An increased correlation between the right AMY and auditory areas for happy, peaceful and sad

Fig. 5 – Diagrams of h2 connectivity (results of the statistical comparison between conditions) and directionality (p < .005). All connectivity values are given in Table 1 (Supplemental Material). The black arrows indicate the directionality of the connectivity between structures, expressed in msec. A – Connectivity patterns during music listening (left) and pure tone hearing (right). Music listening shows a structured network involving the right AMY, which is strongly connected to the PAC. During pure tone hearing, a larger network takes place, in which the left AMY displays strong connectivity with the left hemisphere auditory network. The right BA 22 shows a strong disengagement in connectivity compared with resting silence. B – Connectivity patterns during music listening, classified according to emotion (see comments in the text).

excerpts was observed. In contrast, angry excerpts led to a widely distributed decrease of correlation over auditory areas on both sides and the left AMY. We failed to find any significant difference in directionality during music listening compared to a period of silence.

In a second step, we assessed whether emotion in music emerged from a functional coupling between the AC (3 areas) and AMY: a 4 (emotion pieces) × 4 (time periods) × 4 (areas) × 2 (hemispheres) ANOVA was performed, with the piece as the random variable and h2 as the dependent measure.
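As a sketch of this kind of factorial model, here fitted by ordinary least squares in Python rather than the repeated-measures ANOVA actually reported (the file and column names, and the factor levels in the comments, are assumptions):

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: one row per (piece, time period, area, hemisphere) with its mean h2 value.
# Assumed columns: emotion in {happy, sad, peaceful, angry},
# period in {pre, music1, music2, post}, area = auditory area coupled with AMY,
# hemi in {L, R}, h2 = nonlinear correlation coefficient.
df = pd.read_csv("h2_values.csv")

model = smf.ols("h2 ~ C(emotion) * C(period) * C(area) * C(hemi)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # factorial ANOVA table
print(anova_table)
```

The same formula interface covers the earlier 'power ~ cortical area * emotion * frequency band' models from the time-frequency analysis.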

Results showed that h2 was higher in the right than the left hemisphere [F(1,36) = 97.18, MSE = .0073363, p < .001] and increased linearly in the right hemisphere when going from the PAC–AMY coupling to the BA22–AMY coupling, as seen in a 2 (hemispheres) × 3 (areas) interaction [F(2,72) = 6.89, MSE = .0014449, p < .002]. The effect of time period was expressed in a marginally significant two-way interaction between the time periods and the emotions [F(9,108) = 1.92, MSE = .0053992, p < .06]. As illustrated in Fig. 6, the coupling between cortical areas and the AMY increased in musical excerpts classified as happy and, to a lesser extent, in excerpts classified as peaceful or sad, in comparison to the preceding and following periods of silence. The coupling considerably decreased for music classified as angry. The contrast was strongest between angry and happy music, with a highly significant difference in h2 values over the two musical periods for these emotions [F(1,36) = 12.03, MSE = .076, p < .005]. This demonstrates that the correlation/decorrelation of the EEG signal recorded from auditory areas and AMY marked the response to emotional music, with strong decorrelation in the right hemisphere for angry music.

Fig. 6 – Functional coupling between AMY and auditory areas as indexed by h2, as a function of the time periods (pre-stimulation, two musical segments, and post-stimulation) and the emotional content of the music.

4. Discussion

In this study, we attempted to characterize the neural dynamics of emotion conveyed by musical stimuli. We analyzed continuous brain activity patterns recorded from the AC and AMY during music listening compared to a sequence of pure tones. We used orchestral excerpts in an attentive listening task that maintained the ecological validity of the listening experience. The main results were: (i) an enhanced power of gamma band oscillations in the AC evoked by music and not by pure tones, which was associated with the increased spectral complexity of the sounds; (ii) the presence of an asymmetry of cortical tuning between the left and right AC, the right being especially sensitive to emotion modulated by the variations of the musical pieces; and (iii) a functional coupling between the right AC and AMY, which was modulated by the correlation/decorrelation of the EEG signal and the self-evaluated emotional valence.

4.1. Asymmetric emotional musical processing

The observation that gamma band activity is enhanced for musical sequences compared to pure tone sequences is consistent with evidence implicating gamma oscillations in the processing of spectral complexity (Shahin, Roberts, Chau, Trainor, & Miller, 2008; Shahin, Trainor, Roberts, Backer, & Miller, 2010). An asymmetry in auditory cortical activity in EEG rhythms was found during music listening, with the left AC (preferentially PAC and SAC) showing stronger high frequency fluctuations (high gamma, 60–80 Hz) than the right one, but being insensitive to emotions. In contrast, the right AC was sensitive to emotions. This difference observed between the left and right auditory cortices could be related to the intrinsic cortical oscillation frequencies that are distinctive features of each sensory cortex; this tuning difference optimizes the sensitivity of the left hemisphere for speech perception (high acoustic modulation) and of the right hemisphere for musical rhythms, musical instrument periodicity or speech prosody (for review see Giraud & Poeppel, 2012). In line with that reasoning, our findings showed that the EEG modulation recorded from the right AC was more related than that of the left AC to the rhythmic pattern of the musical signal in relation to emotional processing. In contrast, we did not observe any fluctuation in any EEG band in relation to the emotions from the left AC, regardless of auditory area. It should be noted that this hemispheric functional asymmetry for music perception is present at birth: Perani et al. (2010) showed that the fMRI BOLD signal in the right AC decreased with altered music compared to original music, whilst no such difference in BOLD response was observed in the left AC.

Beyond the right AC dominance, additional support for the contribution of the right temporal lobe to music processing comes from studies of patients with brain damage to the right temporal lobe who have specific deficits in melody processing (Liégeois-Chauvel, Peretz, Babai, Laguitton, & Chauvel, 1998; for a review, see Stewart, von Kriegstein, Warren, & Griffiths, 2006). Our analysis of frequency band dynamics across the orchestral excerpts demonstrated a bilateral AMY reactivity, as previously observed in functional imaging studies (Blood & Zatorre, 2001; Koelsch, Fritz, von Cramon, Müller, & Friederici, 2006; Mitterschiffthaler et al., 2007).

4.2. Functional coupling between auditory areas and the AMY

Passive listening to a sequence of tone bursts elicited a correlation between intra- and inter-hemispheric auditory areas with a directionality from the primary area to secondary and associative areas, as observed in previous studies demonstrating downstream propagation of elementary acoustic stimuli (Gueguin, Le Bouquin-Jeannès, Faucon, & Liégeois-Chauvel, 2007; Kumar, Stephan, Warren, Friston, & Griffiths, 2007). The functional coupling between AC and AMY evidences the anatomical connections between the two structures (LeDoux, 2007), but it takes place differently according to the nature of the auditory stimuli. A specific

correlation between the left AMY and the left AC was observed during tone listening, which was no longer present during music listening. This coupling could reflect pulse processing elicited by the perception of roughly equally spaced and salient tones (for review, see Grahn, 2012), driven by the oscillatory properties of the left AC (for review, see Giraud & Poeppel, 2012).

Interestingly, music listening drastically reduced the functional correlation between all these areas, suggesting that correlation/decorrelation between the AC and AMY builds up a functional connection during the listening period, the modulation of which is the neural signature of emotional feeling. The role of such functional coupling has been studied in the production of semiology in epileptic seizures, where it has been demonstrated that the correlation of intracerebral EEG between certain cortical areas in certain frequency bands is an electrophysiological counterpart of the emergence of some clinical signs (Chauvel & McGonigal, 2014). For instance, non-linear regression analysis (h2, as used in this study) of depth-EEG signals showed the existence of transient functional coupling between AMY and hippocampus, and between hippocampus and rhinal cortex, in "déjà vu" (Bartolomei et al., 2012). In addition, the mechanisms underlying visual percepts result from the phase-locked synchrony of neurons spatially distributed in the visual system in the beta/gamma frequency range (Buzsáki & Draguhn, 2004; Singer, 2004). In line with these empirical data, it is likely that the induced emotion results from the functional coupling between the AC and the AMY.

This can be emphasized by the absence of significant directionality between structures (calculated from the h2), such that a complex and non-stationary top-down dialog takes place between structures while listening to music. Strikingly, the emotional feeling is likely related to the coupling found between the right AC and AMY. The changes observed in iEEG signals in response to highly arousing, negative valence pieces (usually categorized as angry) drastically differed from the changes associated with all the other pieces. One explanation could be that the patient did not listen as attentively to the angry excerpts as to the other musical pieces; but passive listening to pure tones nevertheless elicited a strong correlation between AMY and AC. The distinction between evoked feelings and conveyed emotions has been a matter of long debate. The differences between felt and perceived emotion are difficult to disentangle, but most often the negative emotions, although not reported to be felt in response to music, are reported to be perceived as expressive properties of the music (Zentner et al., 2008). In this study, however, angry excerpts (even though the label of angry is not the most appropriate) induced an emotive state different from the others, since the decorrelation found between AC and AMY might depend on the displeasure experienced by the patient listening to music she did not like. The response could therefore be related to emotions conveyed by the music as well as to evoked feelings. There is a large body of evidence illustrating how sound can cause functional modifications in the limbic system which, in turn, could modulate auditory plasticity (Kumar, von Kriegstein, Friston, & Griffiths, 2012; for review, see Kraus & Canlon, 2012). Connectivity analysis using dynamic causal modeling revealed that sound is first processed in the AC and that the AMY secondarily modulates the AC as a function of valence.

4.3. Limitations of the study

We must acknowledge some limitations to our study. It is a single case study and, as such, these results should be interpreted with caution. Furthermore, it is largely admitted that a single piece of music would not induce the same emotion in all subjects; the subjective experience, i.e., liking or disliking a specific piece of music, depends on the individual's biography, musical exposure and preferences (Brattico & Jacobsen, 2009; McDermott, Lehr, & Oxenham, 2010; Nieminen, Istók, Brattico, Tervaniemi, & Huotilainen, 2011; Shahin et al., 2010). In addition, recent studies analyzing individual brain-frequency responses either to musical excerpts chosen by the experimenter (Lin, Duann, Chen, & Jung, 2010) or to self-selected music (Höller et al., 2012) showed inter-individual differences in emotion-related spectral changes while listening to music.

In conclusion, this case study points out: (i) that the auditory–limbic interactions result from the formation of oscillatory networks binding the activities of the network nodes, promoting the emergence of a feeling, and (ii) that music is related to both sensory and cognitive processes preferentially relayed in the right hemisphere.

Acknowledgments

We thank Pr J. Regis (Neurosurgery Department, Marseille) for stereotactic placement of the electrodes, D. Schön and V. Jirsa for fruitful discussions and comments on the manuscript, and R. McCully for editing the English. We thank Patrick Marquis for dedicated help in data acquisition. We wish to thank the patient C.V. for her active cooperation. The work was supported by a grant (ACI Neurosciences intégratives et computationnelles) from the French Government.

Supplementary data

Supplementary data related to this article can be found at http://dx.doi.org/10.1016/j.cortex.2014.06.002.

References

Amaral, D. G., & Price, J. L. (1985). Amygdalo-cortical projections in the monkey (Macaca fascicularis). The Journal of Comparative Neurology, 230, 465–496.
Bartolomei, F., Barbeau, E. J., Nguyen, T., McGonigal, A., Regis, J., Chauvel, P., et al. (2012). Rhinal-hippocampal interactions during déjà vu. Clinical Neurophysiology, 123, 489–495.
Baumgartner, T., Esslen, M., & Jäncke, L. (2006). From emotion perception to emotion experience: emotions evoked by pictures and classical music. International Journal of Psychophysiology, 60(1), 34–43.
Behne, N., Scheich, H., & Brechmann, A. (2005). Contralateral white noise selectively changes right human auditory cortex activity caused by a FM-direction task. Journal of Neurophysiology, 93(1), 414–423.

Bigand, E., Filipic, S., & Lalitte, P. (2005). The time course of emotional responses to music. Annals of the New York Academy of Sciences, 1060, 429–437.
Blood, A. J., Zatorre, R. J., Bermudez, P., & Evans, A. C. (1999). Emotional responses to pleasant and unpleasant music correlate with activity in paralimbic brain regions. Nature Neuroscience, 2, 382–387.
Blood, A., & Zatorre, R. J. (2001). Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion. Proceedings of the National Academy of Sciences of the United States of America, 98, 11818–11823.
Brattico, E., Tervaniemi, M., Näätänen, R., & Peretz, I. (2006). Musical scale properties are automatically processed in the human auditory cortex. Brain Research, 1117, 162–174.
Brattico, E., & Jacobsen, T. (2009). Subjective appraisal of music: neuroimaging evidence. Annals of the New York Academy of Sciences, 1169, 308–317.
Brechmann, A., & Scheich, H. (2005). Hemispheric shifts of sound representation in auditory cortex with conceptual listening. Cerebral Cortex, 15(5), 578–587.
Buzsáki, G., & Draguhn, A. (2004). Neuronal oscillations in cortical networks. Science, 304, 1926–1929.
Chauvel, P., & McGonigal, A. (2014). Emergence of semiology in epileptic seizures. Epilepsy and Behavior. In press [Epub ahead of print].
Dalla Bella, S., Peretz, I., Rousseau, L., Gosselin, N., Ayotte, J., & Lavoie, A. (2001). Development of the happy-sad distinction in music appreciation. Does tempo emerge earlier than mode? Annals of the New York Academy of Sciences, 930, 436–438.
Eldar, E., Ganor, O., Admon, R., Bleich, A., & Hendler, T. (2007). Feeling the real world: limbic response to music depends on related content. Cerebral Cortex, 17, 2828–2840.
Filipic, S., Tillmann, B., & Bigand, E. (2010). Judging familiarity and emotion from very brief musical excerpts. Psychonomic Bulletin & Review, 17(3), 335–341.
Fritz, T., Jentschke, S., Gosselin, N., Sammler, D., Peretz, I., Turner, R., et al. (2009). Universal recognition of three basic emotions in music. Current Biology, 19, 1–4.
Giraud, A. L., & Poeppel, D. (2012). Cortical oscillations and speech processing: emerging computational principles and operations. Nature Neuroscience, 15(4), 511–517.
Gosselin, N., Samson, S., Adolphs, R., Noulhiane, M., Roy, M., Hasboun, D., et al. (2006). Emotional responses to unpleasant music correlate with damage to the parahippocampal cortex. Brain, 129, 2585–2592.
Grahn, J. A. (2012). Neural mechanisms of rhythm perception: current findings and future perspectives. Topics in Cognitive Science, 4, 585–606.
Grandjean, D., Sander, D., Pourtois, G., Schwartz, S., Seghier, M. L., Scherer, K. R., et al. (2005). The voices of wrath: brain responses to angry prosody in meaningless speech. Nature Neuroscience, 8(2), 145–146.
Gueguin, M., Le Bouquin-Jeannès, R., Faucon, G., & Liégeois-Chauvel, C. (2007). Evidence of functional connectivity between auditory cortical areas revealed by amplitude modulation sound processing. Cerebral Cortex, 17(2), 304–313.
Hoistad, M., & Barbas, H. (2008). Sequence of information processing for emotions through pathways linking temporal and insular cortices with the amygdala. NeuroImage, 40, 1016–1033.
Höller, Y., Thomschewski, A., Schmid, E. V., Höller, P., Crone, J. S., & Trinka, E. (2012). Individual brain-frequency responses to self-selected music. International Journal of Psychophysiology, 86(3), 206–213.
Johnsrude, I. S., Penhune, V. B., & Zatorre, R. J. (2000). Functional specificity in right human auditory cortex for perceiving pitch direction. Brain, 123, 155–163.
Juslin, P. N., & Västfjäll, D. (2008). Emotional responses to music: the need to consider underlying mechanisms. The Behavioral and Brain Sciences, 31, 559–621.
Juslin, P. N. (2013). From everyday emotions to aesthetic emotions: towards a unified theory of musical emotions. Physics of Life Reviews, 10(3), 235–266.
Khalfa, S., Schön, D., Anton, J. L., & Liégeois-Chauvel, C. (2005). Brain regions involved in the recognition of happiness and sadness in music. NeuroReport, 16(18), 1981–1984.
Khalfa, S., Guye, M., Peretz, I., Chapon, F., Girard, N., Chauvel, P., et al. (2008a). Evidence of lateralized anteromedial temporal structures involvement in musical emotion processing. Neuropsychologia, 46(10), 2485–2493.
Khalfa, S., Delbé, C., Bigand, E., Reynaud, E., Chauvel, P., & Liégeois-Chauvel, C. (2008b). Positive and negative music recognition reveals a specialization of mesio-temporal structures in epileptic patients. Music Perception, 25(4), 295–302.
Koelsch, S., Fritz, T., von Cramon, D. Y., Müller, K., & Friederici, A. D. (2006). Investigating emotion with music: an fMRI study. Human Brain Mapping, 27, 239–250.
Kraus, K. S., & Canlon, B. (2012). Neuronal connectivity and interactions between the auditory and limbic systems: effects of noise and tinnitus. Hearing Research, 288, 34–46.
Krumhansl, C. L. (1997). An exploratory study of musical emotions and psychophysiology. Canadian Journal of Experimental Psychology, 51, 336–353.
Kumar, S., Stephan, K. E., Warren, J. D., Friston, K. J., & Griffiths, T. D. (2007). Hierarchical processing of auditory objects in humans. PLoS Computational Biology, 3(6), e100.
Kumar, S., von Kriegstein, K., Friston, K., & Griffiths, T. D. (2012). Features versus feelings: dissociable representations of the acoustic features and valence of aversive sounds. The Journal of Neuroscience, 32(41), 14184–14192.
Labbé, C., & Grandjean, D. (2014). Musical emotions predicted by feelings of entrainment. Music Perception. In press.
Lalitte, P., & Bigand, E. (2006). Music in the moment? Revisiting the effect of large scale structures. Perceptual and Motor Skills, 103, 811–828.
LeDoux, J. (2007). The amygdala. Current Biology, 17, 868–874.
Leman, M. (2000). An auditory model of the role of short-term memory in probe-tone ratings. Music Perception, 17(4), 481–509.
Levitin, D. J., & Menon, V. (2003). Musical structure is processed in "language" areas of the brain: a possible role for Brodmann area 47 in temporal coherence. NeuroImage, 20, 2142–2152.
Liégeois-Chauvel, C., Musolino, A., & Chauvel, P. (1991). Localization of the primary auditory area in man. Brain, 114, 139–151.
Liégeois-Chauvel, C., Musolino, A., Badier, J. M., Marquis, P., & Chauvel, P. (1994). Evoked potentials recorded from the auditory cortex in man: evaluation and topography of the middle latency components. Electroencephalography and Clinical Neurophysiology, 92(3), 204–214.
Liégeois-Chauvel, C., Peretz, I., Babai, M., Laguitton, V., & Chauvel, P. (1998). Contribution of the different temporal cortical structures to music processing. Brain, 121, 1853–1867.
Liégeois-Chauvel, C., de Graaf, J. B., Laguitton, V., & Chauvel, P. (1999). Specialization of left auditory cortex for speech perception in man depends on temporal coding. Cerebral Cortex, 9(5), 484–496.
Liégeois-Chauvel, C., Giraud, K., Badier, J. M., Marquis, P., & Chauvel, P. (2001). Intracerebral evoked potentials in pitch perception reveal a functional asymmetry of the human auditory cortex. Annals of the New York Academy of Sciences, 930, 117–132.
Lin, Y. P., Duann, J. R., Chen, J. H., & Jung, T. P. (2010). Electroencephalographic dynamics of musical emotion perception revealed by independent spectral components. NeuroReport, 21(6), 410–415.

Lopes da Silva, F., Pijn, J. P., & Boeijinga, P. (1989). Interdependence of EEG signals: linear vs nonlinear associations and the significance of time delays and phase shifts. Brain Topography, 2(1-2), 9–18.
McDermott, J., Lehr, A., & Oxenham, A. (2010). Individual differences reveal the basis of consonance. Current Biology, 20, 1035–1041.
Maris, E., & Oostenveld, R. (2007). Nonparametric statistical testing of EEG and MEG data. Journal of Neuroscience Methods, 164(1), 177–190.
Menon, V., & Levitin, D. J. (2005). The rewards of music listening: response and physiological connectivity of the mesolimbic system. NeuroImage, 28, 175–184.
Mitterschiffthaler, M. T., Fu, C. H. Y., Dalton, J. A., Andrew, C. M., & Williams, S. C. R. (2007). A functional MRI study of happy and sad affective states induced by classical music. Human Brain Mapping, 28, 1150–1162.
Nieminen, S., Istók, E., Brattico, E., Tervaniemi, M., & Huotilainen, M. (2011). The development of aesthetic responses to music and their underlying neural and psychological mechanisms. Cortex, 47, 1138–1146.
Patterson, R. D., Uppenkamp, S., Johnsrude, I. S., & Griffiths, T. D. (2002). The processing of temporal pitch and melody information in auditory cortex. Neuron, 36, 767–776.
Perani, D., Saccuman, M. C., Scifo, P., Spada, D., Andreolli, G., Rovelli, R., et al. (2010). Functional specializations for music processing in the human newborn brain. Proceedings of the National Academy of Sciences of the United States of America, 107(10), 4758–4763.
Pereira, C. S., Teixeira, J., Figueiredo, P., Joao, X., Castro, S. L., & Brattico, E. (2011). Music and emotions in the brain: familiarity matters. PLoS One, 6(11), e27241.
Peretz, I., & Hebert, S. (2000). Toward a biological account of music experience. Brain and Cognition, 42, 131–134.
Peretz, I., & Zatorre, R. (2005). Brain organization for music processing. Annual Review of Psychology, 56, 89–114.
Plichta, M. M., Gerdes, A. B. M., Alpers, G. W., Harnisch, W., Brill, S., Wieser, M. J., et al. (2011). Auditory cortex activation is modulated by emotion: a functional near-infrared spectroscopy (fNIRS) study. NeuroImage, 55, 1200–1207.
Price, J. L. (2003). Comparative aspects of amygdala connectivity. Annals of the New York Academy of Sciences, 985, 50–58.
Samson, S., & Zatorre, R. J. (1988). Melodic and harmonic discrimination following unilateral cerebral excision. Brain and Cognition, 7, 348–360.
Schmithorst, V. J., & Holland, S. K. (2003). The effect of musical training on music processing: a functional magnetic resonance imaging study in humans. Neuroscience Letters, 348, 65–68.
Shahin, A. J., Roberts, L. E., Chau, W., Trainor, L. J., & Miller, L. M. (2008). Music training leads to the development of timbre-specific gamma band activity. NeuroImage, 41(1), 113–122.
Shahin, A., Trainor, L., Roberts, L., Backer, K., & Miller, L. (2010). Development of auditory phase-locked activity for music sounds. Journal of Neurophysiology, 103, 218–229.
Singer, W. (2004). Synchrony, oscillations and relational codes. Visual Neuroscience, 2, 1665–1681.
Stewart, L., von Kriegstein, K., Warren, J. D., & Griffiths, T. D. (2006). Music and the brain: disorders of musical listening. Brain, 129, 2533–2553.
Trochidis, K., & Bigand, E. (2013). Investigation of the effect of mode and tempo on emotional responses to music using EEG power asymmetry. Journal of Psychophysiology, 27, 142–148.
Warrier, C. M., & Zatorre, R. J. (2004). Right temporal cortex is critical for utilization of melodic contextual cues in a pitch constancy task. Brain, 127, 1616–1625.
Wendling, F., Bartolomei, F., Bellanger, J. J., & Chauvel, P. (2001). Interpretation of interdependencies in epileptic signals using a macroscopic physiological model of the EEG. Clinical Neurophysiology, 112, 1201–1218.
Wendling, F., Bartolomei, F., Bellanger, J. J., Bourien, J., & Chauvel, P. (2003). Epileptic fast intracerebral EEG activity: evidence for spatial decorrelation at seizure onset. Brain, 126, 1449–1459.
Zatorre, R. J., & Belin, P. (2001). Spectral and temporal processing in human auditory cortex. Cerebral Cortex, 11(10), 946–953.
Zatorre, R. J., Belin, P., & Penhune, V. B. (2002). Structure and function of auditory cortex: music and speech. Trends in Cognitive Sciences, 6(1), 37–46.
Zentner, M., Grandjean, D., & Scherer, K. R. (2008). Emotions evoked by the sound of music: characterization, classification and measurement. Emotion, 8(4), 494–521.
