
Exploring relationships between expressive and structural

elements of music and pianists’ gestures


Marc R. Thompson
Finnish Centre of Excellence in Interdisciplinary Music Research
Department of Music, University of Jyväskylä, Finland
mathomps@jyu.fi - http://users.jyu.fi/∼mathomps

Geoff Luck
Finnish Centre of Excellence in Interdisciplinary Music Research
Department of Music, University of Jyväskylä, Finland
luck@campus.jyu.fi - http://users.jyu.fi/∼luck

Proceedings of the fourth Conference on Interdisciplinary Musicology (CIM08)


Thessaloniki, Greece, 3-6 July 2008, http://web.auth.gr/cim08/

Background in computing, mathematics and statistics. High-resolution motion capture systems have been used
in several studies to investigate connections between music and movement (Wanderley, Vines, Middleton, McKay &
Hatch, 2005; Eerola, Luck & Toiviainen, 2006, 2007; Luck & Sloboda, 2008). The use of such systems allows one to
quantify relationships between music and movement, especially when a combination of movement and audio features
are extracted and analyzed computationally. For example, musicians’ movements can be parsed into trajectory
vectors, from which basic kinematic features such as velocity and acceleration can be calculated. Specific movements
can then be linked back to specific musical content and structure, and relationships between the two identified.

Background in psychology of music. In the embodied view of music cognition (Leman, 2008), the body acts as the
mediator between mind and physical energy. In other words, musical intentionality, as it relates to individual
expression or musical structure, is manifested by corporal articulations. Already, it has been shown that with
movement alone, people can perceive the performance manner of musicians (Davidson, 1993), the emotional
characteristics of dancers (e.g., Dittrich, Troscianko, Lea, & Morgan, 1996) or systematic relationships between
expressive dance and music (Krumhansl & Schenk, 1997). Pianists make interesting subjects because their expressive
gestures are constrained to their sitting position yet their gestures enhance musical interaction and communication.

Aims. To explore relationships between expressive and structural elements of music and features of musicians’
movement during performance.

Method. Three pianists performed individually a piece of music three times at different levels of expression. Audio
recordings were made of the performances while the participants’ movements were recorded using an eight-camera
optical motion capture system. The kinematic features speed and acceleration for each body part were correlated with
the audio's RMS amplitude in order to explore how playing at different levels of expression affects overall movement.

Results. Cross-correlations resulted in a series of low to medium correlation coefficients. However, some
consistencies for individual body parts were found. First, a tendency towards a negative lag for the head and shoulder
markers, for both speed and acceleration, indicated that head and shoulder movement tended to precede peaks in RMS
amplitude. Second, a tendency towards a positive lag for the wrists and elbows, for both speed and acceleration,
suggests that movement of the arms tended to follow peaks in RMS amplitude. Also, in some instances the head
coefficient rose with each expressive manner, indicating that amplified head movement played a role in playing with
more expression.

Conclusions. Relationships between musicians' body movement and their expressive performance have been
quantified empirically. The temporal evolution of the head and elbows within a musical performance indicates their
respective roles in piano playing: the head follows the phrase structure but moves in anticipation of musical events,
while the elbows, as kinetic feedback, immediately follow musical events. The fact that their coefficient values rise as
the level of expression increases reveals that amplified body movement plays an important role in communicating
musical expression.

Implications. The combination of empirical methods of music psychology and sophisticated mathematical, statistical,
and signal processing methods has produced formalized knowledge on piano performance that may be applied to both
music education and live performance.
Musical activity can be seen as a holistic phenomenon (Aldridge, 1996) in which cerebral intentionality and corporal behavior are intricately linked, culminating in an expressive or emotionally stimulating event. As a multimodal experience, the elements of the musical apparatus are complex and difficult to define. Still, the view that corporal behavior enhances musical performances by acting as a mediator of the musician's intended expression has in the last 20 years become an increasing focus of empirical investigations. Past studies have shown that audiences are able to differentiate between different levels of expression during performances in both the auditory (Kendall & Carterette, 1990) and visual domains (Davidson, 1993, 1995), or recognize different emotions from watching performances without sound (Dahl & Friberg, 2003).

Corporal movement as an overt manifestation of the musician's expressive intentions and goals has been the subject of various case studies of famous musicians known for expressive, sometimes eccentric physical display during performances (e.g. Thompson, Graham & Russo, 2005; Eldson, 2006; Davidson, 2006) or even during studio recorded performances (e.g. Delalande, 1988, 1995). These studies demonstrate that body movement attains meaning by adding an extra-musical dynamic to performances because they are presented in a social or idiosyncratic context.

By stressing the communicative aspect of expressive body movement, these studies indicate that expressive movement should not be thought of as subordinate to sound, but plays a prominent role within a musical experience. The aim of this study is to explore the relationships between musical structure, as in musical phrases, and expressive body movement during piano performances.

Embodied Music Cognition

Embodied music cognition (Leman, 2008) is a research discipline interested in studying the role of the human body in all musical activities. Under this framework, the body acts as the mediator between the musical mind and physical energy. In other words, musical intentionality, as it relates to individual expression or musical structure, is manifested by corporal articulations. These articulations or gestures serve as a communication tool in musical activities.

Embodied music cognition stresses the importance of studying musical phenomena within their ecological setting, in which all modalities are present. This framework is seen as an alternative to the more abstract and rule-based strategies of studying musical phenomena. While rule-based paradigms can be used to test specific models (e.g. melodic similarity), they may not take into account elements of musical phenomena occurring outside the rules of the current model (Jensenius, 2007). Embodied music cognition takes into account that music is a multi-modal experience, elegantly suited for the cross-modal capacities of the human mind.

Embodied musical structure

It has been shown that embodying a musical structure, particularly one related to the temporal organization of music (meter, tempo), plays the most critical role in beat induction and synchronization. Luck & Sloboda (2008) studied the spatio-temporal properties of conducting gestures. In a study in which participants synchronized with point-light displays of conducting features, it was found that beat induction was mediated by changes in speed along the trajectory of a gesture and had less to do with changes in direction of said gesture. A related study (Luck & Toiviainen, 2006), in which data was collected in a more ecological setting by attaching markers to a conductor while directing an ensemble, revealed that the ensemble tended to synchronize with points of deceleration of the baton's marker.

A musical structure that represents musical form, such as phrase structure, can also be perceived through movement alone. Krumhansl & Schenck (1997) conducted a study to explore the structural and expressive relationships between music and dance. While being exposed to dance performances in one of three conditions (Music and Dance, Dance only or Music only), participants indicated the occurrence of section ends and new ideas,
and judged the amount of tension and emotion expressed. Even in the dance-alone condition, participants were able to distinguish section boundaries and new ideas, which correlated with musical phrase boundaries.

It has been noted that different types of gestures convey different types of expressivity. For example, in a study in which observers were asked to continuously rate tension and phrasing of bodily movements in a series of clarinet performances (Vines, Krumhansl, Wanderley & Levitin, 2006), it was found that gestures associated with tension were related to expressivity, while gestures associated with phrasing indicated musical structure.

Luck & Toiviainen (2007) explored the relationships between posture and vocal performance by extracting a number of kinematic features from motion capture data and correlating this data with the corresponding audio tracks of the vocal performance. Most notably, it was found that specific head positions influenced the voice's timbre. Specifically, spectral irregularity (≈ noisiness) increased when the head was tilted downwards, while RMS amplitude (≈ loudness) increased when the head was tilted upwards. Tilting the head downwards may obstruct the vocal apparatus, thus causing more noisiness in the signal. Tilting the head upwards, on the other hand, could have the opposite effect, freeing the vocal apparatus and permitting a greater flow of air.

There is also evidence that even in performances with no expressive intentions, ancillary movement occurs, causing modulations in sound and accounting for the naturalness that is often said to be absent from synthesis techniques (Wanderley, Depalle & Warusfel, 1999).

Gesture Categorization

As was stated in the introduction, body movement in music performance attains significance when contextualized within the musician's intended musical expression. As a corporal articulation is bestowed with significance, it can be categorized as a musical gesture.

Recent literature has sought to systemize musical gestures into two or more dichotomies. The idea is to differentiate between gestures that produce sound and those that accompany the sound-producing ones (Delalande, 1988, 1995; Cadoz & Wanderley, 2000). The latter category has also been called ancillary (Wanderley, 1999), despite the prominent role it plays in communicating the musician's intended musical expressions while tracing the evolution of sound (Godøy, 2006). Through mimicking, anticipating and re-acting to musical events, musical gestures articulate the musician's own impression of the music being performed while embodying the music's structure with kinetic feedback.

Quantification

Audio data quantification techniques have undergone considerable development in recent years, and there are many different approaches based upon principles including signal processing, machine learning, cognitive modeling, and visualization (Downie, 2003). Such techniques have been used, for example, in areas such as computational music analysis (e.g., Lartillot, 2004, 2005), improvisational music therapy analysis (Luck et al., 2006; Luck et al., 2008), and quantification of the singing voice (Sundberg, 1987), to name but a few.

Movement data quantification techniques have developed in parallel with the audio techniques mentioned above, and frequently utilise high-quality motion-capture data. Areas under investigation have included performing musicians' movements (e.g., Wanderley, Vines, Middleton, McKay, & Hatch, 2005), dance movements (e.g. Dittrich, Troscianko, Lea, & Morgan, 1996) and conductors' gestures (e.g., Luck, 2000; Luck & Nte, 2007; Luck & Sloboda, 2008). Moreover, the movement- and audio-based approaches have been combined in several studies examining, for instance, expressiveness in audio and movement (Camurri, De Poli, Friberg, Leman, & Volpe, 2005; Camurri, Lagerlöf, & Volpe, 2003), children's rhythmic movement to music (Eerola, Luck, & Toiviainen, 2006), and

conductor-musician synchronization (Luck & Toiviainen, 2006).

The present study

This study examines expressive body movement specific to piano performance. We have broken down the motor movement employed in piano performance into two loose dichotomies: movements used for function and movements used for expressivity. The category into which a specific movement falls may depend on the degree of freedom allocated to that body part during performance. The fingers, wrists and lower back, for example, are physically required to play the notes on the piano, and their total movement is restricted by the specific music being played. The shoulders and the head, meanwhile, are freer to move and not as involved in the production of sound. Therefore, we might expect a pianist to use their head and shoulders, as opposed to their fingers, wrists, or lower back, to embody musical expression.

In the present study, three pianists were asked to play the same piece (Brahms: Intermezzo in A major, Opus 118 No. 2) in three different pre-determined expressive dispositions (minimum expression, normal expression and maximum expression). This paradigm had already been utilized in perceptual studies (Davidson, 1993, 1995; Wanderley et al., 2005). We employed it here to investigate whether different levels of expressivity affect the musician's corporal behavior.

We were most interested in relationships between musical phrase structure and head and shoulder gestures. We hypothesized that, in the case of piano performances, these gestures would play at least a small role in conveying and embodying musical phrases. The different manners of expression should reveal a heightened amount of head and shoulder movement influenced by the music's phrase boundaries. The musical excerpt can be thought of as having a typical phrase and harmonic structure of the Romantic era. We anticipated that the lyrical quality of the melody and rich harmony would influence the participants' expressive body movements. Also, by having the participants play in three expressive conditions ranging from no expression to exaggerated expression, we expected to be able to discern what types of body movements were affected.

Method 1: Data collection

Participants. Three pianists volunteered to take part in the study. Player 1 (b. 1982, Finnish, 15 years playing experience) was a student in Piano Pedagogy at the Jyväskylä University of Applied Sciences. Player 2 (b. 1978, Finnish, 22 years playing experience) had received her Bachelors in Piano Performance from the same university one year earlier. Player 3 (b. 1977, Hungarian, 17 years playing experience) was a visiting researcher in Musicology at the University of Jyväskylä's music department. All participants received an honorarium for their participation in the form of gift vouchers.

Apparatus. Audio recordings of all performances were made in a professional recording studio using a Yamaha C7 Disklavier, a high quality microphone and ProTools recording software. In addition, fifteen reflective markers were attached to key locations on the body (four on the head, one on each shoulder, one at the centre of the back, two on the lower back, two on the elbows, two on the wrists, one on each middle finger), and two markers were placed at each end of the keyboard to act as reference points. The three-dimensional spatial position of the markers was recorded at 120 fps using an eight-camera optical motion capture system (Qualisys ProReflex).

Procedure. After the 15 markers were placed on the participants, they were invited to practice at the piano to get used to playing with markers attached to their body. Once comfortable with the setting, they were instructed to play the first sixteen measures of the Brahms Intermezzo in A major, Opus 118 No. 2, three times, each time using one of three levels of expression: minimum expression, normal expression and maximum expression. The words "movement" and "gesture" were avoided as far as possible. The goal was to let each pianist interpret for him or herself what was meant by different levels of expression. Meanwhile, the Brahms

Intermezzo was chosen because each participant had had experience playing it.

Method 2: Computation

Extracting movement features. In order to correlate the pianists' movements with their performances, we extracted two kinematic features related to the velocity and acceleration of the participants' body movements. The method used to extract the features was used by one of the current authors in Luck & Toiviainen (2006).

The raw movement data acquired from the motion capture system was imported into MATLAB as time-series data representing the location of each marker on the three dimensions of the Cartesian coordinate system. Using a numerical differentiation algorithm, the components of velocity and acceleration along each dimension were obtained for the head, shoulder, lower-back, elbow, wrist and finger markers. The movement variables used for our analysis, instantaneous speed and acceleration, were then derived by calculating the length of the vector comprising the velocity components, and likewise for the acceleration components. To reduce the amount of noise in the data and obtain a curve representing the larger body movements, the data was passed through a running median filter with a window size of 15 frames. To further facilitate analysis, several variables were then averaged together according to marker type. Thus, the four head markers were averaged together to obtain the trajectory of a single phantom head marker. Likewise, the two shoulder markers and two lower back markers were also averaged together to obtain trajectories of a single phantom shoulder and lower back marker, respectively.

Extracting audio features. We extracted several audio features relating to sound energy, phrase structure and harmony using the MIR Toolbox (Lartillot & Toiviainen, 2007). The audio recordings were imported into MATLAB and were passed through a frame decomposition algorithm with a window length of 0.023 seconds and a hop value of 120 (to derive a number of frames equal to the length of the movement trajectories).

The root mean square (RMS) of each frame was additionally estimated using the MIR Toolbox and stored in a time series. Although we extracted and experimented with other features, including spectral flux, spectral centroid and spectral irregularity, we concluded that RMS amplitude would be the most useful for the current analysis, as it is a good indicator of temporal sound energy and accurately depicts the music's phrasal structure. In order to conform to the movement variables, as well as to attain a signal that represents the major fluctuations in sound energy, the RMS was passed through the same median filter as the movement variables.

Results

Temporal relationships between the movement features and RMS amplitude were examined using a series of cross-correlation analyses. Specifically, cross-correlations were calculated between RMS amplitude and each of the two movement features for each of the marker locations at each of the performance manners for each pianist, using lags up to +/-1 second (+/-120 frames). This lag value was chosen as it was considered that significant corporal articulations would rarely occur more than one second before or after their related musical event.

The results of this cross-correlation procedure were a series of low to medium correlation coefficients. However, put into the proper context of music performance, it is possible to make observations that either suggest commonalities across participants or reveal contrasting performance styles between performers.

The results of these analyses are summarized in Figure 1, which shows the maximum correlation and its respective lag for each movement feature, at each expressive manner, for each performance of each participant.

Figure 1. The scatter plots show the maximum correlation plotted at its lag value for each performance of each participant. Data are presented for the correlations of speed with RMS amplitude and of acceleration with RMS amplitude. Participant 3 played the set of performances twice.
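The feature-extraction chain described under Method 2 was implemented in MATLAB with the MIR Toolbox; as a rough sketch only, an equivalent pipeline in Python (NumPy/SciPy) might look as follows. Function names and the audio framing details are illustrative, not the authors' code:

```python
import numpy as np
from scipy.signal import medfilt

FPS = 120  # motion-capture frame rate used in the study


def kinematic_features(positions, fps=FPS, window=15):
    """positions: (n_frames, 3) marker trajectory in Cartesian coordinates.
    Numerically differentiates to velocity and acceleration components,
    takes the vector length of each, and applies a 15-frame running median
    filter to suppress noise while keeping the larger body movements."""
    dt = 1.0 / fps
    velocity = np.gradient(positions, dt, axis=0)      # per-dimension velocity
    acceleration = np.gradient(velocity, dt, axis=0)   # per-dimension acceleration
    speed = np.linalg.norm(velocity, axis=1)           # instantaneous speed
    accel = np.linalg.norm(acceleration, axis=1)       # instantaneous acceleration
    return medfilt(speed, window), medfilt(accel, window)


def rms_per_frame(audio, sr, n_frames, window_s=0.023):
    """Frame-wise RMS amplitude with a 23 ms analysis window; the hop is
    chosen so the series has as many frames as the movement trajectories."""
    win = int(window_s * sr)
    hop = max(1, len(audio) // n_frames)
    rms = np.array([np.sqrt(np.mean(audio[i * hop:i * hop + win] ** 2))
                    for i in range(n_frames)])
    return medfilt(rms, 15)  # same median filter as the movement variables
```

A phantom marker (e.g. the single averaged head marker) can be formed by averaging the position arrays of the individual markers before differentiation.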
It can be seen that both Participants 1 and 2 were fairly consistent in the way they moved across the three levels of expression. This is revealed by the fact that the highest correlations for each marker type tend to occupy a similar position in the space, i.e., a similar correlation coefficient and similar lag, regardless of the level of expression. Participant 3, on the other hand, tended to move differently depending upon the level of expression he intended to convey in his playing. This is demonstrated by the differences in correlation coefficients and lags for the various marker locations at the three levels of expression. These differences are apparent in both examples of his playing (set 1 and set 2).

In terms of relationships between specific markers and RMS amplitude, a couple of points are worth mentioning. First, a tendency towards a negative lag for the head and shoulder markers, for both speed and acceleration, indicates that head and shoulder movement tended to precede peaks in RMS amplitude. Second, a tendency towards a positive lag for the wrists and elbows, for both speed and acceleration, suggests that movement of the arms tended to follow peaks in RMS amplitude.
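The lag analysis reported above can be reproduced with a windowed cross-correlation. The sketch below follows the sign convention of the text (negative lag: movement precedes peaks in RMS amplitude); the function name and implementation details are illustrative rather than the study's actual MATLAB routine:

```python
import numpy as np


def max_crosscorr(movement, rms, max_lag=120, fps=120):
    """Pearson correlation between a movement feature and RMS amplitude at
    every lag in [-max_lag, +max_lag] frames (+/-1 s at 120 fps).
    Returns (best coefficient, its lag in seconds). A negative lag means the
    movement precedes peaks in RMS amplitude; a positive lag, that it follows."""
    best_r, best_lag = 0.0, 0
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:        # movement earlier than audio by |lag| frames
            m, a = movement[:lag], rms[-lag:]
        elif lag > 0:      # movement later than audio by lag frames
            m, a = movement[lag:], rms[:-lag]
        else:
            m, a = movement, rms
        r = np.corrcoef(m, a)[0, 1]
        if abs(r) > abs(best_r):
            best_r, best_lag = r, lag
    return best_r, best_lag / fps
```

Running this for every marker type, movement feature, and performance manner yields the coefficient/lag pairs plotted in Figure 1.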
These relationships can also be seen in Figure 2, which shows time-series data for selected marker locations plotted along with RMS amplitude. This visual presentation highlights the relationship between a marker's trajectories and the audio's temporal evolution. These figures were prepared by passing the features through a zero-phase digital filter, which passes the data through the algorithm twice, forward and then in reverse. This smoothens the data by eliminating less significant peaks, emphasizing general relationships. Furthermore, the feature data were normalized to enable them to be viewed on the same axes.

Figure 2. The speed of selected marker types has been superimposed over the corresponding performance's RMS amplitude to observe the relationships between features. Notice how the head movements (a and c) anticipate peaks of sound energy while the right elbow and right wrist movements slightly follow attacks.

In Figures 2a and 2c, a relationship is seen between the head marker's speed and RMS amplitude for Participant 2 and Participant 3. While the correlation coefficient for these variables was relatively low (Figure 2a:
coefficient of 0.35 with a maximum lag value of -0.133 seconds; Figure 2c: coefficient of 0.37 with a maximum lag value of -0.7 seconds), one can still discern a relationship between both features in which high occurrences of head marker speed anticipate strong bursts of sound energy. The opposite can be said of Figures 2b and 2d. Here, the bursts of speed for the right elbow (2b) and right wrist (2d) markers occur after the sound event (Figure 2b: coefficient of 0.37 with a maximum lag value of +0.39 seconds; Figure 2d: coefficient of 0.19 with a maximum lag value of +0.74 seconds).

Discussion

This study focused on exploring the relationships of corporal movement and gestures with musical structure at different levels of expressivity. We extracted the kinematic features speed and acceleration from movement data and correlated these with the RMS amplitude feature extracted from audio. These correlations resulted in relatively low correlation coefficients. However, when put into the context of music performance, these values nevertheless revealed the function of individual parts of the body and indicated whether body movement changed with increased expression.

Figure 2 shows how the head marker anticipates musical events, while the elbow and wrist markers move after the musical event. The head markers may have increased in speed before bursts of sound energy for various reasons. Possibly, the participants were preparing to play large chords or increase their sound. However, it could also be linked to expressivity, as a way of preparing themselves for a moment in the music to which they want to draw attention. The bursts of elbow and wrist speed after the musical events can be seen as kinetic feedback following the initial point of hitting the piano keys: the energy used to strike the piano keys is deflected to the elbow.

Taking the view that some musical gestures are specifically related to the logistics of piano playing while others are related to communicating expression, it may be reasonable to suggest that the head markers in Figure 2 are characterizing the temporal envelope of the overarching musical structure. If one considers the musical phrase structure as being analogous to periodic processes such as breathing, it may be reasonable to assume that head gestures 'breathe' with the music.

The 'different levels of expression' paradigm was employed to investigate whether corporal behavior was affected by performing with less or more expression than usual. Figure 1 shows that the body movements of Participants 1 and 2 did not fluctuate very much in different performance manners. It is telling how their speed and acceleration markers stay in the same location in the scatter plot throughout different performances, let alone levels of expression. Only Participant 3's data shows that his movements changed drastically between different performance manners. This is seen in the sporadic placement of the elements in the scatter plots. The circles, representing the head markers in Figure 1, show that the correlation coefficient increased as the level of expression increased, reflecting faster and larger head gestures. Further analysis needs to be done to determine why this might be; in the end, it may come down to artistic and stylistic differences.

Part of the answer revealed itself in subsequent interviews with the participants. Participant 3, for example, defined playing with minimum expression as playing robotically, without nuance and ignoring the score's dynamic or expressive indicators. Participant 1, on the other hand, defined playing with minimum expression as only following the score's dynamic and expressive indicators; in other words, without her own expressive ideas and input. Interestingly, in terms of individual expression and intentionality, Participant 1's definition makes more sense, for she was removing herself from the performance, while Participant 3 used only his own intention, removing the composer's intentionality. Despite being an anecdotal side note, this contradiction in defining what it means to play with expression needs to be taken into account in future studies concerning the embodiment of musical structure.

The combination of empirical methods of music psychology and sophisticated mathematical, statistical, and signal processing methods has produced formalized knowledge on piano performance that may be applied to both music education and live performance. In both cases, these results could help students and performers improve their ability to convey expressive and structural elements of the music being played.

However, our results also revealed that correlating musical structure with body movement using a computational method remains problematic and inconclusive. Future analysis will include a wider range of movement and audio features. Additional participants could also be used, which would validate the use of statistical methods in order to find commonalities between musicians.

References

Aldridge, D. (1996). Music therapy research and practice in medicine: From out of the silence. London & Bristol: Jessica Kingsley Publishers.

Cadoz, C. & Wanderley, M. M. (2000). Gesture - music. In M. Wanderley & M. Battier (Eds.), Trends in Gestural Control of Music. Paris: Ircam.

Camurri, A., Lagerlöf, I., & Volpe, G. (2003). Recognizing emotion from dance movement: Comparison of spectator recognition and automated techniques. International Journal of Human-Computer Studies, 59(1-2), 213-225.

Camurri, A., De Poli, G., Friberg, A., Leman, M., & Volpe, G. (2005). The MEGA project: Analysis and synthesis of multisensory expressive gesture in performing art applications. Journal of New Music Research, 34(1), 5-21.

Dahl, S., & Friberg, A. (2003). What can the body movements reveal about a musician's emotional intentions? Proceedings of the Stockholm Music Acoustics Conference (SMAC 03), Stockholm, Sweden, pp. 599-602.

Davidson, J. W. (1993). Visual perception of performance manner in the movements of solo musicians. Psychology of Music, 21, 103-113.

Davidson, J. W. (1995). What does the visual information contained in music performances offer the observer? Some preliminary thoughts. Music and the Mind Machine: The Psychopathology of the Sense of Music (pp. 105-113).

Davidson, J. W. (2006). 'She's the one': Multiple functions of body movement in a stage performance by Robbie Williams. In A. Gritten & E. King (Eds.), Music and Gesture (pp. 209-225). Hampshire: Ashgate Publishing, Ltd.

Delalande, F. (1988). La gestique de Gould. In G. Guertin (Ed.), Glenn Gould: Pluriel (pp. 85-111). Montréal: Courteau.

Delalande, F. (1995). Meaning and behavior patterns: The creation of meaning in interpreting and listening to music. In E. Tarasti (Ed.), Musical Signification: Essays in the Semiotic Theory and Analysis of Music (pp. 219-228). Berlin: Mouton de Gruyter.

Dittrich, W. H., Troscianko, T., Lea, S. E. G., & Morgan, D. (1996). Perception of emotion from dynamic point-light displays represented in dance. Perception, 25, 727-738.

Downie, J. S. (2003). Music information retrieval. In B. Cronin (Ed.), Annual Review of Information Science and Technology, 37, 295-340. Medford, NJ: Information Today.

Eerola, T., Luck, G., & Toiviainen, P. (2006). An investigation of pre-schoolers' corporeal synchronization with music. In M. Baroni, A. R. Addessi, R. Caterina & M. Costa (Eds.), Proceedings of the 9th International Conference on Music Perception & Cognition (pp. 472-476). Bologna, Italy: ICMPC and ESCOM.

Eldson, P. (2006). Listening in the gaze: The body in Keith Jarrett's solo piano improvisations. In A. Gritten & E. King (Eds.), Music and Gesture (pp. 209-225). Hampshire: Ashgate Publishing, Ltd.

Godøy, R. I. (2006). Gestural-sonorous objects: Embodied extensions of Schaeffer's conceptual apparatus. Organised Sound, 11(2), 149-157.

Jensenius, A. R. (2007). Action - Sound: Developing Methods and Tools to Study Music-related Body Movement. PhD thesis, Department of Musicology, University of Oslo, Norway.

Kendall, R. A. & Carterette, E. C. (1990). The communication of musical expression. Music Perception, 8, 129-164.

Krumhansl, C. L., & Schenck, D. L. (1997). Can dance reflect the structural and expressive qualities of music? Musicae Scientiae, 1, 63-83.

Lartillot, O. (2004). A musical pattern discovery system founded on a modelling of listening strategies. Computer Music Journal, 28(3), 53-67.

Lartillot, O. (2005). Multi-dimensional motivic pattern extraction founded on adaptive redundancy filtering. Journal of New Music Research, 34(4), 375-393.

Lartillot, O. & Toiviainen, P. (2007). MIR in Matlab (II): A toolbox for musical feature extraction from audio. Proceedings of the International Conference on Music Information Retrieval, Vienna, 2007.

Leman, M. (2008). Embodied Music Cognition and Mediation Technology. Cambridge, Mass.: MIT Press.

Luck, G. (2000). Synchronizing a motor response with a visual event: The perception of temporal information in a conductor's gestures. In C. Woods, G. Luck, R. Brochard, F. Seddon, & J. A. Sloboda (Eds.), Proceedings of the Sixth International Conference on Music Perception and Cognition (CD-ROM). Keele: Keele University.

Luck, G. & Nte, S. (2008). A new approach to the investigation of conductors' gestures and conductor-musician synchronization, and a first experiment. Psychology of Music, 36(1), 81-99.

Luck, G., Riikkilä, K., Lartillot, O., Erkkilä, J., Toiviainen, P., Mäkelä, A., Pyhäluoto, K., Raine, H., Varklia, L., & Värri, J. (2006). Exploring relationships between level of mental retardation and features of music therapy improvisations: A computational approach. Nordic Journal of Music Therapy, 15(1), 30-48.

Luck, G. & Sloboda, J. (2008). Exploring the spatio-temporal properties of simple conducting gestures using a synchronization task. Music Perception, 25(3), 225-239.

Luck, G. & Toiviainen, P. (2006). Ensemble musicians' synchronization with conductors' gestures: An automated feature-extraction analysis. Music Perception, 24(2), 195-206.

Luck, G. & Toiviainen, P. (2007). Ideal singing posture: Evidence from behavioural studies and computational motion analysis. In K. Maimets-Volk, R. Parncutt, M. Marin & J. Ross (Eds.), Proceedings of the third Conference on Interdisciplinary Musicology (CIM07), Tallinn, Estonia, 15-19 August 2007.

Luck, G., Toiviainen, P., Erkkilä, J., Lartillot, O., Riikkilä, K., Mäkelä, A., Pyhäluoto, K., Raine, H., Varklia, L., & Värri, J. (2008). Modelling the relationships between emotional responses to, and musical content of, music therapy improvisations. Psychology of Music, 36(1), 25-45.

Sundberg, J. (1987). The Science of the Singing Voice. Northern Illinois University Press.

Thompson, W. F., Graham, P., & Russo, F. A. (2005). Seeing music performance: Visual influences on perception and experience. Semiotica, 156(1), 203-227.

Vines, B. W., Krumhansl, C. L., Wanderley, M. M., & Levitin, D. J. (2006). Cross-modal interactions in the perception of musical performance. Cognition, 101, 80-103.

Wanderley, M. M. (1999). Non-obvious performer gestures in instrumental music. Proceedings of the International Gesture Workshop (GW'99), Gif-sur-Yvette, France, 1739/1999, 37-48.

Wanderley, M. M., Depalle, P., & Warusfel, O. (1999). Improving instrumental sound synthesis by modeling the effects of performer gesture. Proceedings of the 1999 International Computer Music Conference (pp. 418-421). San Francisco, Calif.: International Computer Music Association.

Wanderley, M. M., Vines, B., Middleton, N., McKay, C., & Hatch, W. (2005). The musical significance of clarinetists' ancillary gestures: An exploration of the field. Journal of New Music Research, 34(1), 97-113.
