Understanding Interactive Systems
JON DRUMMOND
MARCS Auditory Laboratories/VIPRE, University of Western Sydney, Penrith South DC, NSW, 1797, Australia
E-mail: j.drummond@uws.edu.au
URL: www.jondrummond.com.au
This article examines differing approaches to the definition, classification and modelling of interactive music systems, drawing together both historical and contemporary practice. Concepts of shared control, collaboration and conversation metaphors, mapping, gestural control, system responsiveness and separation of interface from sound generator are discussed. The article explores the potential of interactive systems to facilitate the creation of dynamic compositional sonic architectures through performance and improvisation.

1. INTRODUCTION

I have explored interactive systems extensively in my own creative sound art practice, inspired by their potentials to facilitate liquid and flexible approaches to creating dynamic sonic temporal structures and topographies while still maintaining the integrity and overall identity of an individual work. Just as a sculpture can change appearance with different perspectives and lighting conditions, yet a sense of its unique identity is still maintained, so too an interactive sound installation or performance may well sound different with subsequent experiences of the work, but still be recognisable as the same piece.

However, the term interactive is used widely across the field of new media arts with much variation in its precise application (Bongers 2000; Paine 2002). This liberal and broad application of the term interactive does little to further our understanding of how such systems function and the potentials for future development. The description of interactive in these instances is often a catchall term that simply implies some sense of audience control or participation in an essentially reactive system. Furthermore, with specific reference to interactive sound-generating systems, there is considerable divergence in the way they are classified and modelled. Typically such systems are placed in the context of Digital Musical Instruments (Miranda and Wanderley 2006), focusing on interface design, gesture sonification (Goina and Polotti 2008) and mapping, defining a system in terms of the way inputs are routed to outputs, overlooking the equally important and interrelated role of processing. However, the term interactive still has relevance, as it encompasses a unique approach to compositional and performative music-making, hence the need for this paper, drawing together both historical and contemporary practice.

An interactive system has the potential for variation and unpredictability in its response, and depending on the context may well be considered more in terms of a composition or structured improvisation than an instrument. The concept of a traditional acoustic instrument implies a significant degree of control, repeatability and a sense that with increasing practice time and experience one can become an expert with the instrument. Also implied is the notion that an instrument can facilitate the performance of many different compositions encompassing many different musical styles. Interactive systems blur these traditional distinctions between composing, instrument building, systems design and performance. This concept is far from new. Mumma (1967), in developing his works for live electronics and French horn, considered both composing and instrument building as part of the same creative process. For Mumma, designing circuits for his cybersonics was analogous to composing. Similarly, the design of system architectures for networked ensembles such as The Hub (Brown and Bischoff 2002) and HyperSense Complex (Riddell 2005) is integrally linked to the process of creating new compositions and performances.

1.1. Shared control

A different notion of instrument control is presented by interactive systems from that usually associated with acoustic instrument performance. Martirano wrote of guiding the SalMar Construction – considered to be one of the first examples of interactive composing instruments (Chadabe 1997: 291) – through a performance, referring to an illusion of control. Similarly, with respect to his own interactive work Chadabe (1997: 287) describes sharing the control of the music with an interactive system. Schiemer (1999: 109–10) refers to an illusion of control, describing his interactive instruments as improvising machines, and compares working with an interactive system to sculpting with soft metal or clay. Sensorband performers working with the Soundnet (Bongers 1998) also set up systems that exist at the edge of control, due in no small part to the extreme physical nature of their interfaces.

1.2. Collaboration

Interactive systems have recently had wide application in the creation of collaborative musical spaces, often
Organised Sound 14(2): 124–133 © 2009 Cambridge University Press. Printed in the United Kingdom. doi:10.1017/S1355771809000235
with specific focus for non-expert musicians. Blaine's Jam-O-Drum (Blaine and Perkis 2000) was specifically designed to create such a collaborative performance environment for non-expert participants to experience ensemble-based music-making. This notion of the tabletop as a shared collaborative space has proved to be a powerful metaphor, as revealed by projects such as the reacTable (Kaltenbrunner, Jordà, Geiger and Alonso 2006), Audiopad (Patten, Recht and Ishii 2002) and Composition on the Table (Blaine and Fels 2003). Interactive systems have also found application providing musical creative experiences for non-expert musicians in computer games such as Iwai's Electroplankton (Blaine 2006).

1.3. Definitions, classifications and models

The development of a coherent conceptual framework for interactive music systems presents a number of challenges. Interactive music systems are used in many different contexts including installations, networked music ensembles, new instrument designs and collaborations with robotic performers (Eigenfeldt and Kapur 2008). These systems do not define a specific style – that is, the same interactive model can be applied to very different musical contexts. Critical investigation of interactive works requires extensive cross-disciplinary knowledge in a diverse range of fields including software programming, hardware design, instrument design, composition techniques, sound synthesis and music theory. Furthermore, the structural or formal musical outcomes of interactive systems are invariably not static (i.e., not the same every performance), thus traditional music analysis techniques derived for notated western art music are inappropriate and unhelpful. Not surprisingly, then, the practitioners themselves are the primary source of writing about interactive music systems, typically creating definitions and classifications derived from their own creative practice. Their work is presented here as a foundation for discussions pertaining to the definition, classification and modelling of interactive music systems.

2. DEFINITIONS

2.1. Interactive composing

Chadabe has been developing his own interactive music systems since the late 1960s and has written extensively on the subject of composing with interactive computer music systems. In 1981 he proposed the term interactive composing to describe 'a performance process wherein a performer shares control of the music by interacting with a musical instrument' (Chadabe 1997: 293).¹

¹ Chadabe first proposed the term interactive composing at the International Music and Technology Conference, University of Melbourne, Australia, 1981. From http://www.chadabe.com/bio.html, viewed 2 March 2009.

Referring to Martirano's SalMar Construction and his own CEMS System, Chadabe writes of these early examples of interactive instruments:

These instruments were interactive in the same sense that performer and instrument were mutually influential. The performer was influenced by the music produced by the instrument, and the instrument was influenced by the performer's controls. (Chadabe 1997: 291)

These systems were programmable and could be performed in real-time. Chadabe highlights that the musical outcome from these interactive composing instruments was a result of the shared control of both the performer and the instrument's programming, the interaction between the two creating the final musical response. Programmable interactive computer music systems such as these challenge the traditional, clearly delineated western art-music roles of instrument, composer and performer. In interactive music systems the performer can influence, affect and alter the underlying compositional structures, the instrument can take on performer-like qualities, and the evolution of the instrument itself may form the basis of a composition. In all cases the composition itself is realised through the process of interaction between performer and instrument, or machine and machine. In developing interactive works the composer may also need to take on the roles of, for example, instrument designer, programmer and performer. Chadabe writes of this blurring of traditional roles in interactive composition:

When an instrument is configured or built to play one composition, however the details of that composition might change from performance to performance, and when that music is interactively composed while it is being performed, distinctions fade between instrument and music, composer and performer. The instrument is the music. The composer is the performer. (Chadabe 1997: 291)

This provides a perspective of interactive music systems that focuses on the shared creative aspect of the process, in which the computer influences the performer as much as the performer influences the computer. The musical output is created as a direct result of this shared interaction, the results of which are often surprising and not predicted.

2.2. Interactive music systems

Rowe (1993) in his book Interactive Music Systems presents an image of an interactive music system behaving just as a trained human musician would, listening to 'musical' input and responding 'musically'. He provides the following definition:

Interactive computer music systems are those whose behaviour changes in response to musical input. Such responsiveness allows these systems to participate in live performances, of both notated and improvised music. (Rowe 1993: 1)
In contrast to Chadabe's perspective of a composer/performer interacting with a computer music system, the combined results of which realise the compositional structures from potentials encoded in the system, Rowe presents an image of a computer music system listening to, and in turn responding to, a performer. The emphasis in Rowe's definition is on the response of the system; the effect the system has on the human performer is secondary. Furthermore, the definition is constrained, placed explicitly within the framework of musical input, improvisation, notated score and performance.

Paine (2002) is also critical of Rowe's definition, with its implicit limits within the language of notated western art music, both improvised and performed, and its inability to encompass systems that are not driven by instrumental performance as input:

The Rowe definition is founded on pre-existing musical practice, i.e. it takes chromatic music practice, focusing on notes, time signatures, rhythms and the like as its foundation; it does not derive from the inherent qualities of the nature of engagement such an 'interactive' system may offer. (Paine 2002: 296)

Jordà (2005) questions if there is in fact a general understanding of what is meant by Rowe's concept of 'musical input':

How should an input be, in order to be 'musical' enough? The trick is that Rowe is implicitly restraining 'interactive music systems' to systems which possess the ability to 'listen', a point that becomes clearer in the subsequent pages of his book. Therefore, in his definition, 'musical input' means simply 'music input'; as trivial and as restrictive as that! (Jordà 2005: 79)

However, Rowe's definition should be considered in the context of the music technology landscape of the early 1990s. At this time most of the music software programming environments were MIDI based, with the sonic outcomes typically rendered through the use of external MIDI synthesisers and samplers. Real-time synthesis, although possible, was significantly restricted by processor speed and the cost of computing hardware. Similarly, sensing solutions (both hardware and software) for capturing performance gestures were far less accessible and developed in terms of cost, speed and resolution than are currently available. The morphology of the sound in a MIDI system is largely fixed, and so the musical constraints are inherited from instrumental music (i.e., pitch, velocity and duration). Thus the notions of an evolving morphology of sound explored through gestural interpretation and interaction are not intrinsic to the system.

2.3. Composing interactive music

Winkler (1998) in his book Composing Interactive Music presents a definition of interactive music systems closely aligned with Rowe's, in which the computer listens to, interprets and then responds to a live human performance. Winkler's approach is also MIDI based, with all the constraints mentioned above. Winkler describes interactive music as:

a music composition or improvisation where software interprets a live performance to affect music generated or modified by computers. Usually this involves a performer playing an instrument while a computer creates music that is in some way shaped by the performance. (Winkler 1998: 4)

As is the case with Rowe's definition, there is little direct acknowledgment by Winkler of interactive music systems that are not driven by instrumental performance. In discussing the types of input that can be interpreted, the focus is again restricted to event-based parameters such as notes, dynamics, tempo, rhythm and orchestration. Where gesture is mentioned, the examples given are constrained to MIDI controllers (key pressure, foot pedals) and computer mouse input.

Interactive music systems are of course not 'found objects', but rather the creation of composers, performers, artists and the like (through a combination of software, hardware and musical design). For a system to respond musically implies a system design that meets the musical aesthetic of the system's designer(s). For a system to respond conversationally, with both predictable and unpredictable responses, is likewise a process built into the system. In all of the definitions discussed, to some degree, is the notion that interactive systems require interaction to realise the compositional structures and potentials encoded in the system. To this extent interactive systems make possible a way of composing that is at the same time both performing and improvising.

3. CLASSIFICATIONS AND MODELS

3.1. Empirical classifications

One of the simplest approaches for classifying interactive music systems is with respect to the experience afforded by the work. For example, is the system an installation intended to be performed by the general public, or is it intended for use by the creator of the system and/or other professional artists? Bongers (2000: 128) proposes just such an empirically based classification system, identifying the following three categories:²

(1) performer with system;
(2) audience with system; and
(3) performer with system with audience.

² Of course, there is always some form of interaction between the performer and audience; however, in this instance the focus is on the interactions mediated by an electronic system only.
Understanding Interactive Systems 127
These three categories capture the broad form and Musician Score Follower Accompaniment
function of an interactive system but do not take into
account the underlying algorithms, processes and qual-
ities of the interactions taking place. The performer with
system category encompasses works such as Lewis’
Voyager (2000), Waisvisz’s The Hands (1985), Sonami’s
Figure 1. Model of a score-following system, adapted from
Lady’s Glove (Bongers 2000: 134) and Schiemer’s
Orio, Lemouton and Schwarz 2003.
Spectral Dance (1999: 110). The audience with system
category includes interactive works designed for gallery
installation such as Paine’s Gestation (2007), Gibson For Rowe, these classification dimensions do not
and Richards’ Bystander (Richards 2006) and Tanaka represent distinct classes; instead, a specific inter-
and Toeplitz’s The Global String (Bongers 2000: 136). active system would more than likely encompass
Bongers’ third category, performer with system with some combination of the classification attributes.
audience places the interactive system at the centre, with Furthermore, the dimensions described should be
both performer and audience interacting with the considered as points near the extremes of a con-
system. Examples of this paradigm are less common, tinuum of possibilities (Rowe 1993: 6).
but Bongers puts forward his own – The Interactorium
(Bongers 1999), developed together with Fabeck and 3.2.1. Score-Driven vs. Performance-Driven
Harris – as an illustration. The Interactorium includes
Score-driven systems have embedded knowledge of
both performers and audience members in the interac-
the overall predefined compositional structure. A per-
tion, with the audience seated on chairs equipped with embedded
former’s progress through the composition can be
‘active cushions’ providing ‘tactual’ feedback experi- knowledge:
tracked by the system in real-time, accommodating
ences and sensors so that audience members can interact
subtle performance variations such as a variation in conhecimento
with the projected sound and visuals and the performers.
tempo. Precise, temporally defined events can be trig- embutido
To this list of classifications I would add the fol-
gered and played by the system in synchronisation with
lowing two extensions:
the performer, accommodating their performance
Drummond (4) multiple performers with a single interactive nuances, interpretations and potential inaccuracies.
additional system; and A clear example of a score-driven system is demon-
Classifications
(5) multiple systems interacting with each other and/ strated by score following (Dannenburg 1984; Vercoe
or multiple performers. 1984)3 in which a computer follows a live performer’s
progress through a pre-determined score, responding
Computer interactive networked ensembles such as
accordingly (figure 1). Examples of score-following
The Hub (Brown and Bischoff 2002), austraLYSIS
works include Lippe’s Music for Clarinet and ISPW
electroband (Dean 2003) and HyperSense Complex
(1993) and Manoury’s Pluton for piano and triggered
(Riddell 2005) are examples of multiple performers
signal processing events (Puckette and Lippe 1992).
with a single interactive system, exploring interactive
Score following is, however, more reactive than
possibilities quite distinct from the single performer
interactive, with the computer system typically pro-
and system paradigm. In a similar manner the sepa-
grammed to follow the performer faithfully. Score
rate category for multiple systems interacting encom-
following can be considered as an intelligent version of
passes works such as Hess’s Moving Sound Creatures
the instrument and tape model, in which the performer
(Chadabe 1997) for twenty-four independent moving
follows and plays along with a pre-constructed tape (or
sound robots, which is predicated on evolving inter-
audio CD) part. Computer-based score-following
robot communication, leading to artificial-life-like
reverses the paradigm, with the computer following the
development of sonic outcomes.
performer. Although such systems extend the possi-
bilities of the tape model, enabling real-time signal
processing of the performer’s instrument, algorithmic
3.2. Classification dimensions
transformation and generation of new material, the
Developing a framework further than just simply result from an interactive perspective is much the same,
categorising the physical manifestations of interactive perhaps just easier for the performer to play along
systems, Rowe (1993: 6–7) proposes a ‘rough classi- with. As Jordà observes,
fication system’ for interactive music systems con-
score-followers constitute a perfect example for intelli-
sisting of a combination of three dimensions – gent but zero interactive music systems. (Jordà 2005: 85)
(1) score-driven vs. performance-driven systems;
(2) transformative, generative or sequenced response 3
Score following was first presented at the 1984 International
methods; and Computer Music Conference independently by Barry Vercoe and
(3) Instrument vs. player paradigms. Roger Dannenburg (Puckette and Lippe 1992).
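The tracking loop at the heart of score following can be sketched in a few lines. This is an illustrative toy, not the published 1984 algorithms: real score followers model timing and tempo, handle polyphony, and use robust matching, and the pitch sequences below are invented for the example.

```python
# A minimal online score follower: the system holds the expected score and,
# for each performed note, searches a small lookahead window for a match.
# A match advances the position (tolerating skipped notes); a non-match is
# treated as a wrong note and ignored, so the follower holds its place.

SCORE = [60, 62, 64, 65, 67, 69, 71, 72]  # expected MIDI pitches (C major scale)

class ScoreFollower:
    def __init__(self, score, window=3):
        self.score = score
        self.pos = 0          # index of the next expected note
        self.window = window  # how far ahead to search (tolerates skips)

    def hear(self, pitch):
        """Match one performed pitch; return the score index reached
        (the cue whose accompaniment events should fire), or None if
        the note is treated as an error."""
        lookahead = self.score[self.pos:self.pos + self.window]
        if pitch in lookahead:
            self.pos += lookahead.index(pitch) + 1
            return self.pos - 1
        return None  # wrong note: hold position

follower = ScoreFollower(SCORE)
performance = [60, 62, 61, 65, 67]  # one wrong note (61), one skipped note (64)
matches = [follower.hear(p) for p in performance]  # → [0, 1, None, 3, 4]
```

Even this toy exhibits the reactive character discussed above: the computer only ever follows the performer; nothing the system plays feeds back into what the performer is asked to do next.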
[Figure 4. Solo performer and interactive system – control and feedback, adapted from Bongers 2000: human (memory and cognition, effectors, senses) and computer (memory and cognition, sensors, actuators) joined in an interaction loop.]
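The mapping strategies this section discusses – one-to-one, one-to-many, many-to-one (convergent) and many-to-many – can be sketched as plain functions from controller values to synthesis parameters. The parameter names and weightings below are invented for illustration; they are not drawn from the article or from Hunt and Kirk's examples.

```python
# Sketches of the four mapping types. Controller values are assumed to be
# normalised to the range 0.0-1.0; the synthesis parameters are hypothetical.

def one_to_one(fader):
    # one controller output drives one synthesis input
    return {"amplitude": fader}

def one_to_many(breath):
    # a single gesture drives several synthesis parameters at once
    return {"amplitude": breath, "brightness": breath ** 2}

def many_to_one(bow_speed, bow_pressure):
    # convergent mapping: several controller outputs combine into one input
    return {"loudness": 0.5 * bow_speed + 0.5 * bow_pressure}

def many_to_many(breath, lip_pressure):
    # a combination of the mapping types above
    params = one_to_many(breath)
    params.update(many_to_one(breath, lip_pressure))
    return params
```

The convergent case is the interesting one for performance: a parameter such as loudness responds to the combination of inputs rather than to any single fader, which is closer to how acoustic instruments behave.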
synthesis parameters at the same time. One-to-many mappings can solve many of the performance interface problems created by multiple one-to-one mappings. Many-to-one mappings, also referred to as convergent mapping (Hunt and Kirk 2000: 7), combine two or more outputs to control one input, for example a single synthesis parameter under the control of multiple inputs. Many-to-many is a combination of the different mapping types (Iazzetta 2000).

4.5. Separating the interface from the sound generator

Mapping arbitrary interfaces to likewise arbitrarily chosen sound-generating devices creates the potential for the interrelated physical and acoustical connections between an instrument's interface and its sound output – which are typically inherent in traditional acoustic instruments – to be lost. For traditional acoustic instruments the sound-generating process dictates the instrument's design. The method of performing the instrument – blowing, bowing, striking – is inseparably linked to the sound-generating process – wind, string, membrane. In the case of electronic instruments this relationship between performance interface and sound production is no longer constrained in this manner (Bongers 2000: 126). Sensing technology and networked communication methods such as Open Sound Control (Wright, Freed and Momeni 2003) allow virtually any input from the real world to be used as a control signal for digital media. The challenge facing the designers of interactive instruments and sound installations is to create convincing mapping metaphors, balancing responsiveness, control and repeatability with variability, complexity and the serendipitous.

5. SUMMARY

This article has discussed the differing approaches taken to the definition, classification and modelling of interactive music systems, encompassing both historical and contemporary practice. The interactive compositional possibilities explored by early practitioners still resonate today – for example, the concepts of shared control, intelligent instruments, collaborative conversational environments, and the blurring of the distinctions between instrument building, performance, improvisation and composition. The term interactive is applied widely in the field of new media arts, from systems exploiting relatively straightforward reactive mappings of input-to-sonification through to highly complex systems that are capable of learning and can behave in autonomous, organic and intuitive ways. There has also been a recent focus on describing interactive systems in terms of digital musical instruments, concentrating on mappings between gestural input and sonification. However, interactive systems can also be thought of in terms of interactive composition, collaborative environments and conversational models.

Interactive systems enable compositional structures to be realised through performance and improvisation, with the composition encoded in the system as processes and algorithms, mappings and synthesis routines. In this way all aspects of the composition – pitch, rhythm, timbre, form – have the potential to be derived through an integrated and coherent process, realised through interacting with the system. The performance becomes an act of selecting potentials and responding to evolving relationships. The process of composition then becomes distributed between the decisions made during system development and those made in the moment of the performance. There is no pre-ordained work, simply a process of creation, shared with the public in performance.

REFERENCES

Berdahl, E., Steiner, H. and Oldham, C. 2008. Practical Hardware and Algorithms for Creating Haptic Musical Instruments. Proceedings of the 2008 International Conference on New Interfaces for Musical Expression (NIME-08). Genova, Italy, 61–6.

Birnbaum, D., Fiebrink, R., Malloch, J. and Wanderley, M. M. 2005. Towards a Dimension Space for Musical Devices. Proceedings of the 2005 International Conference on New Interfaces for Musical Expression (NIME-05). Vancouver, Canada, 192–5.

Blaine, T. 2006. New Music for the Masses. Adobe Design Center, Think Tank Online. http://www.adobe.com/designcenter/thinktank/ttap_music (accessed 6 February 2009).

Blaine, T. and Fels, S. 2003. Contexts of Collaborative Musical Experiences. Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME-03). Montreal, Canada.

Blaine, T. and Perkis, T. 2000. Jam-O-Drum, a Study in Interaction Design. Proceedings of the 2000 Association for Computing Machinery Conference on Designing Interactive Systems (ACM DIS 2000). New York: ACM Press.

Bongers, B. 1998. An Interview with Sensorband. Computer Music Journal 22(1): 13–24.

Bongers, B. 1999. Exploring Novel Ways of Interaction in Musical Performance. Proceedings of the 1999 Creativity & Cognition Conference. Loughborough, UK, 76–81.

Bongers, B. 2000. Physical Interfaces in the Electronic Arts – Interaction Theory and Interfacing Techniques for Real-Time Performance. In M. M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.

Brown, C. and Bischoff, J. 2002. Indigenous to the Net: Early Network Music Bands in the San Francisco Bay Area. http://crossfade.walkerart.org/brownbischoff/IndigenoustotheNetPrint.html (accessed 6 February 2009).

Chadabe, J. 1997. Electric Sound: The Past and Promise of Electronic Music. Upper Saddle River, NJ: Prentice Hall.
Chadabe, J. 2005. The Meaning of Interaction, a Public Talk Given at the Workshop in Interactive Systems in Performance (WISP). Proceedings of the 2005 HCSNet Conference. Macquarie University, Sydney, Australia.

Dannenberg, R. B. 1984. An On-Line Algorithm for Real-Time Accompaniment. Proceedings of the 1984 International Computer Music Conference (ICMC-84). Paris, France: International Computer Music Association, 193–8.

Dean, R. T. 2003. Hyperimprovisation: Computer-Interactive Sound Improvisations. Middleton, WI: A-R Editions.

Eigenfeldt, A. and Kapur, A. 2008. An Agent-based System for Robotic Musical Performance. Proceedings of the 2008 International Conference on New Interfaces for Musical Expression (NIME-08). Genova, Italy, 144–9.

Goina, M. and Polotti, P. 2008. Elementary Gestalts for Gesture Sonification. Proceedings of the 2008 International Conference on New Interfaces for Musical Expression (NIME-08). Genova, Italy, 150–3.

Hunt, A. and Kirk, R. 2000. Mapping Strategies for Musical Performance. In M. M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.

Iazzetta, F. 2000. Meaning in Musical Gesture. In M. M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.

Jordà, S. 2005. Digital Lutherie: Crafting Musical Computers for New Musics' Performance and Improvisation. PhD dissertation, Universitat Pompeu Fabra, Barcelona.

Kaltenbrunner, M., Jordà, S., Geiger, G. and Alonso, M. 2006. The Reactable*: A Collaborative Musical Instrument. Proceedings of the 2006 Workshop on 'Tangible Interaction in Collaborative Environments' (TICE), at the 15th International IEEE Workshops on Enabling Technologies (WETICE 2006). Manchester, UK.

Lewis, G. E. 2000. Too Many Notes: Computers, Complexity and Culture in Voyager. Leonardo Music Journal 10: 33–9.

Lippe, C. 1993. A Composition for Clarinet and Real-Time Signal Processing: Using Max on the IRCAM Signal Processing Workstation. Proceedings of the 1993 10th Italian Colloquium on Computer Music. Milan, 428–32.

Machover, T. and Chung, J. 1989. Hyperinstruments: Musically Intelligent and Interactive Performance and Creativity Systems. Proceedings of the 1989 International Computer Music Conference (ICMC89). San Francisco: International Computer Music Association, 186–7.

Miranda, E. R. and Wanderley, M. 2006. New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. Middleton, WI: A-R Editions.

Mumma, G. 1967. Creative Aspects of Live Electronic Music Technology. http://www.brainwashed.com/mumma/creative.html (accessed 6 February 2009).

Orio, N., Lemouton, S. and Schwarz, D. 2003. Score Following: State of the Art and New Developments. Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME-03). Montreal, Canada.

Paine, G. 2002. Interactivity, Where to from Here? Organised Sound 7(3): 295–304.

Paine, G. 2007. Sonic Immersion: Interactive Engagement in Real-Time Immersive Environments. Scan: Journal of Media Arts Culture 4(1). http://scan.net.au/scan/journal/display.php?journal_id=90 (accessed 6 February 2009).

Patten, J., Recht, B. and Ishii, H. 2002. Audiopad: A Tag-Based Interface for Musical Performance. Proceedings of the 2002 International Conference on New Interfaces for Musical Expression (NIME-02). Dublin, Ireland.

Perkis, T. 1999. The Hub, an Article Written for Electronic Musician Magazine. http://www.perkis.com/wpc/w_hubem.html (accessed 6 February 2009).

Piringer, J. 2001. Elektronische Musik und Interaktivität: Prinzipien, Konzepte, Anwendungen [Electronic Music and Interactivity: Principles, Concepts, Applications]. Master's thesis, Technical University of Vienna.

Pressing, J. 1990. Cybernetic Issues in Interactive Performance Systems. Computer Music Journal 14(2): 12–25.

Puckette, M. and Lippe, C. 1992. Score Following in Practice. Proceedings of the 1992 International Computer Music Conference (ICMC92). San Francisco: International Computer Music Association, 182–5.

Richards, K. 2006. Report: Life after Wartime: A Suite of Multimedia Artworks. Canadian Journal of Communication 31(2): 447–59.

Riddell, A. 2005. Hypersense Complex: An Interactive Ensemble. Proceedings of the 2005 Australasian Computer Music Conference. Queensland University of Technology, Brisbane: Australasian Computer Music Association, 123–7.

Rowe, R. 1993. Interactive Music Systems: Machine Listening and Composing. Cambridge, MA: The MIT Press.

Schiemer, G. 1999. Improvising Machines: Spectral Dance and Token Objects. Leonardo Music Journal 9(1): 107–14.

Spiegel, L. 1987. A Short History of Intelligent Instruments. Computer Music Journal 11(3): 7–9.

Spiegel, L. 1992. Performing with Active Instruments – an Alternative to a Standard Taxonomy for Electronic and Computer Instruments. Computer Music Journal 16(3): 5–6.

Stelarc 1996. Stelarc. http://www.stelarc.va.com.au (accessed 6 February 2009).

Tarabella, L. 2004. Handel, a Free-Hands Gesture Recognition System. Proceedings of the 2004 Second International Symposium on Computer Music Modeling and Retrieval (CMMR 2004). Esbjerg, Denmark: Springer Berlin/Heidelberg, 139–48.

Vercoe, B. 1984. The Synthetic Performer in the Context of Live Performance. Proceedings of the 1984 International Computer Music Conference (ICMC84). Paris, France: International Computer Music Association, 199–200.

Waisvisz, M. 1985. The Hands, a Set of Remote MIDI-Controllers. Proceedings of the 1985 International Computer Music Conference. San Francisco, CA: International Computer Music Association, 86–9.

Wanderley, M. M. 2001. Gestural Control of Music. Proceedings of the 2001 International Workshop – Human Supervision and Control in Engineering and Music. Kassel, Germany.

Winkler, T. 1998. Composing Interactive Music: Techniques and Ideas Using Max. Cambridge, MA: The MIT Press.

Wright, M., Freed, A. and Momeni, A. 2003. Open Sound Control: State of the Art 2003. Proceedings of the 2003 International Conference on New Interfaces for Musical Expression (NIME-03). Montreal, Quebec, Canada.