To cite this article: Andy Hunt , Marcelo M. Wanderley & Matthew Paradis (2003) The
Importance of Parameter Mapping in Electronic Instrument Design, Journal of New Music
Research, 32:4, 429-440
Download by: [University of California, Berkeley] Date: 08 February 2016, At: 17:55
Journal of New Music Research 0929-8215/03/3204-429$16.00
2003, Vol. 32, No. 4, pp. 429–440 © Swets & Zeitlinger
Abstract
This paper presents a review of a series of experiments which have contributed towards the understanding of the mapping layer in electronic instruments. It challenges the assumption that an electronic instrument consists solely of an interface and a sound generator. It emphasises the importance of the mapping between input parameters and sound parameters, and suggests that this can define the very essence of an instrument. The terms involved with mapping are defined, and existing literature reviewed and summarised. A model for understanding the design of such mapping strategies for electronic instruments is put forward, along with a roadmap of ongoing research focussing on the testing and evaluation of such mapping strategies.

1. Introduction: Electronic instruments and the mapping layer

In an acoustic instrument, the playing interface is inherently bound up with the sound source. A violin's string is both part of the control mechanism and the sound generator. Since they are inseparable, the connections between the two are complex, subtle and determined by physical laws. With electronic and computer instruments, the situation is dramatically different. The interface is usually a completely separate piece of equipment from the sound source. This means that the relationship between them has to be defined (see Fig. 1). The art of connecting these two, traditionally inseparable, components of a real-time musical system (an art known as mapping) is not trivial. Indeed this paper hopes to stress that by altering the mapping, even keeping the interface and sound source constant, the entire character of the instrument is changed. Moreover, the emotional response elicited from the performer is shown to be determined to a great degree by the mapping. Whereas the input devices establish the physicality of the system and the synthesis methods govern the sound quality, the mapping somehow affects how the player reacts psychologically and musically to the instrument.

2. The importance of mapping

In this section we emphasise the dramatic effect that the style of mapping can have on "bringing an interface to life". We focus on our own experience in designing digital musical instruments and comment on several previous designs. An extensive review of the available literature on mapping in computer music has been presented by the authors (Hunt, 2000; Wanderley, 2001) and (Hunt & Wanderley, 2002). A special issue of the journal Organised Sound has also been recently devoted to Mapping Strategies in Computer Music (Wanderley, 2002) and we refer the reader to it for a varied review of different approaches to mapping.

2.1 Informal observations

The first author has carried out a number of experiments into mapping. The more formal of these have been presented in detail (Hunt & Kirk, 2000; Hunt, 2000), and are summarised later in this paper. We begin with some rather simpler, previously unpublished, observations that originally sparked interest in this subject. We have retained the first person writing style to denote that these are informal, personal reflections.
Fig. 1. A generic model of the mapping problem.
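The separation shown in Fig. 1 can be made concrete as an explicitly swappable function between the interface and the sound source. The sketch below is illustrative only; the paper defines no API, and all names and values here are our own assumptions.

```python
from typing import Callable, Dict

# Illustrative sketch of Fig. 1: the mapping is a separate, swappable layer
# between the input device and the sound generator.
ControlFrame = Dict[str, float]   # raw values read from the input device
SynthFrame = Dict[str, float]     # parameters handed to the sound generator
Mapping = Callable[[ControlFrame], SynthFrame]

def tick(read_controls: Callable[[], ControlFrame],
         mapping: Mapping,
         synthesize: Callable[[SynthFrame], None]) -> None:
    """One cycle of the real-time loop: interface -> mapping -> sound source."""
    synthesize(mapping(read_controls()))

# Two different mappings over the same hypothetical one-antenna interface:
def position_mapping(c: ControlFrame) -> SynthFrame:
    return {"amp": c["antenna"], "freq": 440.0}

def inverted_mapping(c: ControlFrame) -> SynthFrame:
    return {"amp": 1.0 - c["antenna"], "freq": 440.0}
```

Keeping `read_controls` and `synthesize` fixed while exchanging the mapping changes the behaviour of the instrument, which is the relationship this paper sets out to examine.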
2.1.1 The accidental theremin
Several years ago, I was invited to test out some final university projects in their prototype form in the lab. One of them was a recreation of a Theremin with modern electronic circuitry. What was particularly unusual about this was that a wiring mistake by a student meant that the "volume" antenna only worked when your hand was moving. In other words the sound was only heard when there was a rate-of-change of position, rather than the traditional position-only control. It was unexpectedly exciting to play. The volume hand needed to keep moving back and forth, rather like bowing an invisible violin. I noted the effect that this had on myself and the other impromptu players in the room. Because of the need to keep moving, it felt as if your own energy was directly responsible for the sound. When you stopped, it stopped. The subtleties of the bowing movement gave a complex texture to the amplitude. We were "hooked." It took rather a long time to prise each person away from the instrument, as it was so engaging. I returned a week later and noted with disappointment that the "mistake" had been corrected, deleted from the student's notes, and the traditional form of the instrument implemented.

2.1.2 Two sliders and two sound parameters

The above observation caused me to think about the psychological effect on the human player of "engagement" with an instrument. To investigate this further, I constructed a simple experiment. The interface for this experiment consisted of two sliders on a MIDI module, and the sound source was a single oscillator with amplitude and frequency controls. In the first run of the experiment the mapping was simply one-to-one, i.e., one slider directly controlled the volume, and the other directly controlled the pitch (Fig. 2).

I let several test subjects play freely with the instrument, and talked to them afterwards. In the second experimental run, the interface was re-configured to emulate the above-mentioned "accidental Theremin." One slider needed to be moved in order to make sound; the rate of change of movement controlled the oscillator's amplitude. But I decided to complicate matters (on purpose!) to study the effect that this had on the users. The pitch, which was mainly controlled by the first slider, operated "upside-down" to most people's expectations (i.e., pushing the slider up lowered the pitch). In addition the second slider (being moved for amplitude control) was used to mildly offset the pitch – i.e., it was cross-coupled to the first slider (Fig. 3).

Fig. 3. Complex Mapping for Experiment 2.

A remarkable consistency of reaction was noted over the six volunteers who tried both configurations. With Experiment 1, they all commented within seconds that they had discovered how the instrument worked (almost like giving it a mental "tick"; "yes, this is volume, and this is pitch"). They half-heartedly tried to play something for a maximum of two minutes, before declaring that they had "finished." Problem solved.

With Experiment 2, again there was a noted consistency of response. At first there were grumbles. "What on earth is this doing?" "Hey – this is affecting the pitch" (implied cries of "unfair," "foul play"). But they all struggled with it – interestingly for several more minutes than the total time they spent on Experiment 1. After a while, their bodies started to move, as they developed ways of offsetting one slider against […]
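The two configurations can be sketched in code. Slider values are taken here as normalised 0–1 and pitch as a 220–880 Hz range; these numbers, and the strength of the cross-coupling, are our illustrative assumptions rather than the experiment's actual values.

```python
# Sketches of the two mappings (Figs. 2 and 3). Ranges and constants are
# illustrative assumptions, not the values used in the experiments.

def experiment1(s1: float, s2: float) -> dict:
    """One-to-one mapping (Fig. 2): slider 1 -> volume, slider 2 -> pitch."""
    return {"amp": s1, "freq": 220.0 + 660.0 * s2}

def experiment2(s1: float, s2: float, prev_s2: float) -> dict:
    """Cross-coupled "accidental Theremin" mapping (Fig. 3):
    - amplitude follows the rate of change of slider 2 (no movement, no sound);
    - pitch is mainly slider 1, but inverted ("upside-down");
    - slider 2 also mildly offsets the pitch (cross-coupling)."""
    rate = abs(s2 - prev_s2)            # movement between two successive scans
    amp = min(1.0, 10.0 * rate)         # sounds only while the slider moves
    freq = 220.0 + 660.0 * (1.0 - s1)   # inverted pitch control
    freq += 40.0 * s2                   # mild cross-coupled pitch offset
    return {"amp": amp, "freq": freq}
```

With `experiment2`, a stationary second slider yields silence, reproducing the "sound only while moving" behaviour of the accidental Theremin.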
[Figure: interface conditions — mouse cursor on a blank screen; on-screen sliders, one for each parameter; sliders on a MIDI fader box; score.]
[…]meter simultaneously. Surprisingly perhaps, they were nearly all extremely frustrated by the four physical sliders. Comments abounded such as "I should be able to do this, technically, but I can't get my mind to split up the sound into these four finger controls." Some users actually got quite angry with the interface and with themselves. The multiparametric interface, on the other hand, was warmly received – although not at the very beginning. At first it seemed a difficult interface for most users, but they rapidly warmed to the fact that they could use complex gestural motions to control several simultaneous parameters without having to "de-code" them into individual streams. Many users remarked how "like an instrument" it was, or how "expressive" they felt they could be with it, but that it needed practice.

[…]users' (subjective) views on the comparative expressivity and learnability of each mapping, and the quality of musical control that could be achieved.

Whilst users were experimenting with these instruments an interesting trend was observed. The first (one-to-one) test did not seem to hold the users' attention. They treated it as a "task" to complete rather than an opportunity to experiment with a musical instrument. The second test received a more favorable response. The third test seemed to inspire the users to experiment and explore the interface to the point of composing and performing their own short melodies and musical gestures.

Users were asked to fill in a questionnaire relating to their feelings towards each experiment. The questionnaire asked […]
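The contrast the users describe — one widget per parameter versus complex gestures feeding several parameters at once — can be sketched as a many-to-many mapping over gesture features. The feature choices (position and velocity) and the weights below are our own illustrative assumptions, not the configuration used in the experiments.

```python
import math

def multiparametric(x: float, y: float, dx: float, dy: float) -> dict:
    """Many-to-many sketch: each sound parameter is a function of several
    gesture features, so no single control "owns" a parameter.
    Features and weights are illustrative assumptions."""
    speed = math.hypot(dx, dy)                 # the energy of the gesture
    energy = min(1.0, speed * 5.0)
    return {
        "amp": energy,                          # sound only while moving
        "freq": 220.0 + 660.0 * y + 80.0 * x,   # pitch drawn from both axes
        "timbre": 0.5 * x + 0.5 * energy,       # brightness couples position
        "vibrato": abs(dx) * 10.0,              # and movement together
    }
```

The point of the sketch is that the performer shapes all four parameters with one continuous gesture, rather than "de-coding" the sound into four independent finger controls.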
Fig. 10. The three mappings used in the clarinet simulation presented in Rovan (1997).
[…]compared to acoustic instruments such as the clarinet or saxophone. A common path to solving this problem involves improving the design of the controller by adding extra sensors. Rovan et al. (1997) discussed the fact that by altering the mapping layer in a digital musical instrument and keeping the interface (an off-the-shelf MIDI controller) and sound source unchanged, the essential quality of the instrument is changed regarding its control and expressive capabilities. Previous studies, notably by Buxton (1986a), presented evidence that input devices with similar characteristics (e.g., number of degrees of freedom) could lead to very different application situations depending on the way these characteristics were arranged in the device. In that study, however, the devices were not the same mechanically (one had two separate one-dimensional controllers and the other one two-dimensional controller), so the situation is different to using exactly the same input devices with different mapping strategies.

Another point became clear in this process: even if the WX7 was a reasonably faithful model of a saxophone providing the same types of control variables (breath, lip pressure and fingering), these variables worked totally independently in the MIDI controller, whereas they are cross-coupled in acoustic single-reed instruments. This natural cross-coupling is the result of the physical behaviour of the reed, and variables were simply independent since the equivalent "reed" in the controller was a plastic piece that did not vibrate, and moreover, was not coupled to an air column.

Based on these decisions and facts, the authors proposed different mappings between the WX7 variables and the synthesis parameters. The first was basically a one-to-one relationship, where variables were independent. The second was a model where the "virtual airflow" through the reed (loudness) was a function of both the breath and lip pressure (embouchure), as in an acoustic instrument. The third was a model that took into account both the "virtual airflow" and the relationship of spectrum content to breath and embouchure; a model that would match even more closely the real behaviour of the acoustic instrument.

The two most important consequences of this work were:

1. By just changing the mapping layer between the controller and the synthesis algorithm, it was indeed possible to completely change the instrumental behaviour and thus
the instrument's feel to the performer. Depending on the performer's previous experience and expectations, different mappings were preferred.

2. By deconstructing the way that the reed actually works, it was noted that the choice of mapping could be important as a pedagogical variable. Indeed, in stark contrast with acoustic instruments where the dependencies between parameters are unchangeable, cross-coupling between variables can easily be created or destroyed in digital musical instruments. This means that performers could focus on specific aspects of the instrument by explicitly defining its behaviour.

[…]acting in a unified manner. It is also associated with feelings of success and being uplifted – which certainly concurs with the experiences of the users of the multiparametric interface. This contrasts greatly with the anger and frustration experienced by users of the parallel sliders interface. This may come as a surprise to some, especially considering that such banks of physical sliders are known the world over as the basis of the mixing desk. Fitzmaurice et al. (1997) conclude that slider banks (and other space-multiplexed graspable interfaces) outperform time-multiplexed interfaces, such as the mouse. Does our work contradict their findings? No, but it does qualify them. One possible interpretation of Fitzmaurice et al. is that we should abandon the mouse and provide separate graspable interfaces, but our results show […]
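The three WX7-to-synthesis mappings described above can be sketched as follows. The functional forms and constants are our illustrative assumptions (Rovan et al. describe the models, not this code), with breath and lip pressure normalised to 0–1.

```python
# Illustrative sketches of the three clarinet-simulation mappings.
# All formulas and constants are assumptions, not the published models.

def mapping_one_to_one(breath: float, lip: float) -> dict:
    """Mapping 1: independent variables, as on the raw MIDI controller."""
    return {"loudness": breath, "brightness": lip}

def mapping_virtual_airflow(breath: float, lip: float) -> dict:
    """Mapping 2: loudness ("virtual airflow") depends on both breath and
    lip pressure (embouchure), as in an acoustic single-reed instrument."""
    airflow = breath * (0.5 + 0.5 * lip)
    return {"loudness": airflow, "brightness": lip}

def mapping_airflow_and_spectrum(breath: float, lip: float) -> dict:
    """Mapping 3: spectrum content is also coupled to breath and embouchure,
    matching the acoustic behaviour still more closely."""
    airflow = breath * (0.5 + 0.5 * lip)
    return {"loudness": airflow, "brightness": 0.3 * lip + 0.7 * airflow}
```

The interface and sound source are identical in all three cases; only the cross-coupling between the control variables differs.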
[…]in the other sense: for the same controller and the same set of parameters, multiple synthesis techniques could be used by just adapting the second mapping layer, the first being held constant. Specifically in this case, the choice of synthesis algorithm is transparent for the user.

The two-layered model has been expanded to include three mapping layers (Hunt & Wanderley, 2002; Arfib et al., 2002). Figure 12 depicts a model containing these three mapping layers.

[Figure 12: mapping layers between Controller Parameters and Meaningful Parameters.]

These works support the idea that, by using multi-layered mappings, one can obtain a level of flexibility in the design of instruments and that moreover, these models can indeed accommodate the control of different media, such as sound and video, in a coherent way. Section 5.1 describes […]
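The layered idea can be sketched as composable functions: a first layer from controller parameters to perceptually meaningful parameters, and a second layer from those to the parameters of a particular synthesis engine. All parameter names and formulas below are our illustrative assumptions.

```python
# Two-layer mapping sketch. Layer 1 is fixed for the performer; swapping the
# synthesis technique only replaces layer 2. Names are illustrative.

def layer1(controls: dict) -> dict:
    """Controller parameters -> perceptually meaningful parameters."""
    return {"loudness": controls["breath"],
            "pitch": controls["key"],          # octaves above a base pitch
            "brightness": controls["lip"]}

def layer2_additive(m: dict) -> dict:
    """Meaningful parameters -> additive-synthesis parameters."""
    n = int(1 + 15 * m["brightness"])           # brighter -> more partials
    return {"partial_amps": [m["loudness"] / (k + 1) for k in range(n)],
            "f0": 220.0 * 2 ** m["pitch"]}

def layer2_fm(m: dict) -> dict:
    """The same meaningful parameters -> FM-synthesis parameters."""
    return {"carrier": 220.0 * 2 ** m["pitch"],
            "index": 8.0 * m["brightness"],     # brighter -> higher index
            "amp": m["loudness"]}
```

Swapping `layer2_additive` for `layer2_fm` changes the synthesis technique while the performer-facing layer stays constant, which is what makes the choice of algorithm transparent to the user.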
[…]ments for the construction of such mappings by answering the following questions.

How do we define the mapping layer?

In order to fully understand the workings of the mapping layer it must firstly be fully defined. This will give researchers a clear structure and vocabulary with which to refer to elements of a system. Current research concentrates on the mapping layer as a high level entity. Strategies are defined (one-to-one etc.) along with layers which can carry out specific functions. It is proposed that a mapping layer should be studied at a much lower level in order to fully understand the implications of a given parameter configuration.

[…]collection and analysis of data over time relating to the accuracy, speed, expressive control and suitability of interfaces for specific tasks.

How can we measure the performance of a mapping strategy?

How can the subjective data taken from a user's experience of an interface be represented with actual performance data (data that can be recorded whilst a user is carrying out a specific task)? Can trends be observed in objective data to represent expressivity, or the enjoyment that a user experiences in their interaction?

In all interface design the end result is ultimately a user […]
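One possible way to study a mapping "at a much lower level" is to make its parameter dependencies explicit, for example as a boolean matrix from which the strategy can be read off mechanically. This representation is our illustration, not a formalism proposed in the paper.

```python
# Represent which input parameter influences which synthesis parameter as a
# boolean matrix, and classify the mapping strategy from its fan-in/fan-out.
# This low-level representation is an illustrative assumption.

def classify(dep: list) -> str:
    """dep[i][j] is True if input i influences output j."""
    fan_out = [sum(row) for row in dep]           # outputs driven per input
    fan_in = [sum(col) for col in zip(*dep)]      # inputs feeding each output
    if all(f <= 1 for f in fan_out) and all(f <= 1 for f in fan_in):
        return "one-to-one"
    if all(f <= 1 for f in fan_in):
        return "one-to-many"
    if all(f <= 1 for f in fan_out):
        return "many-to-one"
    return "many-to-many"

# Experiment 1 (Fig. 2): each slider drives exactly one sound parameter.
print(classify([[True, False], [False, True]]))   # one-to-one
# Experiment 2 (Fig. 3): slider 2 drives amplitude and offsets the pitch.
print(classify([[False, True], [True, True]]))    # many-to-many
```

A matrix like this captures only the presence of a dependency; studying the implications of a full parameter configuration would also require the shape and strength of each coupling.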
[…]interface effectiveness. We experimented with a multi-layer model to help designers to implement complex but usable mappings, and are carrying out a programme of research to look into the testing and evaluation of mapping strategies.

We are still in the early stages of understanding the complexities of how the mapping layer affects the perception (and the playability) of an electronic instrument by its performer. What we know is that it is a very important layer, and one that must not be overlooked by the designers of new instruments.

Acknowledgements

Many thanks are due to the following people for their collaboration in the work described in this paper: Ross Kirk, Butch Rovan, Shlomo Dubnov, Philippe Depalle, and Norbert Schnell.

References

Arfib, D., Couturier, J., Kessous, L., & Verfaille, V. (2002). Strategies of mapping between gesture data and synthesis model parameters using perceptual spaces. Organised Sound, 7, 127–144.
Burzik, A. (2002). A holistic method of practice for all instrumentalists. Available: http://www.practising-in-flow.de/ Last visited on 10/04/03.
Buxton, W. (1986a). There's more to interaction than meets the eye: Some issues in manual input. In: D. Norman, & S.W. Draper (Eds.), User Centered System Design: New Perspectives on Human-Computer Interaction. Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 319–337.
Buxton, W. (1986b). Chunking and phrasing and the design of human-computer dialogues. In: H.J. Kugler (Ed.), Information Processing '86, Proceedings of the IFIP 10th World Computer Congress, North Holland, pp. 475–480.
Cadoz, C. (1999). Continuum énergétique du geste au son. Simulation multisensorielle interactive d'objets physiques. In: H. Vinet, & F. Delalande (Eds.), Interfaces Homme-Machine et Création Musicale. Paris: Hermès Science Publishing, pp. 165–181.
Cadoz, C., & Wanderley, M. (2000). Gesture-Music. In: M. Wanderley, & M. Battier (Eds.), Trends in Gestural Control of Music. Paris: IRCAM – Centre Pompidou, pp. 71–93.
Csikszentmihalyi, M. (1975). Beyond boredom and anxiety: Experiencing flow in work and play. (reprint, Jossey-Bass Wiley, 2000, ISBN 0787951404).
Fitzmaurice, G., & Buxton, B. (1997). An empirical evaluation of graspable user interfaces: Towards specialized, space-multiplexed input. Proc. ACM CHI 1997, pp. 43–50. http://www.acm.org/sigchi/chi97/proceedings/paper/gf.htm Last visited on 10/04/03.
Garnett, G., & Goudeseune, C. (1999). Performance factors in control of high-dimensional spaces. Proc. 1999 International Computer Music Conference. San Francisco, CA: International Computer Music Association, pp. 268–271.
Goudeseune, C. (2002). Interpolated mappings for musical instruments. Organised Sound, 7, 85–96.
Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. Journal of Motor Behaviour, 19, 486–517.
Hunt, A. (2000). Radical user interfaces for real-time musical control. DPhil thesis, University of York, UK. Available: http://www-users.york.ac.uk/~elec18/download/adh_thesis/
Hunt, A., & Kirk, R. (2000). Mapping strategies for musical performance. In: M. Wanderley, & M. Battier (Eds.), Trends in Gestural Control of Music. Paris: IRCAM – Centre Pompidou.
Hunt, A., & Wanderley, M.M. (Eds.) (2000). Mapping of control variables to musical variables. Interactive systems and instrument design in music working group. Website: www.igmusic.org
Hunt, A., & Wanderley, M.M. (2002). Mapping performance parameters to synthesis engines. Organised Sound, 7, 103–114.
Hunt, A., Wanderley, M.M., & Kirk, R. (2000). Towards a model for instrumental mapping in expert musical interaction. In: Proc. of the 2000 International Computer Music Conference. San Francisco, CA: International Computer Music Association, pp. 209–211.
Jacob, R., Sibert, L., McFarlane, D., & Mullen, M. (1994). Integrality and separability of input devices. ACM Transactions on Computer-Human Interaction, 1, 3–26.
Kabbash, P., Buxton, B., & Sellen, A. (1994). Two-handed input in a compound task. Proc. SIGCHI Conf. Human Factors in Computing Systems, pp. 417–423.
Métois, E. (1996). Musical sound information: Musical gestures and embedding systems. PhD thesis, MIT Media Lab.
Mulder, A., Fels, S., & Mase, K. (1997). Empty-handed gesture analysis in Max/FTS. In: Kansei, The Technology of Emotion. Proceedings of the AIMI International Workshop, A. Camurri (Ed.), Genoa: Associazione di Informatica Musicale Italiana, October 3–4, pp. 87–91.
Mulder, A., Fels, S., & Mase, K. (1997). Mapping virtual object manipulation to sound variation. In: T. Rai, & R. Basset (Eds.), IPSJ SIG Notes, 97, 63–68.
Mulder, A., & Fels, S. (1998). Sound sculpting: Manipulating sound through virtual sculpting. Proc. 1998 Western Computer Graphics Symposium, pp. 15–23.
Rovan, J.B., Wanderley, M.M., Dubnov, S., & Depalle, P. (1997). Instrumental gestural mapping strategies as expressivity determinants in computer music performance. In: Kansei, The Technology of Emotion. Proceedings of the AIMI International Workshop, A. Camurri (Ed.), Genoa: Associazione di Informatica Musicale Italiana, October 3–4, pp. 68–73.
Simon, H. (1996). The Sciences of the Artificial. MIT Press, 3rd edition, ISBN 0262691914.
Wanderley, M.M. (Ed.) (2000). Interactive systems and instrument design in music workgroup. Website: www.igmusic.org Last visited on 10/04/03.
Wanderley, M.M. (2001). Performer-instrument interaction. Application to gestural control of sound synthesis. PhD thesis, University Paris VI, France.
Wanderley, M.M. (Ed.) (2002). Mapping strategies for real-time computer music. Organised Sound, 7.
Wanderley, M.M., & Depalle, P. (1999). Contrôle gestuel de la synthèse sonore. In: H. Vinet, & F. Delalande (Eds.), Interfaces Homme-Machine et Création Musicale. Paris: Hermès Science Publishing, pp. 145–163.
Wanderley, M.M., Schnell, N., & Rovan, J. (1998). Escher – Modeling and performing "composed instruments" in real-time. Proc. IEEE International Conference on Systems, Man and Cybernetics (SMC'98), San Diego, CA, pp. 1080–1084.
Wessel, D. (1979). Timbre space as a musical control structure. Computer Music Journal, 3, 45–52.