Seeking Out the Spaces Between: Using Improvisation in Collaborative Composition with Interactive Technology

Sarah Nicolls

ABSTRACT

This article presents findings from experiments into piano and live electronics undertaken by the author since early 2007. The use of improvisation has infused every step of the process - both as a methodology to obtain meaningful results using interactive technology and as a way to generate and characterize a collaborative musical "performance space" with composers. The technology used has included pre-built MIDI interfaces such as the PianoBar, actuators such as DC motors and miniature speakers, and interfaces including iCube and the Wii controller. Collaborators have included researchers at the Centre for Digital Music (QMUL), Richard Barrett, Pierre Alexandre Tremblay and Atau Tanaka.

Context

In seeking to create "responsive environments" at the piano, I explore live, performative control of electronics to create better connections for both performer (providing the same level of interpretive freedom as with a "pure" instrumental performance) and audience (communicating clearly to them). I have been lucky to witness first-hand many live interactive performances and to work with various empathetic composers/performers in flexible working environments. Collaborating with experienced technologists and musicians, I have witnessed time and again what, for me, is a fundamental truth in interactive instrumental performance: As a living, spontaneous form it must be nurtured and informed by the performer's physicality and imagination as much as by the creativity or knowledge of the composer and/or technologist.

Specifically in the case of sensors, their dependence on the detail of each person's body and reactions is so refined as to necessitate, I would argue, an entirely collaborative approach and therefore one that involves at least directed improvisation and, more likely, fairly extensive improvised exploration. The fundamentally personal and intimate nature of sensor readings - the amount of tension created by each performer, the shape of the ancillary gestures or the level of emotional involvement (especially relevant when using galvanic skin response or EEG) - makes creating pieces with sensors extremely difficult for a composer to do in isolation. Improvisation therefore provides a way for performer and composer to generate a common musical and gestural language.

Related to these issues is the fact that the technical and notational parameters in interactive music are not yet (and may never be) standardized, thereby creating a very real and practical need for improvisation to figure at least somewhere in the process.

Many practitioners in the field of live performance with electronics make their own interfaces or instruments with which they improvise; this is readily demonstrated by communities such as New Interfaces for Musical Expression (NIME), inspired by leading figures such as Nicolas Collins and Michel Waisvisz. From what is now a hugely broad field, performances I have witnessed recently that seem most relevant, either through their use of physical drama or a particular technology, include Chikashi Miyama ("Angry Sparrow"), in his dazzling, virtuosic and humorous performance on a self-made interface [1], and Derek Hölzer [2], whose optical discs are attached to spinning motors and thrust under an overhead projector for instant, rough-and-ready multimedia effect.

Relevant improvised sensor performances include Atau Tanaka (Sensors_Sonics_Sights) (SSS) [3] and Benjamin Knapp [4]; both use the BioMuse sensor system, which was invented by Knapp and Hugh Lusted [5]. Tanaka and Knapp present a fascinating contrast, as they use the same system to quite opposite ends: Tanaka is a highly gestural, physically active and expressive performer, while Knapp performs seated and - using sensors including EEG and galvanic skin response - plays with emotional readings, generating music from a quite inward control of his internal self. At MIT, Elena Jessop developed a beautifully intuitive glove [6], which enables her, for example, to grab notes seemingly from her mouth and lengthen them by pulling away from the face smoothly.

The individuality of each of these performers only strengthens the case that improvisation is not only a way of generating music but also the key to inventing and learning a host of new instruments, interfaces or systems of interaction.

Sarah Nicolls (artist, educator), Centre for Contemporary Music Practice, School of Arts, Brunel University, Kingston Lane, Uxbridge, UB8 3PH, U.K. E-mail: <info@sarahnicolls.com>. Web site: <www.sarahnicolls.com>.

See <mitpressjournals.org/lmj/-/20> and <www.sarahnicolls.com> for supplemental files related to this article.

Co-author on Case Study 1: Richard Barrett, Wilhelm-Stolze-Strasse 30, 10249 Berlin, Germany. E-mail: <richard@furtlogic.com>.

Co-authors on Case Study 4: Samer Abdallah, Kurt Jacobson, Andrew Robertson, Adam Stark and Nick Bryan-Kinns, Centre for Digital Music, Queen Mary, University of London (QMUL), Mile End Road, London E1 4NS, U.K. E-mail: c/o <adam.stark@elec.qmul.ac.uk>. Web: <http://www.eecs.qmul.ac.uk/~nickbk>.

Physicality

The consideration of physicality is one of the key aspects in creating instrumental performances designed to give control of the electronics to the performer. Several texts affirm the importance of the physicality inherently learnt and absorbed as part of instrumental study.

John Richards's article "Lost and Found" [7] has an array of excellent quotations, the most succinct of which is Bob Ostertag's Human Bodies, Computer Music: "An intelligence and creativity is actually written into the artist's muscle and bones and blood and skin and hair" [8]. O'Modhrain and Essl make a similarly poetic, yet astute, point: "Implicit in the experienced musician's understanding of the relationship between action and sound, between performance gesture and musical phrase, is a more or less complex internal representation of the dynamics of an instrument" [9]. This notion that there is an imprint of the performer's instrument within the performer's body is vital when considering why improvisation might be necessary in generating the language and parameters for performing interactive instrumental music.

To improvise means at one level to follow one's instinctive urges, the internal reactions and responses that could be referred to as pre-analysis in the performer's own cognitive process. If this is then paired with a physical internal awareness of the capabilities of one's setup or instrument, then the responses will logically be faster and more innate, intuitive and highly responsive than if one or another is non-instinctive. If the performer can intuitively know the edges of physical possibility for the sensors - where the highest and lowest readings are found, for example - then the manipulation of these will be managed most deftly.

Instrumentalists have finely tuned systems of tactile or physical feedback (both external, when touching keys etc., and internal - knowing when or how to relax when playing fast, for example), and in working with interactive technology, muscle memory gets built up in a similar way. Also crucial to this discussion: When creating new composed pieces with an interactive setup, to have the performer improvise with the technology means to unlock this inner physical language, to find both what is possible and natural and also what is unnatural, or outside of the natural body language: "the spaces between pianism" in my case. This then allows for the fundamental aesthetic judgment of whether to make the interactive control in addition to, or part of, the instrumental playing.
Fig. 1. Examples of Tremblay's score. (© Pierre Alexandre Tremblay)

Fig. 2. Tremblay's schema. (© Pierre Alexandre Tremblay)

How the use of physicality may change the original gesture also needs consideration. The writings of Wanderley and Cadoz [10] on this topic are well known; their discussion is furthered by Wanderley and Miranda's extensive 2006 study of new instruments:

The instrumental gesture ... is applied to a concrete (material) object with which there is physical interaction; specific (physical) phenomena are produced during a physical interaction whose forms and dynamics can be mastered by the subject. These phenomena may become the support for communicational messages and/or be the basis for the production of a material action [11].

The consideration of how adding a sensor to a pianist's arm may affect both the pianist's and the audience's relationships to the original semiotic function of the gesture was one of the main questions resulting from work on Case Study 3. Although not a central issue for this article, I briefly illustrate the problem I found here. Imagine the pianist lifting the arm away from the keyboard, perhaps signifying a breath between musical phrases. When using this gesture to generate data and, in turn, process sound, I found that I would focus on playing the sensors, thereby turning the previously nearly subconscious movement into a material action. As a solo performer is only one body, one mind, these cycles of complexity and confusion perhaps begin to disrupt the artistic spontaneity and intuitive physical sense, potentially undermining the original meaning of the gesture.

Case Study 1

Richard Barrett's Adrift (2007) was commissioned as part of my first Arts and Humanities Research Council (AHRC)-funded project in 2007 [12], which sought to increase the repertoire for piano and live electronics. I had already performed Barrett's Lost, and this became the foundation for Adrift [13]. Essentially, Adrift amplified Lost into a semi-improvised duet for Barrett and me (with Barrett playing his keyboard system using STEIM's LiSa program). He began the compositional process by recording my performance of Lost and chopping it into upwards of 70 sections. These were reordered and gradually shifted in pitch, the first ones very slightly, increasing as the piece progressed. The degree of other processes (filtering, short delays and feedback) also generally increased, so that the result gradually diverged in pitch and timbre from the original.

Our pre-made parts (my part: performing the Lost score, and Barrett's part: playing the new recorded version) were now a basis for Adrift, a consistent continuation (into the real time of performance) of the compositional process that gives Lost its particular structure: taking basic material and interpolating more and more inserts into it until the original material becomes almost literally lost in its own extrapolations, distortions, reflections, etc. Either performer in Adrift could interrupt her/his given part at any time and interpolate an improvised passage before continuing from the same point where he/she left off (as if using a pause button).

What was fascinating was how, having ingested Lost through hours of practice, I found myself quite naturally and subconsciously improvising in Barrett's compositional language. Helped by having rehearsed in close proximity over several sessions and having witnessed several performances by Barrett in his groups FURT and fORCH [14], I effectively internalized the physical language that accompanies his music.

In the live performance [15], Barrett sat at the other end of the piano, facing me, in the position of a second pianist in a two-piano work, and the amplification was local (a stereo pair was placed at either end of the piano). Thus, both of us performed toward each other, creating an intimate mirror image.

Barrett's experience of improvisation within the compositional process has constantly evolved; he wrote the following to me after our collaboration:

The more I attempt to define what improvisation is, the more it seems to slip through the fingers; if, for example, it is defined as those aspects of musical creation which are spontaneous or unplanned, you run into difficulties. So instead, I prefer to think of "composition" as defining the act of bringing music into being, and "improvisation" as one element among various means by which that might be brought about. Thus, it isn't really a matter of bringing "improvisation" and "composition" together, which at first I thought it was: it is more a question of realizing that they aren't really two different things [16].

It is interesting to note that as the pianist, I was able to move from studying Barrett's compositional language to improvising something comparable, while as composer Barrett used improvisation to seek out his compositional language. This cyclical process informed the project and helped to create a fine balance between freedom and a stylized, consistent musical language. Although the interpreter was relatively free, the composer's voice in fact infused all aspects of the music.

Case Study 2

Pierre Alexandre Tremblay's Un clou, son marteau et le béton (2008) [17] illustrates the use of improvisation in generating material, finding a common and genuinely cumulative language between composer and performer, and creating a long piece of music (around 22 minutes), which is rigorously composed, yet with only approximately 36 bars of music written on a stave. I had heard La Rage, Tremblay's 50-minute suite for free-jazz drummer and electronics, showing Tremblay's ability to frame a multi-dimensional performer/machine interaction, combining composition and improvisation. Tremblay and I used improvisation from the outset, improvising together at first to get to know each other as musicians, with Tremblay on laptop and bass guitar. We then began the piece using bare-boned notation that I improvised upon to test musical gestures and specific real-time processing. Tremblay also asked me to improvise freely within some settings: over a fixed electronic part, or within some real-time processing that I would subvert with my own musical inputs. We recorded the results to use as triggers for the next session. Tremblay created notation (mixtures of text, conventional and guided improvisation) to show how he would interpret what I had played, which allowed me to feed back on what was communicated to me, building up the most efficient or relevant language for the piece (Fig. 1). This process enabled us to understand each other's perceptive interpretations of symbol, word and sound.

We had several sessions like this, gradually obtaining passages that we could easily re-create through notational shorthand and that could also be understood by someone else coming fresh to the score. Sections of the piece used systematic processes, thereby limiting the need for notation; one example is a section in which the computer and I built up an intensifying call and response, with the computer taking my notes and reordering and speeding them up, and I then imitating its rhythmic profile with new pitches. Other sections used a bias from the background to direct the improvisation subliminally: for instance, a relatively free section with a given fixed electronic part allows imposition of a target on an improviser without explicitly giving musical instructions.

Tremblay then wrote the piece, using either audio control signals (certain audible pitches or the creation or absence of sound) or direct inputs (we used the PianoBar - a MIDI device placed over the keys of the piano, reading the pitch and velocity of each key when played) (Fig. 2). Because the piece used different input messages (for example, depressing keys or making vocal noises) to the patch at different points in the piece, it felt highly responsive in performance: It constantly shifted my attention, and the process thus felt akin to performing with other live musicians. Indeed, the resulting piece I found to be such a detailed web of interactivity, with many subtle changes of technological response, that it felt very much as if I were improvising with Tremblay himself: The "performance environment" engaged me in a living, breathing way. What fascinates me is how strict yet supple the piece is, and I conclude that using improvisation as a methodology enabled this commonality to develop.

Fig. 3. Atau Tanaka wearing the EMG sensors in the positions we used. (Photo © Sarah Nicolls)

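The systematic call-and-response process described above in Case Study 2 - the computer taking my notes, reordering and speeding them up, with the performer answering in the same rhythmic profile but with new pitches - can be caricatured in a few lines of code. The sketch below is a hypothetical Python illustration, not Tremblay's patch (which worked in real time from the PianoBar's MIDI data); the note representation and function names are invented for the example.

import random

def machine_response(phrase, speed_up=1.5, seed=None):
    """Hypothetical 'computer answer': reorder the performer's notes and
    compress their durations. A note is a (midi_pitch, duration) pair."""
    rng = random.Random(seed)
    reordered = list(phrase)
    rng.shuffle(reordered)                                        # reorder the notes
    return [(pitch, dur / speed_up) for pitch, dur in reordered]  # speed them up

def performer_imitation(response, new_pitches):
    """The performer keeps the machine's rhythmic profile but answers
    with freshly improvised pitches."""
    return [(new_pitches[i % len(new_pitches)], dur)
            for i, (_, dur) in enumerate(response)]

played = [(60, 0.5), (64, 0.25), (67, 0.25), (72, 1.0)]   # invented phrase
answer = machine_response(played, speed_up=2.0, seed=1)
reply = performer_imitation(answer, new_pitches=[62, 65, 69])
print("machine:", answer)
print("performer:", reply)

Iterating this exchange, with each answer growing shorter and denser, gives the kind of intensifying call and response described in the text.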
Fig. 4. Practicing for the collaboration with Centre for Digital Music. (Photo © Sarah Nicolls)

Case Study 3

Atau Tanaka's Suspensions [18] shows how improvisation was used to generate the grammatical or theatrical language for an interactive system. We used one EMG sensor (reading electrical currents created by muscle contraction) on each arm (on the forearm extensor muscles) (Fig. 3) and a double-axis accelerometer on the right wrist. We set out to create a piece that would combine my natural gestural/physical/emotional approach to the piano - including improvisation in the final performance - with Tanaka's compositional language and detailed and practical knowledge of the sensors. Although we used his physical performance language as a basis for my learning to "play" the sensors (i.e. his gestural shapes and tricks to create the right readings), we allowed room, using improvisation, for my own performative language to develop.

After generating some initial motivic and textural musical ideas through improvisation, I further improvised upon these while wearing the sensors to see what kind of data they would generate. In two or three initial sessions together, I took the sensors and began to understand what they did, by simply making gestures and watching their resultant data in a patch. This soon became too limited, so we made a frame patch with which I could practice using sonic feedback. From this point, I practiced purely with the sensors - using my arms in mid-air to find the thresholds and to find gestures that produced the right amount of muscle tension and to begin to memorize where and how I needed to be to create useful signals - internalizing the "instrument" of the sensors.

Returning to the piano was difficult after this extended period of learning the sensors. Space here is not sufficient to go into more detail about the process, but after giving a work-in-progress showing to a theater-trained audience I worked to restore the relationship of sensor use to pianistic gesture. The tangent of performing without the piano in a theatrical setting however did give useful insights into meanings attached to gestures and also raised questions about whether or not it was desirable to reveal the technology [19] and how much I wanted the audience to understand the connections between gesture and sound. Again, Tanaka and Knapp serve as useful ends of the spectrum in this case, with Knapp wearing his sensors hidden beneath a suit jacket while Tanaka wears his much more openly, almost reveling in them.

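Finding the thresholds described above - locating the highest and lowest useful readings and learning how much tension produces them - is in effect a calibration loop between body and patch. The sketch below is a minimal, hypothetical Python rendering of that idea, not the BioMuse-based system actually used; the smoothing coefficient and the example readings are invented.

class EMGControl:
    """Toy model of one EMG channel: rectify and smooth raw readings,
    then normalize them against limits found during improvised exploration."""

    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing    # one-pole smoothing coefficient (assumed)
        self.envelope = 0.0
        self.low = float("inf")       # lowest envelope seen (relaxed arm)
        self.high = float("-inf")     # highest envelope seen (full tension)

    def _smooth(self, raw):
        self.envelope = (self.smoothing * self.envelope
                         + (1.0 - self.smoothing) * abs(raw))
        return self.envelope

    def calibrate(self, raw):
        """During exploration, remember the extremes of the envelope."""
        env = self._smooth(raw)
        self.low = min(self.low, env)
        self.high = max(self.high, env)

    def control_value(self, raw):
        """In performance, map the current envelope to a 0-1 control value."""
        env = self._smooth(raw)
        span = max(self.high - self.low, 1e-9)
        return min(max((env - self.low) / span, 0.0), 1.0)

channel = EMGControl()
for sample in [0.02, -0.03, 0.4, -0.8, 0.9, -0.1]:   # invented raw readings
    channel.calibrate(sample)
print(channel.control_value(0.5))

A threshold on the resulting 0-1 value can then start or stop a sound, while its continuous range can drive a processing parameter; memorizing how the arm must feel to sit just above or below such a threshold is precisely the internalization described in the text.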
Fig. 5. Schema for Case Study 4 resulting piece. (© Adam Stark) [Diagram labels: Main Control: MIDI Foot Pedal to Switch Between Technologies; Tilt Sensor in Hat (Worn by Performer); Granular Delay; Onset Triggered Motors on Strings; Audio In from Piano; Contact Microphones; Delay Effect; Onset Detector; Patch Implemented in Ableton Live; Samples; Audio Out; Piano.]

For me, the single most successful moment in the proceedings took place when improvising with a sampled chord "in my arm" (i.e. an EMG sensor on my arm was mapped to a sample). I sat at the piano and when approaching the keyboard increased the tension in my arm. I began triggering the very, very beginnings of the sound - much like the sounds of breathing or bowing before a note speaks on a string or wind instrument - however, instead of then allowing the actual pitched sound to come out of the computer, I instead played the same chord on the piano. This moment encapsulated for me a genuinely new approach to the piano - one that could only be enabled by this technology - creating a fascinating and intimate space in which to explore the relationships of the sensors to the pianistic performance.

This scenario might never have been realized without improvisation, as it was the combination of Tanaka and his assistant's ideas and computer expertise and my own pianistic approach (again, referring back to earlier descriptions of the internalized imprint of the piano inside my own body) that gave life to this idea. This, I think, is the nub of why improvisation is such a useful tool: It allows the performer to be responsive to the moment, to the environment and to the accidents and discoveries that we intuitively find.

Case Study 4

In collaborating with the Centre for Digital Music [20] I sought to create an interactive instrumental performance, with flexible performer-computer interaction that would produce live generative computer algorithms and give the player both significant control and room to be surprised by a computer's responses. We hoped to answer some of the challenges discussed as far back as 1973 by Cornock and Edmonds [21], creating a circular performative feedback loop to make the relationship between algorithms and the physicality of the performance seamless and meaningful. We wanted the performer to provide input to generative algorithms - responding to and modifying them in real time - and simultaneously make an engaging spectacle for the audience.

Methodology

Collaborative Process

For this project, I invented PianoLab: a research space that would allow for real-time, genuinely collaborative, evolutionary research. Crucially, it placed a piano at the heart of the research environment and provided room to build electronics and house several computer stations. As it was the first PianoLab project, its methods of research and implementation were developed as we worked, to solve the balance between what was practical or possible artistically and technically.

The team of four technologists and I worked for an intense week of iterative prototyping. Working in the same room, with ongoing experimentation as part of the development process, we designed, implemented and tested the technologies with active feedback from me. Improvisation formed the bedrock for our research, as the key focus was always how the technology could be used to allow the performer to contribute to live algorithms, manipulate the feedback and create an engaging spectacle.

Compositional Process

In an early brainstorming session we came up with the idea of using a hat to house a Bluetooth triple-axis accelerometer (iCube Gforce 3D-3 v1.1). This would give us an easily identifiable and highly performative input mechanism. I then asked the technologists to show me their current work, to get an idea of what might be possible in the time we had (approximately 2 months to the final performance, with 3 weeks allotted collaborative working time).
Fig. 6. First prototype of Nicolls's piano. (Photo © Sarah Nicolls)

The piece evolved as a section-by-section improvisation; I would practice with pre-built systems or patches and suggest ideas that could be created immediately. The use of improvisation was a vital mechanism for me to understand what the pre-existing software systems did and how they might interact with the live, acoustic piano sound. Having decided upon the hat as a major input device, I also wore it while exploring different pianistic textures and simultaneously seeking out a gestural language with my head.

Our final output was a 20-minute work for grand piano, electronic sound and mechanical devices created using a MIDI controller and pedal, nine DC motors and the top hat (Fig. 4) (the performance can be viewed on-line [22]). The piano was placed in the middle of a quadraphonic speaker system, using contact microphones to avoid feedback.

Technology Notes

The sensors in the hat triggered an algorithm, which we developed to map the tilt of the accelerometer to a 2D parameter space with x and y mapped to the pitch and temporal dispersion parameters of a granular synthesis effect (Fig. 5). For the looping patch, analysis of the spectral range of the live piano audio was used as an onset and offset detector to trigger suitable start and end points for the loop. These loops were continuously stored as indices of an audio buffer with a memory of 1 minute. New onset events triggered the playing of previously recorded loops in a stochastic manner. The system was designed so that the performer would be able to fix the loop being used so that it could provide a repetitive background for further improvisation. The rhythmic patterns used by the motors and piano samples were generated using Markov chains organized by varying degrees of predictability, selectable by the performer using a MIDI controller.
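As a rough illustration of the first mapping described above, the sketch below folds a two-axis tilt reading into the two granular parameters. It is a hypothetical Python rendering rather than the patch we actually used in Ableton Live, and the axis and parameter ranges are invented for the example.

def tilt_to_granular(x_tilt, y_tilt,
                     pitch_range=(-12.0, 12.0),      # semitones (assumed range)
                     dispersion_range=(0.01, 0.5)):  # seconds (assumed range)
    """Map accelerometer tilt (each axis assumed in -1..1) onto the pitch
    and temporal-dispersion parameters of a granular synthesis effect."""
    def scale(value, lo, hi):
        value = max(-1.0, min(1.0, value))     # clamp to the sensor range
        return lo + (value + 1.0) * 0.5 * (hi - lo)

    pitch_shift = scale(x_tilt, *pitch_range)
    dispersion = scale(y_tilt, *dispersion_range)
    return pitch_shift, dispersion

# Example: head tilted forward and to one side
print(tilt_to_granular(-0.4, 0.7))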

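The Markov-chain rhythm generation with performer-selectable predictability might look something like the following sketch - again a hypothetical Python outline, not the patch that actually drove the motors and piano samples. Here "predictability" blends a learned transition table with a uniform distribution, so a single MIDI fader could sweep the output from near-repetition to near-randomness; the transition table itself is invented.

import random

class MarkovRhythm:
    """Generate inter-onset durations from a first-order Markov chain,
    with a predictability control in 0..1 (e.g. read from a MIDI fader)."""

    def __init__(self, transitions, seed=None):
        # transitions: {current_duration: {next_duration: probability}}
        self.transitions = transitions
        self.rng = random.Random(seed)
        self.current = next(iter(transitions))

    def step(self, predictability):
        options = list(self.transitions[self.current])
        learned = [self.transitions[self.current][o] for o in options]
        uniform = [1.0 / len(options)] * len(options)
        # Blend the learned chain with a uniform choice: 1.0 follows the
        # chain exactly, 0.0 ignores it entirely.
        weights = [predictability * p + (1.0 - predictability) * u
                   for p, u in zip(learned, uniform)]
        self.current = self.rng.choices(options, weights=weights)[0]
        return self.current

# Invented transition table over note durations, in beats.
table = {
    0.25: {0.25: 0.7, 0.5: 0.3},
    0.5:  {0.25: 0.5, 0.5: 0.3, 1.0: 0.2},
    1.0:  {0.5: 0.6, 1.0: 0.4},
}
generator = MarkovRhythm(table, seed=0)
pattern = [generator.step(predictability=0.8) for _ in range(8)]
print(pattern)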
Use of Performance as a Research Method

Prior to the final performance, we held two pilot performances to further develop the piece, the first of which took place at the end of the first week of collaboration. Limited by the then-current state of the technology, this performance was incredibly informative for us, providing instant feedback.

For the programmers the performance provided the opportunity to assess the usability of the technology in the context of a unified musical piece, observing how well the necessary transitions between elements of the system worked in the pressure of a real-time environment, and whether this integration of technology with musical performance succeeded aesthetically. I made the following crucial discovery: If the technology is already complex, then the actual musical substance can be more direct or simplified without lowering the overall complexity of the artistry.

This question of where complexity resides is also a powerful design point: If one desires the gross result to be balanced, then cases where more complexity occurs at the interface level should be balanced by less complexity in the detailed physical control: that is, something with a lot of buttons or faders that need to be used frequently might not be best paired with detailed muscle tension control. Similarly, if balance is sought between predictability and unpredictability, then placing features with results that cannot be predicted in certain "areas" of one's instrument can create useful creative springboards.

I also made discoveries about gestural control of algorithms while playing, in particular related to the advantages of the sensors placed in the hat: They afforded physically communicative control without impinging on or affecting the piano playing itself and thus were easily isolated from the other elements of the music at any point. Feedback from the audience was also immensely valuable and was acted on in the following performance: The sensors used in the first performance were placed in an inconspicuous hat, and we noticed that several audience members did not make the connection between the gestures and the sound manipulations, as they seemed intrinsic to the performance. As a result, the sensors were placed in a top hat for the next performance, which the audience found very straightforward.

Reflections from the Technologist(s)

The experience of working toward specific artistic goals, as opposed to scientific ones, was both a novel and a rewarding experience. With the focus on workable real-time implementations, while demanding the technology produce a subjectively interesting aesthetic, the process led to extended discussion and large output from all involved. In comparison to work developed in a laboratory, the instant feedback from the pianist allowed quick identification of creative dead ends and forced the focus upon the most interesting ideas, with little misunderstanding. Development "onsite," with the ability to immediately test ideas, led to the elimination of erratic technological behavior and a convergence toward the technologies of the final piece.

Reflections from the Performer

In developing interactive performance, the complexity of the potential relationships between gesture and sound is much better expressed in real-time demonstration than in remote conversations or, worse, written debate. For choosing the correct input signal, guiding it through a relevant process and producing a meaningful output, I found the laboratory method to be extremely liberating and time-saving.

It was interesting to note the different approaches of artist and scientist in this context - the desire for immediate visible or tangible elements with which to play, measured against a detailed, thorough and quite time-consuming process of making the technology do what we wanted. Michael Zbyszyński [23] made reference to these parallel demands when discussing a similar project with Frances-Marie Uitti at the Center for New Music and Audio Technologies and the need for technologists to work much faster and on-the-fly than would be appropriate for securing technology for an actual performance. In reality, the balance between time in the laboratory and the time apart worked best for technological and musical development.

Some points I had discovered previously were reinforced - for example, that improvising with algorithmic systems can stimulate greater artistic freedom or range. Other discoveries were new; one of the most important of these was about the actual language of music in interactive music. It is a potential equation: If the understanding of the interactivity in the audience's perception lends greater weight to the performative communication, then demonstrative and simply made gestures or musical material can carry the message most effectively.

Overall, the project served to highlight the need for and benefit of this kind of collaborative lab-based work, especially when dealing with interactive technology.

Current Developments

At the time of writing I have begun experimenting with a purely live sampling scenario, where I can grab the sounds I am currently playing by reaching into a particular point in the air above the keyboard and then manipulate these with different gestures. This research is being undertaken with Nick Gillian at the Sonic Arts Research Centre, Belfast, using a Polhemus magnetic tracking device [24]. Setting up PianoLab as a permanent space is a long-term goal, and over the next 3 years we will also be developing further prototypes of the new piano (Fig. 6) [25]. To complete the circle, this itself was the result of improvisation: during the first PianoLab, I dismantled a piano, hanging the soundboard from the ceiling; while it hung there, I began to imagine re-attaching a keyboard to it, to create a new spatial relationship between keyboard and strings.

Acknowledgments

The HCI:UK 2008 collaboration was made possible by a (re)Actor3 Artist in Residence Commission, sponsored by the Centre for Digital Music, Queen Mary, University of London, and produced by BigDog Interactive Ltd. My initial research into interactivity was funded by the Arts and Humanities Research Council; the development of PianoLab was enabled by a Brunel Research Innovation and Enterprise Fund. I am grateful for further PianoLabs, including those at CNMAT, Newcastle University and the University of Cincinnati, and for input from Pierre Alexandre Tremblay for Case Study 2.

References and Notes

1. Chikashi Miyama (Angry Sparrow) and Ben Knapp (with Eric Lyon, Gascia Ouzounian), NIME 2009, Concert 2, Friday, 5 June 2009. See <nime2009.org/concert.pdf> (accessed 28 December 2009).

2. Derek Hölzer at noise=noise, Goldsmiths, University of London, 17 February 2009, curated and organized by Ryan Jordan.

3. SSS (Atau Tanaka, Cecile Babiole, Laurent Dailleau), stitched-up, sk-interfaces closing event, FACT (Foundation for Art and Creative Technology), Liverpool, 29 March 2008. See <www.fact.co.uk>.

4. Miyama and Knapp [1].

5. See <o-art.org/history/Computer/CCRMA/CCRMAres.html> (accessed 28 December 2009).

6. E. Jessop, "The Vocal Augmentation and Manipulation Prosthesis (VAMP): A Conducting-Based Gestural Controller for Vocal Performance," demonstration, NIME Pittsburgh, 2009.

7. J. Richards, "Lost and Found: The Mincer," Leonardo Electronic Almanac 15, Nos. 11-12 (2008). Available at <leoalmanac.org/journal/Vol_15/lea_v15_n11_12/JRichards.asp>.

8. Bob Ostertag, "Human Bodies, Computer Music," Leonardo Music Journal 12 (2002) p. 11.

9. G. Essl and S. O'Modhrain, "Enaction in the Context of Musical Performance," Interdisciplines virtual workshop (by participants in Enactive Interfaces Network) (2004) p. 1. Available at <www.interdisciplines.org/enaction/papers/17>.

10. C. Cadoz and M. Wanderley, "Gesture - Music," in M. Battier and M. Wanderley, eds., Trends in Gestural Control of Music (Paris: Editions IRCAM, 2000) pp. 71-93.

11. E. Miranda and M. Wanderley, New Digital Musical Instruments: Control and Interaction beyond the Keyboard (Wisconsin: A-R Editions, 2006) p. 10.

12. Arts and Humanities Research Council project, May-December 2007.

13. Richard Barrett, Adrift, PSI 09.10 CD. Recorded live at The Warehouse, London, 29 November 2007. Available at <www.squidco.com/miva/merchant.mv?Screen=PROD&Store_Code=S&Product_Code=12431>.

14. Performances: FURT, Red Rose pub, Finsbury Park, London, spring 2007; and fORCH, Spitalfields Festival, summer 2007. See <furtlogic.com>.

15. See <web.mac.com/sarahnicolls/research/*Barrett.html>.

16. Richard Barrett, personal e-mail, December 2009.

17. First performance, University of Huddersfield, U.K., 6 March 2009.

18. First performance, Huddersfield Contemporary Music Festival, Huddersfield, U.K., 21 November 2009. See <www.hcmf.co.uk>.

19. Whether to "reveal the magic and do the trick anyway" - Augusto Corrieri summing up our process for Soundwaves Festival 2007 <web.mac.com/sarahnicolls/research/*Augusto_Corrieri.html>.

20. Centre for Digital Music at Queen Mary, University of London. See <www.elec.qmul.ac.uk>. Click on Research/Centre for Digital Music.

21. S. Cornock and E. Edmonds, "The Creative Process Where the Artist Is Amplified or Superseded by the Computer," Leonardo 6, No. 1 (1973) pp. 11-16.

22. See <web.mac.com/sarahnicolls/research/*machines.html>.

23. M. Zbyszyński, "Augmenting the Cello" (NIME Paris 2006).

24. See experiment at <www.youtube.com/watch?v=EA90JC9PUKg>.

25. Sunday Lunch Club series organized by Prototype Theatre <www.proto-type.org>.

Discography

The piano music of Niccolo Castiglioni on Metier <www.amazon.com/s?ie=UTF8&field-artist=Niccolo%20Castiglioni&rh=n:5174,p_32:Niccolo%20Castiglioni&page=1>.

Alexander's Annexe, Push Door To Exit, on WARP Records <http://warp.net/records/releases/alexanders-annexe/push-door-to-exit>.

Richard Barrett's Adrift on psi records <www.emanemdisc.com/psiO9.html>.

Michael Edwards / kill by proxy on sumtone records <www.sumtone.com/recording.php?id=31>.

Manuscript received 1 January 2010.

Sarah Nicolls is a pianist specializing in contemporary music and live electronics, regularly performing concerti with the London Sinfonietta and featured on BBC Radio 3. Nicolls also plays in Alexander's Annexe (Warp Records). Nicolls is a Senior Lecturer in Music at Brunel University. See also <www.sarahnicolls.com>.
