
Emerging Technologies for Real-Time Diffusion Performance

Bridget Johnson

Bridget Johnson (researcher, student), New Zealand School of Music, Victoria University of Wellington, Gate 7, Kelburn Parade, Wellington, New Zealand. Email: <johnsonbridget.d@gmail.com>.

Supplemental materials such as audio files related to this article are available at <https://vimeo.com/98397876>.

Abstract

With the ascendance of the field of new interfaces for musical expression, a new phase of sound diffusion has emerged. Rapid development is taking place across the field, with a focus on gestural interaction and the development of custom performance interfaces. This article discusses how composers and performers embracing technology have broadened the boundaries of spatial performance. A particular focus is placed on performance interfaces built by the author that afford the artist more control over performative gestures. These new works serve as examples of the burgeoning field of diffusion performance interface design.

For over half a century, the performance paradigm of sound diffusion has centered on the performer using a mixing desk as a controller. While much development has taken place regarding studio spatialization techniques and rendering algorithms, until recently the performance interface for diffusion has seen little change. A recent trend in diffusion performance is the application of new musical interfaces.

History

In 1951, Pierre Schaeffer and Pierre Henry presented the potentiomètre d'espace, a diffusion system with which they performed precomposed electroacoustic music by dynamically spatializing sounds through a tetrahedral speaker array. The two artists built an interface of potentiometers to control the gain of each speaker and, thus, the spatial field [1]. The diffusion concert is a tradition that remains active throughout the world. Over the last 70 years, diffusion has mostly been performed on a desk of faders in a manner similar to that of the potentiomètre d'espace. After the initial quadraphonically based systems, many institutes began to develop larger speaker orchestras, notable examples of which include the GRM (Groupe de Recherches Musicales) Acousmonium, BEAST (Birmingham ElectroAcoustic Sound Theatre), the Gmebaphone and later the ZKM Klangdom. The large scale of such systems meant that new ways of controlling and calculating the spatialization needed to be devised.

Through the 1980s and 1990s, these traveling speaker orchestras continued to diversify. They began to include more speakers, requiring sophisticated routing systems. However, the user interface that drove these systems remained largely unchanged, with systems continuing to use a mixing desk as the main form of user interaction. This lack of change is understandable given the diffusion performance practices in vogue at the time [2]. The diffuser's actions focused on the overall perception of the piece in the environment rather than the placement of a sound object in a discrete location. The audience's perception of the spatial field was a function of its position in the hall. These concerts tended to take place in a traditional configuration: the diffuser was positioned in the sweet spot, with the audience seated behind or in front of the desk and with little to no view of the diffuser.

As spatialization algorithms became more sophisticated, composers were able to think about where they wanted to place their sounds within the space, rather than merely the way they were dispersed. This shift began with research into the psychoacoustics of human hearing, leading to more accurate pan-pot laws for stereo panning [3], and was furthered in the 1990s by developments in vector-base amplitude panning [4], wave field synthesis and higher-order ambisonics. The new technologies encouraged new spatial aesthetics, allowing composers to conceive spatialization through a focus on the creation of holophonic sound fields and phantom sources.
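The pan-pot laws behind these developments are easy to state concretely. The following Python sketch shows the constant-power stereo pan law and a pairwise generalization to a horizontal ring of speakers, in the spirit of vector-base amplitude panning [4]; it illustrates the general technique only and is not code from any system discussed in this article.

```python
import math

def constant_power_pan(position):
    """Constant-power pan law: position 0.0 = fully on the first speaker,
    1.0 = fully on the second. Gains satisfy gL^2 + gR^2 = 1, so the
    perceived loudness stays constant across the pan."""
    theta = position * math.pi / 2
    return math.cos(theta), math.sin(theta)

def pantophonic_gains(source_angle, speaker_angles):
    """Pairwise panning on a ring of speakers (angles in degrees,
    listed in ascending order): find the adjacent pair bracketing
    the source and apply the pan law between them."""
    n = len(speaker_angles)
    gains = [0.0] * n
    for i in range(n):
        a = speaker_angles[i]
        b = speaker_angles[(i + 1) % n]
        span = (b - a) % 360 or 360        # angular width of this pair
        offset = (source_angle - a) % 360  # source position within it
        if offset <= span:
            g_a, g_b = constant_power_pan(offset / span)
            gains[i], gains[(i + 1) % n] = g_a, g_b
            break
    return gains

# Example: a source at 45 degrees in a quadraphonic ring pans equally
# between the speakers at 0 and 90 degrees.
print(pantophonic_gains(45, [0, 90, 180, 270]))  # ~[0.707, 0.707, 0.0, 0.0]
```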

Tools for the control of spatialization algorithms found their way into digital audio workstations, allowing the composer to drag a virtual representation of a sound object and place it within a speaker array. Because the amplitude mapping of faders was counterintuitive for pantophonic motion (i.e. circular spatial trajectories), most spatial user interfaces became graphical. In spite of this, the mixing desk continued to be the user interface for diffusion performance. With composers thinking and acting one way in the studio and another in the concert hall, the paradigm was ripe for disruption. In the studio, composers have as much time as needed to place sounds precisely where desired and to trace out specific trajectories with the mouse onscreen. In performance, the luxury of time is diminished, and gestural relationships become more relevant. The holophonic sound field and its effect on a composer's way of thought emphasized the need for a new performance interface, as the ergonomics of the mixing desk often hinder the potential trajectories available to the performer. This problem is well recognized within the field [5–7].

Fig. 1. The tactile.space diffusion interface running on the multi-touch table Bricktable. (© Bridget Johnson. Photo © Jason Wright.)

In the last 10 years, a number of research teams have offered a variety of solutions to the problem through the design of new diffusion performance interfaces.

Multi-Touch Solutions

The introduction of the Reactable [8] in 2005 saw the wider electronic music community embrace the use of multi-touch surfaces as performance interfaces. The majority of early applications for such devices focused on synthesis models; it became apparent that such interfaces have use not only in performance but also for collaborative installation and as studio tools. Research teams [9,10] have explored the development of multi-touch studio mixing tools, which have included spatial rendering; however, with the exception of the SoundScape Renderer [11], they have largely been limited to stereo or quadraphonic speaker systems.

In 2011 I created tactile.space [12] to run on Bricktable [13] (Fig. 1). While music performance applications had been previously built for Bricktable [14], tactile.space was the first designed for diffusion. tactile.space allows users to input the number of speakers and audio files desired as well as other customizable user settings. Users are then presented with a graphical user interface (GUI) through which they can drag visual representations of each of their sound files (audio objects) into desired locations, resulting in a real-time spatialization. tactile.space not only made it easy for artists to perform complex spatial trajectories that are difficult via a mixing desk but also featured control of spatial spread and distance encoding. By placing a second finger inside an audio object, the user is able to spread the object into an arc to widen the perceived sound source. The arc's position and distance can be adjusted by moving the circle drawn in the arc's center; the width can be adjusted by moving either of the two circles at the arc's edge. The arc can be spread into a full 360º circle to completely immerse the audience [15].
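One way to picture this spread control concretely is as distributing phantom copies of a source across the arc. The sketch below, which continues the Python example above and reuses its pantophonic_gains helper, is a hypothetical reconstruction under that assumption, not the tactile.space implementation.

```python
def arc_gains(center_angle, arc_width, speaker_angles, n_virtual=8):
    """Hypothetical arc-spread rendering: widen a source by spreading
    virtual copies evenly across an arc of arc_width degrees. An
    arc_width of 360 surrounds the listener from all sides."""
    total = [0.0] * len(speaker_angles)
    for k in range(n_virtual):
        # Place virtual sources evenly from one edge of the arc to the other.
        frac = k / (n_virtual - 1) if n_virtual > 1 else 0.5
        angle = (center_angle - arc_width / 2 + arc_width * frac) % 360
        for i, g in enumerate(pantophonic_gains(angle, speaker_angles)):
            total[i] += g
    # Renormalize so that widening the arc does not change overall power.
    norm = math.sqrt(sum(g * g for g in total)) or 1.0
    return [g / norm for g in total]
```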
tactile.space has now been ported to the iPad as tactile.motion. This new GUI follows the tactile.space visual aesthetic and has many of the same features and modularity. tactile.motion (Fig. 2) also introduces new functionality to encourage the creation of dynamic spatial fields. Specific intuitive gestures are recognized by the system and used to trigger autonomous spatial behaviors. For example, if the user moves an audio object in a circular motion, the system is able to recognize the user's intention to draw a circle and will continue the spinning motion at the velocity drawn by the user. Likewise, linear trajectories are recognized and continued by the system. A double tap on the audio object stops the motion.
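One plausible way to implement such gesture continuation is to estimate angular velocity from the most recent touch samples and keep advancing the object at that rate once the finger lifts. The Python sketch below illustrates this idea; the class name, sampling scheme and history length are assumptions, not details of tactile.motion.

```python
import time

class SpinContinuation:
    """Illustrative sketch: after a circular drag, keep an audio object
    spinning at the angular velocity the performer last drew."""

    def __init__(self):
        self.samples = []    # recent (timestamp, angle-in-degrees) pairs
        self.velocity = 0.0  # degrees per second

    def on_touch_move(self, angle_deg):
        self.samples.append((time.time(), angle_deg))
        self.samples = self.samples[-10:]  # keep a short history

    def on_touch_up(self):
        # Estimate angular velocity from the first and last samples,
        # unwrapping so a crossing from 359 to 1 degree reads as +2.
        if len(self.samples) < 2:
            return
        (t0, a0), (t1, a1) = self.samples[0], self.samples[-1]
        delta = (a1 - a0 + 180) % 360 - 180
        if t1 > t0:
            self.velocity = delta / (t1 - t0)

    def on_double_tap(self):
        self.velocity = 0.0  # a double tap stops the motion

    def advance(self, angle_deg, dt):
        """Called each animation frame to move the object autonomously."""
        return (angle_deg + self.velocity * dt) % 360
```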
In addition to implementing many new features, the iPad version has addressed many of the performance issues that were experienced with the Bricktable. Unlike the Bricktable, the iPad has no setup time and connects to the spatialization software running on a host PC via a wireless network. It requires no calibration and is not affected by stage lighting. The short setup time, stability and intuitive GUI of this new interface afford the user more time to focus on the spatial aesthetics and performance.

Fig. 2. The new tactile.motion diffusion performance iPad application. (© Bridget Johnson)

The Chronus Series

I designed and built the Chronus family of new diffusion interfaces to explore intuitive diffusion performance with physical hardware [16]. The series features a rotary encoder–based design for spatial positioning in a pantophonic field. The rotary encoder can be continually rotated past the point of 360º, allowing the angular displacement to be directly mapped to a spatial position. Chronus_1.0 features intuitive angular control but limits expressivity by not including control of distance or spatial spread.

With hardware enhancements, Chronus_2.0 includes a slide potentiometer placed on top of the spinning disc, allowing the performer to intuitively control both a sound object's angular and radial positions (Fig. 3). The positions are read by an Arduino microcontroller that sends data via serial protocols to be unpacked in custom-built Max or Processing patches. The generic nature of the messages sent by the Chronus_2.0 interface allows for ease of integration into existing systems: a central goal in the design of the Chronus series was that it prove easy for any diffusion artist to adapt to the new interface without limiting or affecting their current spatialization system. The interface is also designed to be easily incorporated into a live electronic musician's current setup, encouraging the use of spatialization technology in systems not previously equipped for such performance techniques.
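The article does not document the serial message format, but the host-side unpacking can be sketched generically. The following Python example, standing in for the custom Max or Processing patches, assumes a hypothetical one-line-per-update format ("A:<angle> R:<radius>") and uses the pyserial library.

```python
import serial  # pyserial, for reading the Arduino's USB serial port

# Hypothetical message format: one line per update, e.g. "A:512 R:88",
# where A is the raw encoder-derived angle (0-1023) and R the slide-pot
# reading (0-1023). The real Chronus_2.0 format is not documented here.
def read_positions(port="/dev/ttyUSB0", baud=9600):
    with serial.Serial(port, baud, timeout=1) as conn:
        while True:
            line = conn.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                fields = dict(item.split(":") for item in line.split())
                angle = int(fields["A"]) * 360 / 1024  # map to degrees
                radius = int(fields["R"]) / 1023       # map to 0.0-1.0
            except (KeyError, ValueError):
                continue  # skip malformed lines
            yield angle, radius

# Each (angle, radius) pair can then drive any spatialization engine,
# e.g. the pantophonic_gains sketch earlier in this article.
```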

Fig. 3. The Chronus_2.0 physical user interface for spatialization performance. (© Bridget Johnson)

Conclusions

The last 10 years have seen rapid development in performance interfaces used for sound diffusion. This development has been catalyzed by artists' desire for heightened control and wider expressivity, and by the increasing democratization of hardware and software development. While these new interfaces have their own limitations and inconsistencies, they have also pushed the development of spatial performance. Diffusion concerts featuring new interfaces have exhibited a new range of spatial aesthetics: performers can focus on the perceived position and movement of their sounds. Live spatial performance is also featured more regularly as part of live electronics systems, as many of these new interfaces are easily incorporated into performers' wider electronic setups.

The new interfaces I have presented here are representative of the future direction of the paradigm. They each exhibit new interactive qualities for the performer, both with the space and with the audience. I hope that these new interfaces will further stimulate progress in the burgeoning field of spatial performance.
References and Notes

1. N. Barrett, "Trends in Electro-acoustic Music," in N. Collins and J. d'Escrivan, eds., The Cambridge Companion to Electronic Music (New York: Cambridge Univ. Press, 2007).

2. S. Emmerson, "Diffusion Projection: The Grain of the Loudspeaker," in S. Emmerson, Living Electronic Music (Hampshire, U.K.: Ashgate, 2007).

3. D. Griesinger, "Stereo and Surround Panning in Practice," Audio Engineering Society Convention Paper (Munich, 2002).

4. V. Pulkki, "Virtual Source Positioning Using Vector Base Amplitude Panning," Journal of the Audio Engineering Society 45, No. 6, 456–466 (1997).

5. B. Truax, "Composition and Diffusion: Space in Sound in Space," Organised Sound 3, No. 2, 141–146 (1999).

6. J. Mooney, "Sound Diffusion Systems for the Live Performance of Electroacoustic Music," Ph.D. dissertation, University of Sheffield, Sheffield, U.K., 2005.

7. K. Brown, M. Alcorn and P. Rebelo, "Sound Diffusion Using Hand-Held Light-Emitting Pen Controllers," Proceedings of the International Computer Music Conference (Barcelona, 2005).

8. S. Jordà et al., "The Reactable," Proceedings of New Interfaces for Musical Expression (Vancouver, 2005).

9. J. Carrascal and S. Jordà, "Multitouch Interface for Audio Mixing," Proceedings of New Interfaces for Musical Expression (Oslo, 2011).

10. S. Gelineck, D. Overholt, M. Buchert and J. Anderson, "Towards an Interface for Music Mixing Based on Smart Tangibles and Multitouch," Proceedings of New Interfaces for Musical Expression (Daejeon, Republic of Korea, 2013).

11. K. Bredies, N.A. Mann, J. Ahrens, M. Geier, S. Spors and M. Nischt, "The Multi-Touch SoundScape Renderer," in Proceedings of the Working Conference on Advanced Visual Interfaces (New York, 2008).

12. B. Johnson and A. Kapur, "tactile.space: A Multi-Touch Tool for Live Sound Diffusion," Proceedings of the Australasian Computer Music Conference (Brisbane, Australia, 2012).

13. J. Hochenbaum and O. Vallis, "BrickTable: A Musical Tangible Multi-Touch Interface," in Proceedings of Berlin Open Conference 09 (Berlin, 2009).

14. J. Hochenbaum, O. Vallis, D. Diakopoulos, J. Murphy and A. Kapur, "Designing Expressive Musical Interfaces for Tabletop Surfaces," in Proceedings of New Interfaces for Musical Expression (Sydney, 2010).

15. tactile.space was evaluated by composer-performers; the results can be found in B. Johnson and A. Kapur, "Multi-Touch Interfaces for Phantom Source Positioning in Live Sound Diffusion," Proceedings of New Interfaces for Musical Expression, KAIST (Republic of Korea, 2013).

16. B. Johnson, M. Norris and A. Kapur, "The Development of Physical Spatial Controllers," in Proceedings of New Interfaces for Musical Expression (London, 2014).

Manuscript received 2 January 2014.

Bridget Johnson is a Ph.D. candidate at the New Zealand School of Music in Wellington, New Zealand. Her research focuses on the design and development of new gestural interfaces for increased expressivity in diffusion performance.
