Patron-Generated Music: Stimulating Computer Music Expression in Nightclubs

Paper submitted to CMMR 2011 conference in India.
Published by Yago de Quay on Jun 18, 2011. Copyright: Attribution Non-commercial.







Yago de Quay
Faculdade de Engenharia da Universidade do Porto
R. de Mouzinho da Silveira 317 2º, 4050-421 Porto, Portugal
yagodequay@gmail.com
Abstract
Recent studies suggest that nightclubs are ideal places for multimedia user participation; however, little work has been done to adequately implement interactive music systems. The objective of this study is to test the use of Motion Capture, Music Information Retrieval and Body-music Mapping in fostering music expression by patrons in nightclubs. The system was implemented in three nightclubs in Portugal and Norway, and feedback was collected from patrons, DJs, and nightclub staff. Results suggest that the use of different Motion Capture systems and simple Body-music Mappings encourages higher participation, and that Music Information Retrieval can correctly extract the harmonic content of Western club music. The study concludes that this system can provide a real-world alternative for musical interaction in mainstream nightclubs.
Keywords: music information retrieval, motion capture, body-music mapping, nightclubs, interaction
Nightclubs are popular multimedia places well suited to user participation, as pointed out by Bayliss, Lock & Sheridan [1]. Nonetheless, Gates, Subramanian & Gutwin [2] state that little is known about how interaction occurs inside these venues, and current implementations of interactive systems within this context are limited to non-musical audience-to-audience interaction. Furthermore, Bertini, Magrini & Tarabella [3], Lee, Nakra & Borchers [4], and Futrelle & Downie [5] observed that a gap exists between interactive music research results and general real-life interactive music practice. The limited bibliography on music interaction in dance clubs, like the work done by Ulyate & Bianciardi [6], deliberately sidesteps the current role of the DJ.

This paper combines the domains of Music Information Retrieval (MIR), Motion Capture (MoCap) and Body-music Mapping to suggest a practical systemic approach for stimulating users' musical expression in nightclubs. Results were presented at three events in Norway and Portugal between April and October 2010.
The first event debuted in a small club located in the Portuguese suburbs, frequented mostly by teenagers with a preference for "trance" music. The second and third events were exhibited in a renowned nightclub located in the center of Norway's capital, also heavily frequented by teenagers, with a preference for "top 40" dance hits. Results were collected from feedback by users, DJs and nightclub staff, as well as from field tests. Based on the information provided by the staff, over 400 people attended the three events.
Motion Capture
Motion Capture (MoCap) refers to the process of storing human movement in a digital format. MoCap technologies are either optical, relying on computer vision techniques, or non-optical, based on sensors. Applications are mostly limited to the film industry, the military and medicine [7-9]. In their study of controllers used for computer music interfaces, Marshall & Wanderley [10] concluded that, depending on the musical task, users express a preference for certain sensors.

Both optical and non-optical MoCap technologies were implemented at the three events and were mapped to sound effects and musical events. The optical system featured an Optitrack infrared camera mounted on the ceiling, pointed towards a platform against a wall (Fig. 1). Motion was analyzed by calculating the quantity of motion, that is, the sum of all active pixels in a video feed. The non-optical system consisted of two Wii Remotes (Fig. 2). Accelerometers inside these controllers provided information on how they were being swung. To stimulate interaction, lights were positioned on the raised platform where the Optitrack camera was pointing, and colorful ribbons were attached to the Wii Remotes.
Music Information Retrieval
Music in a nightclub is inherently unpredictable: the theme of an event, the expected audience, the club layout, the club owner and the club promoter are some of the several factors that influence the DJ's selection of songs [2]. According to Gouyon et al. [11], Music Information Retrieval (MIR) "aims at understanding and modeling, with the help of computers, aspects of the sound and music communication chain".
The MIR software was built in the Max/MSP programming environment and relied on three pitch trackers: zsa.freqpeak~, segment~, and analyzer~. By interpolating the values of each pitch tracker, the software was able to extract in real time the harmonic content of the songs played by the DJ and provide a selection of suitable notes and chords. The MoCap systems would then manipulate the notes and chords extracted from the music.
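The paper does not detail how the three trackers' estimates were combined or how notes were chosen; the sketch below assumes a simple majority vote on pitch class followed by a fixed major-triad offering, both simplifications of the real Max/MSP patch (`suitable_notes` and `hz_to_pitch_class` are hypothetical names):

```python
import math
from collections import Counter

def hz_to_pitch_class(freq: float) -> int:
    """Convert a frequency in Hz to a MIDI pitch class (0 = C ... 11 = B)."""
    midi = round(69 + 12 * math.log2(freq / 440.0))
    return midi % 12

def suitable_notes(estimates_hz: list[float]) -> list[int]:
    """Combine several pitch-tracker estimates by majority vote on the
    pitch class, then offer the root, major third and fifth of a triad
    built on it as 'suitable notes' for the MoCap systems to play."""
    votes = Counter(hz_to_pitch_class(f) for f in estimates_hz)
    root = votes.most_common(1)[0][0]
    return [root, (root + 4) % 12, (root + 7) % 12]

# Three trackers roughly agree on A (around 440 Hz):
print(suitable_notes([440.0, 442.5, 438.0]))  # -> [9, 1, 4]  (A, C#, E)
```

A real patch would also have to infer chord quality (major/minor) and suppress estimates during percussive or unpitched passages.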
Body-music Mapping
Miranda & Wanderley [12] define mapping as "the liaison or correspondence between control parameters (derived from performer actions) and sound synthesis parameters". Mappings may involve the clarity of and time between a system's input and output, interaction complexity, predictability, and psychological associations between movement and sound.

Quantity of motion detected by the Optitrack infrared camera was passed through a threshold to trigger notes, while acceleration data from each Wii Remote separately triggered notes and manipulated the frequency of a band-pass filter acting on chords. Notes and chords were selected by the Music Information Retrieval patch and played on two local speakers assigned to each motion capture installation. Only people close to the MoCap systems could hear the notes and chords, leaving the main dance floor unaffected.
The three events were carried out in two nightclubs with different infrastructures, music styles and club light effects. The MIR software performed well, and no off-tune notes were observed. The MoCap camera worked best when filming against a wall, which eliminated background visual noise. To avoid interference with club lights, the camera had to accept infrared (IR) light and filter out visible light. The Bluetooth communication used by the Wii Remotes was robust and extended up to 10 meters. The remotes themselves were resistant and withstood hard thrashing.
Figure 1 - Patron dancing on the platform
Figure 2 - Participants swinging the Wii remotes
Patrons, DJs, nightclub staff, and direct observation provided feedback on how the systems were being used. Findings suggest that simple mappings, fast sonic response and visual feedback worked best. Users found the Wii Remotes very easy to manipulate and the movement-to-sound relationship clear. The Optitrack camera's purpose was more obscure because it lacked visual feedback. Some patrons did not engage with the systems because the interaction required them to move too much. Initially, users were captivated by the sounds they produced; most, however, soon lost interest given the limited range of sound effects.
This paper suggested a method of combining music information retrieval, motion capture technologies and body-music mapping to foster ad hoc music making by patrons inside nightclubs. Patrons could make music through an Optitrack infrared camera and Wii Remotes, while Music Information Retrieval (MIR) software provided harmonic content by analyzing the DJ's music. The most challenging part consisted in developing tools flexible enough to adapt to each nightclub's environment while still stimulating user participation. Although proven robust, the Music Information Retrieval patch was only tested with Western music and might not be suitable for other styles. In his summary of progress in the field of MIR, Downie [13] argues that there is limited research on non-Western music.

Work by Blaine & Perkins [14], Tahiroglu & Erkut [15] and Feldmeier & Paradiso [16] on group computer music making suggests that further work can be done on implementing group music expression tools in public venues.

Results encouraged more work on visual feedback, audio effects and passive systems for observers. They suggest that interactive events may become an attractive entertainment alternative in nightclubs, and that the tools are robust enough for DJs, musicians, club owners and researchers to apply and build upon. Most importantly, it seems that innovative ways of using new technologies are steadily overcoming barriers to the adoption of interactive music and pushing the boundaries of participation in nightclubs.
Nightclubs are ideal multimedia places for user participation. But despite nightclubs' emphasis on music, there has been little implementation of interactive music systems, and the ones that exist have sidestepped the current role of the DJ. This study's results suggest that Motion Capture, Music Information Retrieval, and Body-music Mapping can provide a practical alternative for computer music expression in nightclubs. However, further work is needed to extend and replicate these findings, and to develop social tools for music making.
The author thanks Kristian Nymoen for helping develop part of the computer vision tools, COST SID for partially funding the project, and the fourMs lab (Oslo) for providing many of the materials.
References
1. Bayliss, A., S. Lock, and J.G. Sheridan. Augmenting expectation in playful arena performances with ubiquitous intimate technologies. In PixelRaiders 2. 2005. Sheffield.
2. Gates, C., S. Subramanian, and C. Gutwin. DJs' perspectives on interaction and awareness in nightclubs. In Proceedings of the 6th Conference on Designing Interactive Systems. 2006, ACM: University Park, PA, USA. p. 70-79.
3. Bertini, G., M. Magrini, and L. Tarabella. An Interactive Musical Exhibit Based on Infrared Sensors. In Computer Music Modeling and Retrieval, R. Kronland-Martinet, T. Voinier, and S. Ystad, Editors. 2006, Springer Berlin / Heidelberg. p. 92-100.
4. Lee, E., T.M. Nakra, and J. Borchers. You're the conductor: a realistic interactive conducting system for children. In Proceedings of the 2004 Conference on New Interfaces for Musical Expression. 2004, National University of Singapore: Hamamatsu, Shizuoka, Japan. p. 68-73.
