
Program: PhD in Digital Media Candidate: Yago de Quay

Interactive Music 3.0:
User-driven adhoc nightclubs

Date: July 13, 2011

New technologies and models for networked ubiquitous interaction are changing our relationship to media, people and things. This research proposes making these tools available for users to change music to better suit their needs in a nightclub environment.

We live in a world of user-generated content. Media shared by users on social platforms such as Flickr, Facebook, Twitter, and YouTube has become mainstream, and users themselves create and share knowledge that is interesting and accurate to a broad audience. In the music industry, more artists are becoming aware of the value of social media, with musicians such as Imogen Heap collaborating with her fans on lyrics, remixes, and artwork. All of this is part of the rise of co-creation in innovation, itself a product of social interaction whereby users reinterpret and reinvent multimedia content. Nightclubs are popular, excellent multimedia places for user participation, as pointed out by Bayliss et al. (2005). Nonetheless, Gates et al. (2006) state that little is known about how interaction occurs inside these venues, and current implementations of interactive systems in this context are limited to non-musical audience-to-audience interaction. Furthermore, Bertini et al. (2006), Lee et al. (2004), and Futrelle and Downie (2003) observed that a gap exists between interactive music research results and real-life interactive music practice. The limited bibliography on music interaction in dance clubs, like the work done by Ulyate and Bianciardi (2001), deliberately sidesteps the current role of the DJ.

Previous interactive music systems framed the system as the platform, in terms of the old centralized software paradigm. In the end, both interactive music software and hardware will turn out to be commodities, and value will move to higher-level services delivered over a network. This PhD proposes extending this network to human and machine activity, paving the way to exploit the data capture and communication capabilities of our physical world. From this continuous interaction with other people and things emerge complex musical systems that can have a profound effect on art as a whole. Interactive Music 3.0 is a term defined here as a system that generates music using the Web 3.0 approach: leveraging, through ubiquitous technology, user activity and algorithmic data management to reach out to an entire network (O'Reilly 2006). User-driven music – Interactive Music 3.0 – will aggregate users' social behaviors and individual attitudes towards dancing in clubs, enabling people to create, modify, reconfigure and share dance music in real time through pervasive, networked devices.

This PhD will focus on the role of self-expression in the domains of Motion Capture, Interactive Nightclubs, Music Information Retrieval and Body-Music Mapping. It will also devise experiments that measure dancers' satisfaction with interactive music/dance systems in comparison with the traditional, non-interactive music/dance experience. Some of the research questions are: What are the barriers to adoption of interactive systems in nightclubs? How much control over the music do patrons want? How will musical quality be ensured? Given the popularity of music video games like Dance Dance Revolution, it is expected that dancers will derive greater enjoyment from music when their actions have an immediate and direct influence on the music's parameters.
The results will inform research in body-music mapping and help develop sustainable interactive music models for nightclubs as well as other learning technologies. If we broaden the network to disguised, embedded, unsupervised machines, we can look forward to a musical system that not only deduces mappings from interaction but, most importantly, can connect users across the world to create, share, and improvise music and dance in real time. In contrast to existing work, I intend the output of this mission to be accessible, inexpensive, and easy to implement in public environments. I plan for the final interactive system to be scalable and to permit interplay between people of different physical and musical abilities. Furthermore, using social web instruments, I hope to raise awareness and facilitate the exchange of knowledge in this field of research.


Figure 1 - Action Plan

The action plan will be divided into five work packages. Work Packages 3 and 4 will include routine qualitative and quantitative testing with available users, informed by online community feedback.


WP 1: The first package will focus on dissemination, taking full advantage of techniques such as viral marketing and social learning to bring in individuals as well as communities to influence procedure and increase the reach of this study's results, tools, practices and insights.

WP 2: Work Package 2 will determine the theoretical and methodological modus operandi to be followed throughout the mission. It will decide the gesture mapping and music composition strategy.

WP 3: This stage concerns the implementation of the motion capture system, the simulation, and the questionnaire.

WP 4: Work Package 4 is concerned with the collection and analysis of users' satisfaction with interactive music/dance systems.

Figure 2 - Overview of internal and external deliverables

Work Package 1: Dissemination
Tim Berners-Lee describes Web 3.0 as "the web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers. […] Day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines." This research will not build a web application, but will mobilize and interact with online communities using Web 3.0 technologies to foster social interaction and exchange between people and computers, both online and offline.


My strategy is to empower researchers with the necessary knowledge and tools to implement the methodology themselves, and to foster massive participation of user-generated content from the scientific community. It will include blogs (group or personal online journals where people can post ideas, images, and links related to their role in this research) as well as micro-blogging tools, like Twitter, that provide a stream of constantly updated information. Past experiences included collaborations with social media platforms. This collaborative environment, where users are expected to co-create the solutions, leads to natural acceptance by the stakeholders, who will feel empowered to test, evaluate and report their own experience with the new solutions in a controlled social environment.

Broadcast (centralized development approach)
• Technology-driven
• Broadcast
• Top-down
• Low communication effectiveness
• High costs of content production
• Low participation
• Limited growth

User-centric (mission: loosely-coupled integration)
• Demand-driven
• Participation
• Bottom-up / edge-to-core
• High communication effectiveness
• Very low costs of content production
• Exponentially high participation
• Unlimited growth

Figure 3 - Broadcast vs. User centric approaches

This proposal is positioned close to the demand for new content and services related to technology-enhanced learning, where a significant percentage of the content will be generated by intermediate and end users/researchers. As an example, most of my projects are made publicly available and are supported by a growing community of users and contributors.

Advisory Board

The Advisory Board is a project management instrument that provides an additional layer of quality by having experts external to the consortium assess the progress and research outputs of the project. The advisory board has been carefully composed of individuals with strong networks that will assist the


project in fomenting the growth of communities and in identifying early adopters amongst universities and enterprises.

Possible Candidates

Publications
• SMC
• NIME
• ICMC
• CMMR
• Computer Music Journal
• Journal of New Music Research

Exhibitions
• NIME
• Re-new
• ARS Electronica

Web
• WordPress
• Twitter
• Facebook
• FriendFeed
• Scribd
• Slideshare
• Vimeo & YouTube
• Flickr
• Grooveshark

Work Package 2: Theoretical and Methodological Framework
This initial research step will define the research strategy to be adopted. Considering the various current forms of music interaction and motion capture, use cases and community feedback will help steer the research practice. Work Package 2 will determine how best to merge the areas of gesture analysis, music composition and dance. This is important not only for the purpose of this mission, but for the scientific community at large, for there has been limited research on music information retrieval from single and, to a lesser extent, group gestures (Godøy and Jensenius 2009). Furthermore, the theoretical and methodological model will investigate the following issues. First, the limitation in the method of music creation: engineered music, contrary to interactive music, only allows the expert a say in the overall creative direction of a song. Second, non-interactive systems miss out on the plethora of knowledge and inputs that other users and experts can provide. Third, competition and innovation in this field lower barriers to entry otherwise imposed by high financial costs and the lack of an open knowledge community (Chin 2006). This research will address these problems by offering an interactive music model for multi-participative environments. This interactive model is not intended to substitute performers, experts or engineered content, but to add a tool for public interactive media, one that can benefit education, art and the scientific community.



Work Package 3: Motion Capture System
The real power behind the semantic relationships between music and movement will be realized when people wear sensors or agents that collect information, process it and share the results with other people or things. Currently, body-oriented systems extract data directly from the dancer, such as brain waves, heart rate and skin conductivity (Wechsler 2011). Environment-oriented methods position the dancers in a 3D movement-sensitive grid using motion tracking devices such as video cameras, sonar, and laser beams. Because dance mostly involves fluid body movement, the trend is towards environment-oriented, non-intrusive techniques that remove the burden of carrying wires and devices attached to the body. Below are pictures of two motion capture systems I was involved with that respond to body motion with sound effects.

Figure 3 – Oslo: Dance Jockey (Xsens) | Interactive Nightclub (Optitrack)
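As a rough illustration of environment-oriented body-music mapping (a minimal sketch, not the system to be built; the chosen feature, joint data, ranges and parameter names are all assumptions for the example), the quantity of motion across tracked joints in consecutive frames can be mapped to a single sound parameter, such as a synthesizer filter cutoff:

```python
import math

def quantity_of_motion(prev_joints, joints):
    """Sum of Euclidean displacements of tracked joints between two
    frames -- one simple, commonly used motion feature."""
    return sum(math.dist(a, b) for a, b in zip(prev_joints, joints))

def map_to_cutoff(qom, qom_max=2.0, cutoff_min=200.0, cutoff_max=8000.0):
    """Linearly map quantity of motion to a hypothetical filter cutoff
    in Hz, clamped to a musically useful range."""
    t = min(max(qom / qom_max, 0.0), 1.0)
    return cutoff_min + t * (cutoff_max - cutoff_min)

# Two frames of three tracked joints (x, y, z) in metres (invented data).
frame_a = [(0.0, 1.0, 0.0), (0.3, 1.2, 0.1), (-0.3, 1.2, 0.1)]
frame_b = [(0.0, 1.1, 0.0), (0.4, 1.3, 0.1), (-0.4, 1.3, 0.1)]

qom = quantity_of_motion(frame_a, frame_b)
print(round(qom, 3), round(map_to_cutoff(qom), 1))
```

In a real deployment the frames would stream from the tracking system and the cutoff would be sent to the music software (e.g. over OSC); the linear mapping is only the simplest possible choice.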

Work Package 4: Data Collection & Analysis
Questionnaires and opinion surveys will rate music appreciation as well as dance appreciation. The data collected will be both qualitative and quantitative.

Qualitative data
• Videos of users interacting with the motion capture (MOCAP) system;
• Recorded music material created by users;
• Performance tests that evaluate the degree of sensitivity between the subjects' interaction and the affected music, used to fine-tune parameters in the music software.

Quantitative data
• Behaviour observation checklists to evaluate dancers' movements and provide information on what movement to capture;
• Questionnaires and opinion surveys rating music appreciation as well as dance appreciation;
• A Likert scale allowing participants to self-rate how much control they had over the music.
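To show how the Likert self-ratings could be summarised per interaction level, here is a minimal sketch; the rating values below are invented for illustration only, and the grouping by interaction level is an assumption about how the data would be organised:

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert ratings ("how much control did you feel?")
# collected after each set, grouped by the set's interaction level.
ratings = {
    "non-interactive":  [2, 1, 2, 3, 2, 2],
    "interactive":      [3, 4, 3, 4, 3, 4],
    "very interactive": [4, 5, 4, 5, 5, 4],
}

def summarise(ratings):
    """Mean and sample standard deviation of ratings per level."""
    return {
        level: (round(mean(r), 2), round(stdev(r), 2))
        for level, r in ratings.items()
    }

for level, (m, s) in summarise(ratings).items():
    print(f"{level}: mean={m} sd={s}")
```

Descriptive statistics like these would be the first step before any inferential comparison between the non-interactive, interactive and very interactive conditions.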


After the motion capture system has been implemented, testing will proceed as follows. Participants will be recruited from a population of club goers, with demographic questionnaires used to screen subjects so that the sample is representative of club goers. The sample will then be split into two conditions. Under these conditions, participants will complete several sets. Each set lasts 10 minutes, and the music played will have one of three levels of interaction, assigned randomly: non-interactive, interactive, and very interactive. The unit of observation will depend on the availability of participants.

Figure 4 - Visual representation of experiment sets
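The random assignment of interaction levels to sets could be sketched as follows; the balanced design (each level appearing equally often) and the function names are assumptions made for illustration, not a fixed part of the protocol:

```python
import random

LEVELS = ["non-interactive", "interactive", "very interactive"]

def make_schedule(n_sets, seed=None):
    """Randomly order the three interaction levels across the
    10-minute sets, balanced so every level appears equally often
    when n_sets is a multiple of three."""
    rng = random.Random(seed)  # seeded for a reproducible schedule
    schedule = [LEVELS[i % 3] for i in range(n_sets)]
    rng.shuffle(schedule)
    return schedule

# Conditions A (blind) and B (informed) would use the same kind of
# schedule; only what participants are told about each set differs.
print(make_schedule(6, seed=42))
```

Seeding the generator per session makes each participant's set order reproducible for later analysis while still being unpredictable to the dancer.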

Under condition A, participants will dance in various sets and rate the experience after every set. This will be a blind study: they will not be told which sets are interactive. Under condition B, different participants will dance in various sets and rate the experience after every set; they will be told which sets are interactive.

In conclusion, wearable agents will help secure "the long tail", that is, the collective, pervasive power of the small interactions that make up the bulk of the system's content. These will serve as the basis for a complex musical system that emerges from continuous interaction with other people and things. In the end, this PhD and all its contributors will elaborate methods for machine learning, develop motion capture devices for people with disabilities, create ontologies that define relations between musical concepts, and diffuse the results through social networks. I strongly believe that Interactive Music 3.0 can provide a real-world extension for musical interaction in mainstream nightclubs. As a musician, I have learned that, throughout the history of mankind, music and technology co-evolve, shaping and being shaped by human expression and creativity. The variety and intricacy of these recombination processes contribute profoundly to the current diversity of performative structures and meanings within the arts.



Figure 5 - Action Timeline. The timeline spans Semesters 1-8 and covers, in sequence: dissemination, use case description, alpha testing, gesture taxonomy, music composition strategy, gesture mapping, beta testing, mocap system assembly, usability testing, running the experiment, the questionnaire, data analysis, and further work. Activities are grouped under WP1 (dissemination), WP2 (theoretical & methodological framework), WP3 (motion capture), and WP4 (data collection & analysis).



References

Bayliss, A., S. Lock, and J.G. Sheridan. 2005. Augmenting expectation in playful arena performances with ubiquitous intimate technologies. Paper read at PixelRaiders 2, 6-8 April 2004, at Sheffield.

Bertini, Graziano, Massimo Magrini, and Leonello Tarabella. 2006. An Interactive Musical Exhibit Based on Infrared Sensors. In Computer Music Modeling and Retrieval, edited by R. Kronland-Martinet, T. Voinier and S. Ystad. Springer Berlin / Heidelberg.

Chin, Paul. 2006. The Value of User-Generated Content. Intranet Journal.

Futrelle, Joe, and Stephen Downie. 2003. Interdisciplinary Research Issues in Music Information Retrieval: ISMIR 2002. Journal of New Music Research 32 (2): 121-131.

Gates, Carrie, Sriram Subramanian, and Carl Gutwin. 2006. DJs' perspectives on interaction and awareness in nightclubs. In Proceedings of the 6th conference on Designing Interactive Systems. University Park, PA, USA: ACM.

Godøy, Rolf Inge, and Alexander Refsum Jensenius. 2009. Body Movement in Music Information Retrieval. Oslo: The University of Oslo.

Lee, Eric, Teresa Marrin Nakra, and Jan Borchers. 2004. You're the conductor: a realistic interactive conducting system for children. In Proceedings of the 2004 conference on New Interfaces for Musical Expression. Hamamatsu, Shizuoka, Japan: National University of Singapore.

O'Reilly, Tim. 2006. What Is Web 2.0. O'Reilly Network [cited May 30, 2009].

Ulyate, Ryan, and David Bianciardi. 2001. The interactive dance club: avoiding chaos in a multi participant environment. Paper read at NIME '01: Proceedings of the 2001 conference on New Interfaces for Musical Expression.

Wechsler, Robert. 2011. Palindrome Intermedia Performance Group [cited March 10, 2011].