All content following this page was uploaded by Dominique Fober on 14 September 2015.
Figure 4. Addressing scheme extension (an OSC address prefixed with hostname:port).
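The addressing extension can be summarized by a general form, inferred from Example 2 below (the names here are placeholders, not from the original figure):

    hostname:port/OSC/address parameters;

A message with no hostname:port prefix is handled locally, while a prefixed one is forwarded to the application listening at that destination.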
Figure 2. Alien Lands: a complex automatic music score.
3. MUSIC SCORE DESIGN USING MESSAGES

The basic principle for the description of a music score consists in sending OSC messages to the system to create the different score components and to control their attributes, both in graphic and time spaces.

Example 2

Initializes a score with a Guido Music Notation file [19] and sends a message to an external application listening on port 12000 on a station named host.adomain.net. The semicolon (;) is used as a message terminator in a script.

    /ITL/scene/score set gmnf 'myscore.gmn';
    host.adomain.net:12000/run 1;
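The same message-based principle applies to any score component. As a minimal hypothetical sketch (the object name title, the text, and the coordinate values are illustrative assumptions, not taken from the original), a textual element could be created and positioned in the graphic space with:

    /ITL/scene/title set txt 'Alien Lands';
    /ITL/scene/title x -0.5;
    /ITL/scene/title y -0.8;

Here set creates the component with its type and content, while the x and y messages control its graphic attributes, following the pattern of Example 2.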
3.2.2 Variables

Variables have been introduced to allow sharing of parameters between messages. A variable associates an identifier and a parameter list or a list of messages (Figure 5). Variables can be used as message parameters using the form $identifier.

    identifier = param

… has been extended to gesture events, described in section 6.3.

    Graphic domain    Time domain
    mouseDown         timeEnter
    mouseUp           timeLeave
    mouseEnter        durEnter
    mouseLeave        durLeave
    mouseMove

    /ITL/scene/obj watch mouseDown (
        /ITL/scene/obj x '$sx',
        /ITL/scene/obj y '$sy' );

Figure 6. Languages.
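Variables can factor out parameters shared by several messages. A hypothetical script (the identifier, object names, and values are illustrative assumptions) might share a single color between two objects:

    color = 200 200 200;
    /ITL/scene/obj1 color $color;
    /ITL/scene/obj2 color $color;

The value is thus written once and reused wherever $color appears.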
• $likelihood : gives the current gesture likelihood,

• $pos : indicates the current position in the gesture,

[3] T. Magnusson, "Algorithms as scores: Coding live music," Leonardo Music Journal, vol. 21, pp. 19–23, 2011.

[4] J. Freeman, "Bringing instrumental musicians into interactive music systems through notation," Leonardo Music Journal, vol. 21, no. 15-16, 2011.

[5] ——, "Extreme sight-reading, mediated expression, and audience participation: Real-time music notation in live performance," Comput. Music J., vol. 32, no. 3, pp. 25–41, Sep. 2008. [Online]. Available: http://dx.doi.org/10.1162/comj.2008.32.3.25

[6] D. Kim-Boyle, "Musical score generation in valses and etudes," in Proceedings of the 2005 conference on New interfaces for musical expression, ser. NIME '05. Singapore, Singapore: National University of Singapore, 2005, pp. 238–239. [Online]. Available: http://dl.acm.org/citation.cfm?id=1085939.1086007

[7] G. E. Winkler, "The realtime score. A missing link in computer music performance," in Proceedings of the Sound and Music Computing conference - SMC'04, 2004.

[8] J. Gutknecht, A. Clay, and T. Frey, "Goingpublik: using realtime global score synthesis," in Proceedings of the 2005 conference on New interfaces for musical expression, ser. NIME '05. Singapore, Singapore: National University of Singapore, 2005, pp. 148–151. [Online]. Available: http://dl.acm.org/citation.cfm?id=1085939.1085980

[9] N. Didkovsky, "Recent compositions and performance instruments realized in the java music specification language," in Proceedings of the 2004 international computer music conference, 2004, pp. 746–749.

[10] N. Didkovsky and P. Burk, "Java music specification language, an introduction and overview," in Proceedings of the International Computer Music Conference, 2001, pp. 123–126.

[11] H.-W. Nienhuys and J. Nieuwenhuizen, "LilyPond, a system for automated music engraving," in Proceedings of the XIV Colloquium on Musical Informatics, 2003.

[15] B. Höhrmann, P. Le Hégaret, and T. Pixley, "Document object model (dom) level 3 events specification," World Wide Web Consortium, Working Draft WD-DOM-Level-3-Events-20071221, December 2007.

[16] A. Allombert, "Aspects temporels d'un système de partitions musicales interactives pour la composition et l'exécution," Ph.D. dissertation, Université Bordeaux 1, Oct. 2009. [Online]. Available: http://tel.archives-ouvertes.fr/tel-00516350

[17] A. Cont, "Antescofo: Anticipatory synchronization and control of interactive parameters in computer music," in Proceedings of the International Computer Music Conference, ICMA, Ed., 2008.

[18] R. Hoadley, "Calder's violin: Real-time notation and performance through musically expressive algorithms," in Proceedings of the International Computer Music Conference, ICMA, Ed., 2012, pp. 188–193.

[19] H. Hoos, K. Hamel, K. Renz, and J. Kilian, "The GUIDO Music Notation Format - a Novel Approach for Adequately Representing Score-level Music," in Proceedings of the International Computer Music Conference. ICMA, 1998, pp. 451–454.

[20] F. Bevilacqua, B. Zamborlin, A. Sypniewski, N. Schnell, F. Guédy, and N. Rasamimanana, "Continuous realtime gesture following and recognition," in Embodied Communication and Human-Computer Interaction, ser. Lecture Notes in Computer Science, vol. 5934. Springer Berlin / Heidelberg, 2010, pp. 73–84.

[21] F. Bevilacqua, N. Schnell, N. Rasamimanana, B. Zamborlin, and F. Guédy, "Online gesture analysis and control of audio processing," in Musical Robots and Interactive Multimodal Systems, ser. Springer Tracts in Advanced Robotics, J. Solis and K. Ng, Eds. Springer Berlin Heidelberg, 2011, vol. 74, pp. 127–142. [Online]. Available: http://dx.doi.org/10.1007/978-3-642-22291-7_8