
An Enactive Approach to Digital Musical Instrument Design

Newton Armstrong

A DISSERTATION PRESENTED TO THE FACULTY OF PRINCETON UNIVERSITY IN CANDIDACY FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

RECOMMENDED FOR ACCEPTANCE BY THE DEPARTMENT OF MUSIC

November 2006

© Copyright by Newton Blaire Armstrong, 2006. All rights reserved.

Table of Contents

Abstract
Acknowledgements
1 Introduction
   1.1 The Disconnect
   1.2 Flow
   1.3 The Criteria of Embodied Activity
   1.4 The Computer-as-it-comes
2 The Interface
   2.1 Interaction and Indirection
   2.2 Representation and Cognitive Steering
   2.3 Computationalism
   2.4 Sensing and Acting
   2.5 Functional and Realizational Interfaces
   2.6 Conclusion
3 Enaction
   3.1 Two Persistent Dualisms
   3.2 Double Embodiment
   3.3 Kinds of Resistance
   3.4 Towards an Enactive Model of Interaction
   3.5 The Discontinuous Unfolding of Skill Acquisition
   3.6 Structural Coupling
   3.7 Conclusion
4 Implementation
   4.1 Mr. Feely: Hardware
   4.2 Mr. Feely: Software
   4.3 Mr. Feely: Usage Examples
   4.4 Prospects
5 Groundlessness
Bibliography

Abstract

Digital musical instruments bring about problems for performance that are different in kind to those brought about by conventional acoustic instruments. In this essay, I argue that one of the most significant of these problems is the way in which conventional computer interfaces preclude embodied modes of interaction. I examine the theoretical and technological foundations of this “disconnect” between performer and instrument, and sketch an outline for the design of embodied or “enactive” digital instruments. My research builds on recent work in human-computer interaction and “soft” artificial intelligence, and is informed by the phenomenology of Heidegger and Merleau-Ponty, as well as the “enactive cognitive science” of Francisco Varela and others. I examine the ways in which the conventional metaphors of computer science and “hard” artificial intelligence derive from a mechanistic model of human reasoning, and I outline how this model has informed the design of interfaces that inevitably lead to disembodied actional modes. I propose an alternative model of interaction that draws on various threads from the work of Heidegger, Merleau-Ponty, and the enactive cognitive scientists. The “enactive model of interaction” that I propose is concerned with circular chains of embodied interdependency between performer and instrument, instrumental “resistance” to human action and intentionality, and an integrative approach to the roles of sensing, acting and cognitive process in the incremental acquisition of performative skill.


The final component of the essay is concerned with issues of implementation. I detail a project in hardware and software that I present as a candidate “enactive digital musical instrument,” I outline some specific usage examples, and I discuss prospects for future work.


Acknowledgements

This paper would have been a much bigger mess were it not for the timely contributions of a number of people. In particular, I have benefited from the very careful readings and insightful criticisms of my advisor, Barbara White, and my first reader, Dan Trueman. Paul Lansky has uttered more wise words than I could count, and he has changed my mind about many things during my time at Princeton (although, as far as I can tell, that was never really his intention). Perry Cook has taught me a great deal about interaction, both in his classes and in the approach to design that he takes in his own projects. He has been an outstanding role model in terms of bridging the gap between theory and practice, and knowing when it's time to just sit down with a soldering iron.

I have also benefited greatly from conversations with other graduate students while at Princeton. In particular, I'd like to thank Ted Coffey, Paul Audi, Mary Noble, Seth Cluett, Scott Smallwood and Ge Wang, each of whom has given me feedback on my work, in the form of both critical readings and more casual conversations about the core topics. I'm also grateful to the other composers in my intake year: Paul Botelho, Stefan Weisman and Miriama Young. Together we represent a diverse group, but there has been a considerable and on-going interest in each other's work, and this interest has been borne out in tangible forms of support for our respective projects and activities.

The history of electronic music performance goes largely without mention in my paper. But the research would not have been possible in the first place were it not for those practitioners, from David Tudor to Toshimaru Nakamura, who would question the hidden nature of electronic media in order to uncover not just new sounds, but new potentialities of the body. I am indebted to all those electronic performers whose work I have engaged, whether through written accounts and recordings, or through personal contact and performance collaborations. Although my fingers are rusty from typing, I'm looking forward to rejoining the ranks of the improvising community in a less part-time capacity.

1 Introduction

Electronics for its own sounds' sake is a resource that one would be stupid to dismiss, because of its particular virtues and its particular defects. But the implication is irrelevant, even misleading. It says that music is a pure art of sound, for people with ears, but with little else—no eyes, no nerve endings anywhere but the ears, no interrelated functions. It says that the functional shape of an instrument is not important as a sculptural object, that the physical, sensual vision of the playing of it is no longer required, and that the techniques developed on it are obsolescent. And as a matter of fact much electronic music leaves the impression that this IS the attitude in which sounds are composed.
— Harry Partch, Some New and Old Thoughts After and Before "The Bewitched"

1.1 The Disconnect

A wooden wheel placed on the ground is not, for sight, the same thing as a wheel bearing a load. A body at rest because no force is being exerted upon it is again for sight not the same thing as a body in which opposing forces are in equilibrium.
— Maurice Merleau-Ponty, The Phenomenology of Perception

The mid-1990s marked a juncture in the short history of computer music. For the first time, the personal computer was becoming fast enough to be used as a realtime synthesizer of sound, a capability that had previously been the reserve of special purpose machines that were for the most part inaccessible to people working outside an institutional framework. In the years since the mid-1990s, there has been a rapid proliferation of new software and input devices designed specifically for musical performance with general purpose computers, and a burgeoning corpus of new theories, performance practices and musical idioms has emerged in tandem with the new technologies. But while the widespread availability of the personal computer to the first world middle class has resulted in the medium finding its way into any number of new and diverse musical contexts, the question as to whether the computer should properly be considered a musical instrument continues, at least in certain quarters, to generate some controversy. More often than not, these controversies revolve around the relationship between the human performer and the performance medium. Or, more specifically, they revolve around an apparent lack of embodied human presence

and involvement in computer music performance practice.

Those who complain about the current state of computer music performance practice reveal something of their assumptions and expectations as regards musical performance: that the involvement of the performer's body constitutes a critical dimension of the practice, and that for such an involvement to be tangible to the audience, it's necessary that that audience picks up on somatic cues that signal the point of origin, in real time and real space, of the sounds they are hearing. The complainants argue that the performer is either absorbed in near-motionless contemplation of the computer screen—the repertoire of performance gestures not substantively different from those that comprise any routine interaction with a personal computer—or that there is a high degree of arbitrariness to the performer's actions, where the absence of any explicit correlation between motor input and sonic output results in a disassociation of performer from performance medium. In both instances, what is witnessed is a disconnect, between performer and audience, and between performer and instrument.

Defenders of the "near-motionless" school of computer music performance have suggested that complaints such as these arise not because there is something substantive missing from the interaction between performer and performance medium, but because conventional expectations as regards the constitutive elements of musical performance have not yet caught up to an essentially new performance practice (Cascone 2000, Stuart 2003). The argument has it that the computer, considered as a performance medium, brings a unique set of issues and concerns to the problem of musical performance, and that the attributes of the medium necessitate a break with established instrumental conventions, and the modes of performance that are attendant to those conventions. It's been suggested that those who take issue with the apparent lack of human motor involvement in current computer music performance practice reveal a mindset "created by constant immersion in pop media" (Cascone 2000: 101-102), and that the emergence of the new performance paradigm signals a shift away from the locus of the body of the performer; the "object of performance" is instead transferred to the ears of the audient, who needs to relearn "active" modes of listening, or "aural performativity" (Stuart 2003).

On both sides of the argument over the state of computer music performance practice, then, there is a suggestion that something is missing. What distinguishes one side from the other is where that missing something is located: with the performer, or with the audient. It's difficult to defend either position, based as they are on speculative assessments of the receptive habits and practices of listeners, and of the expectations and assumptions of audiences. But what can be called into question is the implied corollary to the apologist's claim that the burden of responsibility lies with the audient: that computer music performance practice, as it stands, is already mature. In this essay, I take the opposite position: that computer music performance practice remains both theoretically and technologically under-developed, and that most of the interesting and significant work in the field remains to be done.

In certain respects, then, the present study is a legitimation of the complaints being uttered against the current state of computer music performance practice. But more pressingly, it is borne out of frustration as a computer music performer. Despite investing a number of years in the development of both hardware and software designed specifically for performance, I've found that the performance medium has in all but a few instances managed to maintain a safe distance, corroborating (from the shaky perspective of first person phenomenal experience) the complaint of the disconnect. Unlike the apologists for the currently predominant modes of computer music performance practice, I've come to believe that there is something intrinsic to the computer, something embedded in the medium itself, that is the cause of all this; something that necessarily and inevitably brings about a disconnect. If it turns out that that is in fact the case, then the medium deserves to be examined, in order to determine what can be done to engender the technical conditions from which an embodied performance practice might arise.

I'm going to suggest that the perceived disconnect, or "missing dimension," that certain people have been complaining about is not due to a conditioned desire for spectacle, or an ingrained expectation that an explicitly causal relation is witnessed between performance gesture and sonic result. Rather, it seems to me that there is something more fundamental to the issue: that an engaged and embodied mode of performance leads to a more compelling, dynamic, and significant form of music making, for the performer, for the audience, and for the social space that they co-construct through the performance ritual. If the attributes of the computer preclude such a mode of performance, then the medium effectively guarantees that an embodied coupling of human and instrument—a coupling that creates the possibility of engaged and involved experience—never quite takes place.

1.2 Flow

The matter of music is sound and body motion.
— Aristides, De Musica

Performers of conventional acoustic instruments often talk of the sense of flow they experience while playing.1 It's a way of being that consists in the merging of action and awareness, and the loss of any immediate sense of severance between agent (the performer) and environment (the instrument, the acoustic space, the social setting, and other providers of context). It's the kind of absorbing experience that can arise in the directed exchange between an embodied agent and a physical mechanism, and it's a coupling that happens as a matter of course with acoustic instruments. Conventional acoustic instruments offer resistance to the body of the performer, and their responses are tightly correlated to the variety of inputs from the performer's body that are afforded by the mechanism.2 In a sequence of on-going negotiations between performer and instrument, the performer adapts to what is uncovered in the act of playing, continually developing new forms of embodied knowledge and competence. Over a sustained period of time, these negotiations lead to a more fully developed relationship with the instrument, and to a heightened sense of embodiment, or flow.

1 For a more complete account of "flow," in the sense that I will use the term, see Mihaly Csikszentmihaly, Flow: The Psychology of Optimal Experience (Csikszentmihaly 1991). For a concise summation of the applicability of Csikszentmihaly's ideas to instrumental performance, see Burzik's "Go with the flow" (Burzik 2003).

2 The notion of "affordance" was introduced by psychologist James Gibson (Gibson 1977, 1979). In the Gibsonian sense, an affordance is an opportunity for action that the environment presents to an embodied agent. As such, the term accounts for the particular physical and perceptual attributes and abilities of the agent, as well as her intentionality. To borrow an example from Andy Clark: "...to a human a chair affords sitting, but to a woodpecker it may afford something quite different" (Clark 1997:172).

Performance with a conventional acoustic instrument serves as a useful example of an embodied mode of human activity, and of an engaged coupling with a complex physical mechanism. But in the context of the present study, I'm not specifically interested in appropriating the conventions of acoustic instrumental practice for computer music, or in modeling acoustic instruments in the digital domain. Along with those writers who would proclaim the advent of a new computer music performance practice, I hold that the computer, considered as a performance medium, presents new and unique problems and prospects. But where those writers focus on the shortcomings of the audience, I focus on the shortcomings of current theory and technologies, and on the body of the performer—not because of the body's historical coupling to conventional instruments, but because I choose to conceive of the body as a site of possibility and resistance, and because it seems that the computer has a way of both limiting the body's possibilities and diminishing its potential for resistance.

There are attributes, then, to the experience of playing a conventional acoustic instrument that are pertinent to thinking about the design of digital musical instruments that would allow for embodied modes of performance. The optimal performative experience—this somewhat intangible and elusive notion of flow—could be characterized as a way of being that is so direct, immediate and engaging, that the normative senses of time, space and the self are put temporarily on hold. It amounts to a presence and participation in the world, in which meaning and purpose arise not through abstract contemplation, but directly within the course of action. Such action involves complex and continuous exchanges and interactions between senses, the nervous system (including the brain), the motor system (muscles), and the social and physical environment in which the ritualised act of performance is embedded. In short, the experience of flow, or of a heightened sense of embodiment, involves an immediately palpable feeling of active presence in a world that is directly lived and experienced, in experiential real time and real space. Traditional though it may seem, these are qualities that I believe are central, and will remain central, to musical performance. If the computer is going to figure as a musical instrument, and if it does not presently lend itself to embodied forms of interaction, then some work needs to be done.

1.3 The Criteria of Embodied Activity

Over the course of this essay, I'll return to what I take to be the five key criteria of embodied musical performance, or, more specifically, the five key criteria of the particular kind of embodied mode of interaction with digital musical instruments that I hope to uncover through outlining a philosophically informed approach to instrument design. Those criteria are:

1. Embodied activity is situated. Embodiment arises contextually, through an agent's interactions with her environment, and in her relationship to it. The agent must be able to adapt to changes in the environment, without full prior knowledge of the features of the environment, or of its structure and dynamics.

2. Embodied activity is timely. Real world activity involves real-time constraints, and the agent must be able to meet these constraints in a timely manner. This means that it is incumbent on the agent to not disrupt the flow of activity because her capacity for action is too slow.3

3 David Sudnow uses a nice example of untimely behavior in Ways of the Hand: "Recall Charlie Chaplin on the assembly line in Modern Times: the conveyor belt continuously carrying a moving collection of nuts and bolts to be tightened, their placements at regular intervals on the belt, Chaplin holding these two wrenches, screwing bolts faster to stay ahead of the work, missing one or two along the way, rushing to catch up, falling behind the time, because the upcoming flow seems to gain speed and he gets frantic, or because it actually does speed up, eventually caught up in the machine and ejected onto the factory floor in his hysterical epileptic dance." (Sudnow 2001:32-3)

3. Embodied activity is multimodal. A large portion of the agent's total sensorimotor capabilities are galvanised in performance. This involves optimising the use of the body's total available resources for cognition, action and perception, with an emphasis on the concurrent utilisation of distinct sensorimotor modalities, as well as the potential for mutual interaction, or cross-coupling, between those modalities.

4. Embodied activity is engaging. The sense of embodiment arises when the agent is required by the task domain, i.e., the environment is incomplete without the involvement of the agent, and it presents challenges to the agent that consume a large portion of her attention.

5. The sense of embodiment is an emergent phenomenon.4 That is, optimal embodied experience arises incrementally over a history of sensorimotor performances within a given environment or phenomenal domain. There is a link between increasing sensorimotor competence within the task domain and the sense of embodiment.

4 This criterion could perhaps have been condensed into the phrase "embodiment is an emergent phenomenon." But this is potentially misleading, as embodiment is a given for biological systems; i.e., living organisms do not emerge into their bodies. The sense of embodiment, then, is phenomenal, whereas the fact of embodiment is objective. The implications of this double sense of embodiment—of its "inner" and "outer" aspects—are explored in Chapter 3.

Borrowing from cognitive scientists Francisco Varela, Evan Thompson and Eleanor Rosch (Varela, Thompson, and Rosch 1991), I will refer to the embodied mode of performative activity I'm outlining here as enactive. I'll address the concept of enaction in more depth in Chapter 3, but for the time being it's useful to emphasize the centrality of the body to the enactive model of cognition. In the enactive view, cognition is fundamentally an embodied phenomenon; it arises through and within an agent's physical interactions with her environment. In contrast to orthodox views of mental process that view cognition as the internal mirroring of an objective external world, the enactive perspective takes the repeated sensorimotor interactions between the agent and the environment as the fundamental locus of cognitive development. This encompasses the dynamics of the experiential present, that which is ineluctably the "now" of lived experience, i.e., of an instantaneous conceptual and corporeal disposition within a given environment, but it also encompasses the emergence and development of knowledge and competence, i.e., the cognitive dimension of activity. The experiential present plays a determining role in the emergence of cognitive systems and structures, and cognitive systems and structures, in turn, play a determining role in constituting the "now." It's an ongoing, circular, and fully reciprocal process of mutual determination and specification, in which subjectivity and the sense of embodiment are in a continuous state of flux. This model of cognition, with its emphasis on bodily involvement in the "bringing forth of a world,"5 provides a template for the performance practice that I hope will emerge from this study.

5 The expression is borrowed from Varela, Rosch and Thompson's The Embodied Mind (Varela, Thompson, and Rosch 1991).

1.4 The Computer-as-it-comes

A number of authors (Agre 1995, 1997, Clancey 1997, Dourish 1999, 2001, Stein 1999, Winograd and Flores 1986) have shown that it is no easy task to design computing devices that would allow for embodied modes of interaction, or that lead to a heightened sense of embodiment over a history of interactions. The prevailing guiding metaphors of computer science (CS) and human computer interaction (HCI)6 are at odds with the embodied/enactive approach, and routinely preclude modes of interaction that are situated, timely, multimodal, and engaging. Lynn Andrea Stein has suggested that it was a matter of historical contingency that saw the computationalist approach hold sway in the formative days of computer science:

Cybernetics took seriously the idea of a computation embedded in and coupled to its environment. These were precisely the issues suppressed by the computationalist approaches. In the intellectual battles of mid-century, cybernetics failed to provide the necessary empowerment for the emerging science of computation and so was lost. The nascent field of computational science was set on a steady path, dominated by the computational metaphor, but its connections to the world around it were weakened. (Stein 1999:482)

6 The "prevailing guiding metaphors" of CS and HCI—i.e., the epistemological underpinnings of what I have labelled "conventional" CS and HCI—will be outlined in terms of a computationalist ontology in Chapter 2.

And while the subset of computing devices that is of specific interest to this essay—digital musical instruments—is these days comprised of a vast and diverse array of implementations, the field as a whole has not been immune to the guiding metaphors of conventional CS and HCI. This is not to say that all digital musical instruments have failed to realize the potential for embodied modes of interaction. Rather, those instruments that have managed to realize this potential have done so despite the conventional tenets of CS and HCI.

It may be useful to distinguish between two main currents in present day computer music performance practice. The first of these would take the personal computer more or less as it comes (with minimal or zero additions to the standard input devices), and would normally be characterized by the "near-motionless" mode of performance described earlier in the chapter. This practice is often encapsulated under the rubric of "laptop music,"7 and has given rise to a so-called "laptop aesthetic" (Jaeger 2003). The second of the two currents is defined precisely through its non-acceptance of the "computer-as-it-comes" as a musical instrument. Rather, the practitioners seek to extend computing devices, or even completely reconfigure them, through the development and integration of new technologies designed specifically for musical performance. This is the field of activity to which my own work belongs. A third current could also be identified, of "extended acoustic instruments," in which the computer is used as a signal processing add-on or improvising partner to a conventional acoustic instrument. But as the presence of the acoustic instrument already invokes the potential for embodied performance, this area of practice is not of specific relevance to the present study.

7 The term "laptop music" surfaced in the second half of the 1990s, at around the same time that the first "laptop performers" began to appear. For a diverse range of assessments of laptop performance practice and its reception, see the articles collected in Contemporary Music Review 22 (4), 2003.

The "computer-as-it-comes" is a term that will appear throughout this essay. What I intend to denote is not so much a specific device (although it could be), but rather a general notion of the more or less generic personal computer: the technological instantiation of the conventional guiding metaphors of CS and HCI. This is the computer that "laptop music" adopts wholesale into its performance practice, and the same computer that those working towards "digital musical instruments" would seek to re-engineer in order to arrive at embodied modes of performance. I believe that the most pressing issues in arriving at designs that allow for embodied forms of musical interaction with computers are philosophical, and that in order to arrive at sustainable designs for enactive instruments, the limits and potentialities of the current computational media—i.e., the defining attributes of the computer-as-it-comes—need to be examined in philosophical terms.

There has been a great deal of activity in recent years in the development of new digital musical instruments. There has also been a steadily growing corpus of scholarly articles, research papers and theses on issues in live computer music. While this has led to numerous innovations in both the theory and technology of computer music performance, there remains a near total absence of work related specifically to the philosophical foundations of instrument design. The tendency in digital musical instrument design has been to focus on the pragmatic issues of design: specific sensor and actuator technologies, audio synthesis methods, mapping strategies, models of interaction, and so on. While there is a great deal of overlap between the pragmatic and the foundational issues, in this essay I focus more on the theoretical and foundational issues of design, with a view to providing a conceptual touching stone for the pragmatic stage. Without addressing these issues at some point, there will, of course, be no digital musical instruments of which to speak. But without proper attention to the foundational issues, there is a greater likelihood that designers will unwittingly fall back on the received tenets of CS and HCI, even though those tenets may (and more often than not will) work against the bringing into being of enactive instruments.

The personal computer brings with it a sizable repertoire of usage conventions, and, all too regularly, designers end up drawing on the conventional patterns of use without proper consideration of the implications of those patterns for the end user. As I will endeavour to show, these implications are philosophical in origin, reflecting world views, and models of behavior and cognition, that are immanent in designs, and, in turn, in the technological artifacts that result from those designs. If a medium precludes a desired usage—an embodied mode of interaction, for example—and if it does so because of the world models that are embedded in its very mechanism, then that medium needs to be examined with a philosophical perspective in order to arrive at a better understanding of the ways in which it determines its patterns of use. This is the first step towards rethinking and reconfiguring those patterns, and towards arriving at designs that are more fully and properly geared towards the requirements and desires of embodied human actors.
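One of the pragmatic design issues named in this chapter, the choice of mapping strategy between sensor input and synthesis parameters, can be made concrete with a small sketch. The example below is hypothetical: the sensor and parameter names are invented for illustration, and a real instrument would involve many more dimensions. The point is only that the same two sensor signals can be wired to synthesis parameters either one-to-one or many-to-one, and that this choice, rather than the sensor hardware itself, shapes what the performer's body can do.

```python
# Hypothetical mapping strategies for a two-sensor digital instrument.
# Sensor and parameter names are invented for this sketch; all values
# are assumed to be normalized to the range 0.0-1.0.

def one_to_one(pressure, tilt):
    # each sensor drives exactly one synthesis parameter
    return {"amplitude": pressure, "filter_cutoff": tilt}

def many_to_one(pressure, tilt):
    # several sensors converge on a single parameter, loosely analogous
    # to acoustic instruments, where loudness and timbre respond jointly
    # to breath, bow pressure, embouchure, and so on
    brightness = round(0.7 * pressure + 0.3 * tilt, 3)
    return {"amplitude": pressure, "filter_cutoff": brightness}

print(one_to_one(0.8, 0.2))   # {'amplitude': 0.8, 'filter_cutoff': 0.2}
print(many_to_one(0.8, 0.2))  # {'amplitude': 0.8, 'filter_cutoff': 0.62}
```

In the many-to-one case the performer can no longer address brightness in isolation, which is precisely the kind of cross-coupled resistance that the enactive criteria above associate with acoustic instruments.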

2 The Interface

Musical ideas are prisoners, more than one might believe, of musical devices.
— Pierre Schaeffer, Traité des Objets Sonore

2.1 Interaction and Indirection

Interaction takes place when signals are passed back and forth between two or more entities. Interactions between a human and a computer are conducted through an interface. The interface provides the human with a means of access to the programs running on the computer; in the first instance, it consists in providing an appropriate abstraction of computational data and tasks to the user. Input devices (such as keyboards and mice) capture signals from the user that are mapped, through the interface abstraction layer, to changes in the state of computer programs. Output devices (such as monitors, loudspeakers and printers) transmit human-decodable representations of the state of the running programs from the computer back to the user. Human-computer interface design is therefore concerned with providing the user with a set of usage practices, protocols and procedures appropriate to the task domain for which the interface is required.
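The loop that this section describes, in which user signals are captured by input devices, mapped through an abstraction layer to changes in program state, and program state is rendered back as human-decodable representations, can be sketched in a few lines of code. The sketch is purely illustrative: the class, device, and parameter names are invented for the example and are not drawn from any real interface toolkit.

```python
# A minimal, purely illustrative sketch of an interface abstraction layer.
# All names here are invented. The point is structural: the user's signal
# reaches program state only indirectly, via a mapping, and what the user
# sees is a representation of that state, not the computation itself.

class Interface:
    def __init__(self):
        # program state: the user never touches this directly
        self.state = {"volume": 0.5}

    def handle_input(self, device, signal):
        # input devices capture signals that are *mapped* to state changes
        if device == "mouse_wheel":
            new_volume = self.state["volume"] + 0.01 * signal
            self.state["volume"] = min(1.0, max(0.0, new_volume))

    def render(self):
        # output devices transmit human-decodable representations of state
        bars = int(self.state["volume"] * 10)
        return "volume: [" + "#" * bars + "-" * (10 - bars) + "]"

ui = Interface()
ui.handle_input("mouse_wheel", 25)  # the user turns a wheel...
print(ui.render())                  # ...and sees a representation of the result
```

Even in this toy form, the indirection at issue is visible: the gesture (a wheel turn) and the task (adjusting a volume value) meet only in a mapping that the designer chose to write.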

One thing that distinguishes the computer from tools such as, say, the canonical example of the hammer,1 is the absence of any direct correlation between the physical domain in which a computational task is carried out, and the way in which that task is conceived by the user. The hammer, considered as an interface, is correlated within the user's cognitive apparatus to the physical act of hammering. The physics of computational media consists in the regulated flow of electrons through circuits, and the human agent does not interact with those circuits in any kind of physically direct manner. But a computer user's interactions with a computer are rarely, if ever, correlated within the cognitive apparatus to the electrical phenomena that constitute the physics of computation. This sets the medium apart not only from the hammer, but from the overwhelming majority of tools that humans use, including conventional acoustic musical instruments. It follows that interactions with a computer are necessarily indirect. Rather, in order for significant interactions to take place, the interactional domain needs to be designed: its physical operations are abstracted, and the task domain is presented to the user in the form of graphical and auditory representations. Already, then, we see the "disconnect" between agent and medium, in the distance that the interface imposes between the human and the computer. In order to accomplish meaningful tasks with computers, input devices need to

1 The hammer example has figured large in philosophy of technology and media theory since its appearance in Heidegger's Being and Time (Heidegger [1927] 1962) and "The Question Concerning Technology" (Heidegger [1949] 1977). For an interesting analysis of the role that the hammer has played within these discourses, see Don Ihde's Instrumental Realism (Ihde 1991).

be mapped to tasks and procedures in software, and software data need to be transmitted to the user in the form of representations. The overriding goal of conventional human-computer interface design is to reduce the inevitable distance between agent and medium,2 ideally to the extent that the user comes to conceive of the task domain directly in the terms of the representations that comprise the interface. More radical approaches, such as tangible and ubiquitous computing, would seek to embed computing devices directly (and invisibly) within the user's environment (Dourish 2001, Greenfield 2006, Norman 1999, Ullmer and Ishii 2001, Weiser 1988, 1991, 1994, Weiser and Brown 1996). To a certain extent, reducing the degree of indirection between agent and medium is also the goal of the present study. But an enactive model of interaction will require an entirely different approach to that taken by conventional HCI.

2.2 Representation and Cognitive Steering

Things is what they things.
— π.o. (printed on a coffee mug)

The computer-as-it-comes packages interface abstractions into representational frameworks—i.e., aggregated metaphorical schema—that are customarily (though somewhat inaccurately) characterized as software. Users of

2 This is the express goal of so-called "direct manipulation" interface models (see 2.5 below).

personal computers are familiar with the now standard interface metaphors for the routine management and maintenance of their computer systems: files, folders, desktops, workspaces, trash cans, and the like. The interface amounts to a model of the world, extrapolated from a real-world task environment that is likely familiar to the user; an encompassing system of metaphors that serves to both guide and regulate the agent's thoughts and activities through intrinsic correspondences to everyday objects and activities. It's a suite of bureaucratic abstractions that serves to facilitate bureaucratic work, keeping the play of regulated voltages—the physical agency through which that work is actually accomplished—well out of the user's immediate zone of awareness.

It's an unusual transaction that takes place between the designers of computer interfaces and the end users of those interfaces. Models of the world are born out of philosophical systems. However well-formulated or defined those philosophical systems may be, and however conscious a designer may be of the philosophical underpinnings of the decisions made during the course of design, the transition from design to artifact nonetheless remains loaded with epistemological implications for the end user. Through the set of interactions made available by whichever incidental pre-packaged representational world, the user participates in whichever incidental model of the world happens to be implicit to the design. This is an unavoidable side-effect of indirection, and, despite a great deal of attention within the fields of interaction design and

technology studies,3 it's a side-effect that remains beyond the bounds of consideration for a large number of designers, and an even larger number of end users. As Philip Agre has put it, "technology at present is covert philosophy" (Agre 1997: 240).

As the interface delineates the conceptual milieu to the user, it orients the user's cognitive activity. Through a chain of subtle reciprocal influences, a set of implicit assumptions as regards the elements and structure of the task domain begins to solidify. In a similar vein to Pierre Bourdieu's notion of the habitus, there comes to exist "a durably installed generative principle of regulated improvisations" (Bourdieu 1977: 78). Through repeated performances, then, the repertoire of meaningful performance actions becomes more or less fixed in bodily habit. This is what Merleau-Ponty defines as an incorporating practice, a process in which actions are literally incorporated—i.e., registered in corporeal memory—through repeated performances (Merleau-Ponty [1945] 2004). These bodily habits do not so much comprise a catalogue of discrete and distinct states as they do a collection of dispositions and inclinations: arrangements within which the agent is potentially free to move, but which at the same time determine the structure and dynamics of those movements. To the same extent that an interface encapsulates a model of the world, it encapsulates a model of

3 In particular, see Heidegger, "The Question Concerning Technology" (Heidegger [1949] 1977); Feenberg, Critical Theory of Technology (Feenberg 1991); and Agre, Computation and Human Experience (Agre 1997).

performance. Merleau-Ponty's concept of incorporation is consistent with the enactive model of cognition. In the enactive view, "cognitive structures emerge from the recurrent sensorimotor patterns that enable action to be perceptually guided" (Varela, Thompson, and Rosch 1991: 172-173). This formulation is essentially a latter day reworking of the fully recursive process, encompassing incorporating practices, that Merleau-Ponty defined as the intentional arc4 (Merleau-Ponty [1945] 2004). In this feedback loop at the heart of the enactive view, there is a high degree of reciprocal determination and specification between perception, action, cognition, and the contingencies of the environment in which perception, action and cognitive process are embedded. These dual aspects are inextricably intertwined; they are mutually reinforcing. Or to put it another way: over the history of an agent's interactions, at the same time that repetitive dispositions towards action and modes of perceiving are engendered within the agent's sensorimotor mechanisms, the systems and structures that play a determining role in the formation of cognitive patterns are in turn determined by the emergent patterns of interactional dynamics. If we accept that these dependencies are real, then it will make little sense, when examining an interactional domain with a view to the emergence of cognitive and performative patterns, to draw a hard

4 In Varela, Thompson and Rosch's The Embodied Mind (Varela, Thompson, and Rosch 1991)—the book in which "enactive cognitive science" is first outlined—the authors acknowledge their debt to Merleau-Ponty's phenomenology. See in particular the book's introduction and opening chapter.

dividing line between action and cognition, or between the mind and the body. It will also make little sense to examine computer interfaces, and the metaphorical schema that those interfaces encapsulate, without due regard to their contingencies and particularities. These are important concerns not only when arriving at new designs, but also when looking at the consequences of existing designs for performance, and the potential implications for the thoughts and actions of the people who will interact with them.

It would seem that the more closely we examine the interface in use, the more quickly the common notion of the interface as a passive and impartial means to an end begins to break down. We come to see that it is far from transparent to the task domain to which it is applied, and we begin to understand it "not as an add-on which allows a human to come into relations with an underlying structure, but rather as constitutive of that very structure" (Hamman 1997: 40). At the same time that the boundaries of the user's potential repertoire of actions and perceptions are determined by the epistemological underpinnings of the representations that comprise the interface, the interface reveals itself as embodying a theory of knowledge and performance. But it's how this theory of knowledge and performance is embodied in the interface that is of specific interest to this study.

The personal computer arrives from the vendor prepackaged with a vast collection of programmed responses; the user adds to these with the installation of new software, and accomplishes tasks through the agency of the now standard input and display devices—the keyboard, the mouse, the monitor, and the loudspeakers. The affordances of the computer-as-it-comes determine the limits of what is possible within any incidental task domain, and the user comes to learn, from one piece of software

to the next, the kinds of behaviors and outcomes that might be expected to come about as a result of her regulated interactions with the medium, through the agency of software abstractions, and input and display devices.

Before heading straight to the drawing board, it's worth considering what it is, exactly, about the computer-as-it-comes that sways the user into a routine-oriented mode of activity, and thereby precludes the potential for embodied and enactive modes of interaction. It may well be that for the majority of tasks for which personal computers are routinely used, the computer-as-it-comes is a perfectly adequate medium. The predominant guiding metaphors of human-computer interface design are geared towards routine forms of activity. But I will endeavor to show that it is precisely the models of activity that are embedded in the interface to the computer-as-it-comes that preclude the sense of optimal embodied experience—the sense of flow—that can arise in complex real-time activities such as musical performance with conventional acoustic instruments. For complex, situated, embodied and real-time forms of activity, we are in need of new metaphors, new ways of thinking about design, and new technologies.

2.3 Computationalism

While little has been written about the philosophical basis of interaction design with specific regard to digital musical instruments—or even, for that matter, with regard to personal computers in general—it's nonetheless a topic that has received some considerable attention, particularly over the last fifteen years, in

artificial intelligence (AI).5 Having accomplished so little of what the pioneers of the field promised in the 1960s, AI theorists and practitioners have been forced to critically re-examine the institutionally endorsed models of perception, action and reasoning that originally appeared to have such vast potential. This has led to some important questions being raised as regards the traditional foundations of interaction design, as well as the various philosophical assumptions on which those foundations are built.

As a succession of AI implementations would bear out, symbolic representations of real-world task domains must take into account a huge number of environmental variables if the artificial agent-at-large is to be endowed with even a sub-insect capacity for sensing and locomotion. As the complexity of the agent's environment increases, the number of environmental variables also increases. In turn, the number of conditions that must be encoded in the agent's representation of the environment increases in geometric proportion. Given an environment of incrementally increasing complexity, it does not take long before the computational load on the artificial agent ensures against its being capable of the rapid real-time responses that we witness in the various creatures that inhabit the real world. Moreover, the agent has no capacity for responding to features or obstacles that appear in the environment unexpectedly, as each new object requires that a new representation be added, by an engineer, to the agent's model of the world.6

It was precisely these kinds of problems that prompted a small faction of AI researchers to question the very principle of symbolic representation.7 This would be no simple task. They would be arguing against the guiding rubric of computer science, conventional AI, and so-called "hard" cognitive science, that has variously been labeled "mentalism" (Agre 1995, 1997), "the computational metaphor" (Stein 1999), and "computationalism" (Dietrich 1990, Scheutz 2002). Computationalism is the term that I will use, as since the advent of the Church-Turing thesis (Church 1932, 1936, Turing 1936) computation has largely been conceived as the algorithmically codifiable manipulation of symbols, where those symbols stand in for objects and operations in the world. But even this notion of computation—the originary notion of computer science—is itself already grounded in an older notion, namely, the mechanistic explanation of the 17th century. The breakaway AI researchers would be arguing, then, not only with the accepted wisdom of the field, but with Descartes, Leibniz, Hobbes, Locke and Newton.

5 In particular see Haugeland (1985), Winograd and Flores (1986), Brooks (1991), Dreyfus (1992), and Agre (1997).

6 For an interesting overview of the various problems posed by the symbolic representation approach in AI, see the introduction to Andy Clark's Being There (Clark 1997).

7 The first viable alternative to the symbolic representation approach is outlined in Rodney Brooks' "Intelligence without representation" and "New Approaches to Robotics" (Brooks 1991, 1991).
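The "geometric proportion" at issue is easy to make concrete. In the simplest case, a world described by n independent binary features already presents 2^n distinguishable states that a symbolic world model must be prepared to discriminate. The following toy calculation is my own illustration, not taken from the AI literature cited above:

```python
# Toy illustration of the scaling problem: with n independent binary
# environmental features, the number of distinct world states a symbolic
# model must be able to represent doubles with every added feature.

def world_states(num_features: int) -> int:
    return 2 ** num_features   # each feature is either present or absent

for n in (4, 8, 16, 32):
    print(f"{n} features -> {world_states(n)} states")
```

Even a modest environment quickly outruns any hope of exhaustively searching such a space within the response times of a real-world creature.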

At the heart of the computationalist perspective is the presumption that we reason about the world through mechanized procedure, i.e., through the deductive manipulation of symbols that stand in for objects and operations in the world. Mental activity, therefore, consists in extrapolating data from the world, coding abstractions from that data, and reasoning about the representational domain that those abstractions comprise. Essentially, the computationalist rubric would have it that computation is synonymous with cognition. This is, by and large, the way in which we program computers to simulate real-world problems and dynamics, and the successes of computer science can make it rather easy to anthropomorphize the process of computation; to see in the mechanical procedure a simulacrum of human thought. It's such a view that provided the original impetus of AI research, and has led to what Agre has termed "a dynamic of mutual reinforcement … between the technology of computation and the Cartesian view of human nature, with computational processes inside computers corresponding to thought processes inside minds" (Agre 1997: 2).

It's beyond the scope of the present study to enter into what remains a major debate in the philosophy of mind and cognitive science over the mechanistic foundations of thought. I will however argue that the tacit acceptance of the computationalist approach will prove to be a stumbling block in the design of computer interfaces for musical performance, in much the same way that it has already proven to be a stumbling block in the design of artificial agents. If, as Andy Clark has noted, and as the failings of AI would bear out, "symbol manipulation is a disembodied activity" (Clark 1997: 4), then the computer-as-it-comes—a materialization of the computationalist paradigm—already precludes

the possibility of embodied forms of interaction. With the current state of knowledge about the workings of the nervous system, there is no precise way of determining whether this is a matter of a symbolic overload, or a "representational bottleneck" (Brooks 1991). But what can be seen in the computationalist model of representation is a fundamental objectivism in which the reasoning of the agent, whether human or artificial, is situated above and outside the environmental embedding of the agent's body. The agent is, therefore, a kind of transcendental controller, coding abstractions and reasoning about a world that forever remains exterior to cognitive process. In other words, the agent performs manipulations on symbolic representations of the task domain in a realm of mental abstraction that is always and necessarily disconnected from the environmental niche in which activity actually takes place.

There is, then, an essential dualism at the heart of the computationalist model of cognition. But it's a specific variety of dualism, one that sets an "inside" against an "outside." It corresponds to a manner of thinking about the world that George Lakoff and Mark Johnson have identified as the container metaphor:

We are physical beings, bounded and set off from the rest of the world by the surface of our skins, and we experience the rest of the world as outside us. Each of us is a container, with a bounding surface and an in-out orientation. (Lakoff and Johnson 1980: 29)

From this ontological grounding, the container metaphor extends to the various ways in which we conceptualize time and space, actions, activities and states, and events. For the human agent, such conceptualizations come about as a result of an abstract inner space into which the elements of visual, aural and tactile perception are gathered. In the discourse of computationalism, the "mind" is just such a container, setting itself in contradistinction to both the body—which

is viewed as little more than a transducer of sensory experience—and the outside world. The "bounding surface" of the mind is traversed by sensory stimuli; these stimuli are converted into representations of the world-as-perceived, and these representations—along with the representations of structure that establish their logical connections, a kind of propositional calculus—are stored as the contents of the mind. These contents form the basis from which plans are constructed, "by searching through a space of potential future sequences of events, using one's world models to simulate the consequences of possible actions" (Agre 2002: 132). When those plans result in behavior, the agent reaches the end of the sequence of events that characterizes the "in-out orientation" of the mind, and the relationship between agent and world has, in some way or other, been altered.

The pertinence of the container metaphor to the present study lies in the strong separation it enforces between agent and environment, as well as between mind and body, and also in the sequential model of activity that it presumes. The container metaphor is consistent with the mechanistic explanation. Implicit to the container metaphor is the assumption that cognition is fundamentally distinct from perceiving and acting, and that mind and matter—in the tradition of the Cartesian res cogitans and res extensa—are necessarily separate. It's precisely because of the schism between thinking and acting that activity is sequential—the agent must form an internal representation of the domain and construct a plan before deciding on appropriate action. There is an inevitable delay, then, between decision and action. And over the iterative chain that would characterize extended activity—a chain of actions following decisions following actions—a sequence of such delays punctuates the flow of activity.

The transition from textual to graphical modes of interaction with computers brought with it significant implications not only in terms of how humans and computers interact, but in terms of the accessibility of computing machinery to non-specialists. In order to make computers more accessible, the interaction paradigm would need to be both immediately intuitive to the broadest possible range of human subjects, and applicable across the widest range of known and as-yet-unknown task environments. The great success of the WIMP (window, icon, menu, pointing device) model is due, at least in part, to the way in which it galvanizes the user's knowledge of the world. The graphical interface paradigm is nowadays so pervasive, and so obviously effective, that few people would think to question the kind of user knowledge on which it draws. But on examination, it reveals itself to be an instance of the container metaphor: the workspace is a container for folders, which are in turn containers for files; the user puts things "in" the trash, "opens" a file or program, and so on. This so-called "direct manipulation" style of interaction draws explicitly on the user's capacity to identify symbolic representations of data (files) and processes (programs), and—through actions such as dragging, dropping, clicking, double-clicking, etc.—to accomplish the tasks required by the activity domain. The representational domain is functionally isomorphic with the Cartesian model of mind, and, by virtue of the interface, the user comes to encounter the virtual environment in much the same way as the Cartesian subject encounters the world: through an "in-out orientation" to an environment populated by well-defined objects. This is a point that I will explore more fully in the next section.
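The sequential sense-represent-plan-act cycle that the container metaphor presumes can be caricatured in code. The sketch below is schematic only (the "planner" is a stand-in delay, and all names are my own invention); its point is structural: while the representation is being rebuilt and searched, the agent cannot act, and the world is free to change.

```python
# Schematic sense-represent-plan-act loop: the delay between decision and
# action is built into the sequence itself. Not a real planner.
import time

def sense(world):
    return dict(world)              # stimuli -> representation (a snapshot)

def plan(representation, goal):
    time.sleep(0.01)                # stand-in for searching future sequences
    return f"act_towards_{goal}"

def act(world, action):
    world["last_action"] = action   # behavior finally alters the world

world = {"obstacle_ahead": False}
for _ in range(3):                  # actions follow decisions follow actions
    representation = sense(world)   # the snapshot begins aging immediately
    action = plan(representation, "goal")
    act(world, action)
print(world["last_action"])
```

Each pass through the loop is one of the "discrete chunks" of segmented time: nothing can interrupt the plan step, and the agent only re-senses the world after acting.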

If we consider these encounters with the virtual environment in light of the constitutive role that the interface plays in determining user activity, we can discern that the kind of representations that serve as the access points to the medium, and the world models that those representations embody, constitute an important matter for consideration. When those representations stand in for the states of a task domain, they bring the user, over a history of interactions, to conceive of the task domain, and to act within it, in such a way that the focus is directed at changing or manipulating those states. The objects of the virtual environment provide the locus for interaction, and the user retains the status of detached controller. The model of the world as embodied in the interface will effectively lead to its own realization.

I take no position on the suitability of objectivist forms of representation to everyday or mundane computational tasks. All is (most likely) well for the maintenance of a spreadsheet, or for uploading files to a server: activities in which it would make sense to have an objective cognizance of the contents of the task domain. But these are not ordinarily the types of activities in which the agent's optimal bodily experience, or the sense of flow, have significant bearing on the effective accomplishment of the task at hand. If, however, we are considering the suitability of the computer-as-it-comes as a musical instrument that would allow for embodied modes of interaction, the interface will bring about the "disconnect" between agent and environment that is implicit to the container metaphor. By definition, an enactive model of performance would situate the agent's cognitive activities entirely within her environment. An objectivist model of representational content, which would situate the agent's cognitive activities outside her environment, therefore throws up a not inconsiderable

obstacle to arriving at embodied modes of interaction. But there can be no practicable form of interaction with a computer without an interface, and an interface requires that the computational activity be represented in some form or other, even if the form of representation is the physical embodiment of the computing device itself.8 The crucial point, then, is the form of representation. More specifically, it is the difference between those forms of representations that set out to passively encode the state of the task domain, and those that would seek to structure the agent's active involvement within the task domain. In terms of the magnitude of representational abstraction, tangible interfaces are of a very low order. If, for example, the user of a tangible device manages to put the idea that she is interacting with a computer out of mind, her cognizance of the interface is of the same order of abstraction as the Gibsonian affordance ("this chair affords sitting"). This is a point to which I will return throughout the essay. First, it's worth examining in closer detail the costs to performance of unwittingly adopting the objectivist/computationalist model of representation that is ingrained in the methods of conventional CS and HCI.

8 This is precisely the representational strategy behind tangible computing. For numerous examples of tangible user interface devices see the website of the Tangible Media Group at MIT (http://tangible.media.mit.edu, accessed July 25, 2006).

2.4 Sensing and Acting

A movement is learned when the body has understood it, that is, when it has incorporated it into its 'world,' and to move one's body is to aim at things through it; it is to allow oneself to respond to their call, which is made upon it independently of any representation.
— Maurice Merleau-Ponty, The Phenomenology of Perception

Although the "disconnect" between agent and environment is intrinsic to the container metaphor as applied to the computationalist model of mind, the container metaphor does not in itself account for how we experience or perceive a disconnect in, for example, musical performance with a laptop computer. And despite the ways in which the WIMP interface regulates the activities of the user, and indeed situates her in a specific and highly determined relation to the medium, it's nonetheless entirely possible for that user to become seemingly immersed in the task environment; for the experience to be that of direct manipulation of the interface contents, and for the medium to effectively disappear9 from use.

9 Disappearance is an important concept in Heidegger's philosophy of technology, particularly in Being and Time (Heidegger [1927] 1962). In short, disappearance is an indicator of the moment at which the tool user ceases to experience the tool as separate from her body. A state of immersion in the task for which the tool is required leads to the

tool user experiencing the tool as an extension of her body; in the midst of activity, the tool disappears as an object of consciousness. In itself, this is still a superficial treatment of a very subtle process, and it would seem that there's more to the issue than drawing a tidy distinction between the virtual and the real, or between abstract and direct modes of use.

Immersion in the task environment does not, however, provide a guarantee of an embodied mode of interaction. It's likely that an immersive activity is engaging, and as such, it fulfills one of the five key criteria of embodied activity that I outlined in Chapter 1. It would even seem to follow that an immersive activity is, by definition, a situated activity. On closer inspection, however, this would not seem to be the case in the specific example of interaction with the computer-as-it-comes. The context of embodiment, i.e., the environment within which the agent's sense of embodiment arises, is the real world. Immersion in a virtual environment—e.g., the environment constituted by iconic abstractions of computational data and tasks, as in the WIMP model—involves situating the agent's attention and intentions squarely within that virtual world. As a consequence, the disconnect with the real world is proportional to the amount of attention consumed by the objects that populate the virtual world. The agent is immersed in the activity, but it's that very immersion that determines that the activity is not embodied. Immersive activity involving the computer-as-it-comes is therefore substantively different to immersive activity involving, say, a conventional acoustic musical instrument.

I've already discussed the ways in which the core elements of the WIMP model of interaction—the window, icon, menu and pointing device—play a determining role in the formation of objectivist concepts in the computer user's cognizance of activity. But it's not simply a matter that, because the interactional domain is an instance of the container metaphor, the user will come to think and act in terms of the objects that populate a world exterior to cognitive process; there's also the how of the container metaphor's instantiation within the WIMP model. The key consideration here is the modes for the transmission of signals from computer to user, and from user back to computer, and the sensorimotor habits and patterns that are engendered by those modes of transmission; or, more specifically, it is the sensory and motor mechanisms that are called into use. This is where the issues of timeliness and multimodality—another two of the five key criteria of embodied activity—enter the picture.

There are two key facets to the WIMP model that guarantee that interactions with the computer-as-it-comes can never be multimodal: 1. the weight of emphasis on visual forms of representation consumes a large portion of the user's attention, and in doing so diminishes the potential for involvement of the other sensory and motor modalities; and 2. at any given moment, there is only a single and discrete centre of interaction, i.e., the mouse or text cursor. These are the two aspects of a mode of interaction—the typical mode of interaction with the computer-as-it-comes—that are experienced by the user as an on-going sequence of pointing, clicking, typing and observing. As the gaze is directed towards an icon of interest to the task, the hand works in tandem with the eyes to move the mouse cursor towards that icon. When the cursor and icon converge

on the screen, the fingers click on the mouse button, or press keys on the keyboard, to elicit a response from the on-screen abstraction. The immediate cost of the visuocentric approach to the non-visual sensorimotor modalities is self-evident: the more cognitive resources are allocated to vision, the fewer remain for the agent's other sensors and actuators. But there is another aspect that is perhaps less obvious, and this is where the mode of interaction coincides with the issue of timeliness. The single point of interaction that is characteristic of the WIMP model of interaction leads to a mode of activity that is characterized by a sequential chain of discrete user gestures, where any action can be taken only after the prior action has been completed.10 There is no concurrency of actions, no possibility of operating at two or more interactional nodes simultaneously, and no potential for the cross-coupling of distinct input channels; the flow of time is effectively segmented into discrete chunks. With acoustic instrumental performance, it's not just the concurrent use of multiple sensorimotor modalities that leads to the sense of embodiment; it's the various ways in which these modalities work together and exert influence upon one another, and the way in which the performer, as a function of the ongoing accrual of competence at coordinating the sensorimotor assemblage, is better adapted to meet the real-time constraints of performance.

10 For a comparative analysis of "time-multiplexed" (single point) vs. "space-multiplexed" (multiple, distributed points) interaction scenarios, see Fitzmaurice and Buxton's "An empirical evaluation of graspable user interfaces" (Fitzmaurice and Buxton 1997).
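The single, discrete centre of interaction can likewise be reduced to a schematic: a WIMP-style loop consumes one gesture at a time from a queue, so no two input channels can ever be active, or cross-coupled, at once. Again, this is a toy of my own, not any actual windowing system's event loop:

```python
# Toy single-point event loop: gestures are handled strictly one at a time,
# so there is no concurrency and no cross-coupling of input channels.
from collections import deque

def run(queue):
    handled = []
    while queue:
        gesture = queue.popleft()   # the sole centre of interaction
        handled.append(gesture)     # the next gesture must wait its turn
    return handled

gestures = deque(["point", "click", "type:S", "point", "click"])
print(run(gestures))
```

Compare this with two hands on an acoustic instrument, where no such queue exists: both channels are continuous, simultaneous, and mutually influencing.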

With regard to this notion of timeliness, it may be useful to draw a distinction between planning and agency. The WIMP model of interaction presumes that the user has plans "in mind," and that these plans are to be executed, step-by-step, until the objective of the task-at-hand is met. The system of abstractions and representations that typify the WIMP model are not geared to the demands of real time; each step towards accomplishing the plan will simply take as long as it takes to sense, infer, and act. Agency, on the other hand, is indicative of behavior that is adaptive to environmental demands and constraints, where those constraints encompass the necessity of a timely response. Agency, in this specific sense, might more properly be defined as "embodied agency." But whatever the designation, it points to a mode of performance that is bluntly precluded by the representational infrastructure of the WIMP paradigm. That infrastructure presumes a model of reality in which the contents of the world come prior to our behavioral engagement with the world, building on a model of behavior in which reasoning about representations formed from sense impressions must always take place prior to action; a sequence that the enactive approach would seek to reverse.

It may be interesting to consider if there may be potential misuses of the computer-as-it-comes that could lead to embodied interactional modes. By "misuse," I mean a kind of usage that in one way or another does not correspond to the usage scenarios presumed by the WIMP paradigm. As "laptop music" has already figured in the discussion, let's assume that the computer-as-it-comes is, in this case, an off-the-shelf laptop computer.
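The contrast drawn above, between a plan that will simply take as long as it takes to sense, infer, and act, and an agency bound by the necessity of a timely response, can be caricatured in a few lines of code. This is only an illustrative sketch: the function names and the cost figures are hypothetical, and no real input or scheduling API is assumed.

```python
# A caricature of the two temporal regimes discussed above. All names and
# numbers are hypothetical; the "costs" stand in for the durations of the
# sense, infer, and act phases of each step in a task.

def plan_execution(step_costs):
    """WIMP-style plan execution: each step blocks on a full
    sense -> infer -> act cycle, with no upper bound on its duration."""
    return sum(sense + infer + act for sense, infer, act in step_costs)

def timely_response(step_costs, deadline):
    """A performance-oriented loop: a response is emitted within the
    deadline on every step, however incomplete the inference is at
    that point."""
    return sum(min(sense + infer + act, deadline)
               for sense, infer, act in step_costs)

steps = [(0.02, 0.30, 0.01), (0.02, 0.05, 0.01)]
slow = plan_execution(steps)         # total duration grows with inference cost
fast = timely_response(steps, 0.01)  # total duration bounded by the deadline
```

The point is only the shape of the constraint: in the first regime time is subordinate to the plan, while in the second the deadline is non-negotiable and inference must adapt to it.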

The standard input devices of the generic laptop computer are the keyboard and trackpad.11 According to conventional WIMP practice, inputs at these devices are coordinated by the position of the cursor on the computer screen. As I type these words (at my generic laptop computer), the text cursor blinks at the current text position, indicating the point at which the next character in this sequence of discrete characters is anticipated. When I'm done typing, I'll move my second finger to the trackpad. Upon contact, a mouse cursor will appear on screen. My finger will guide the mouse cursor to a point at the top-left region of the screen, where prior experience tells me I will find the word "File." When the cursor is over that word, I'll use my pointer finger to click the trackpad button. A menu will appear, and as I use my third finger to move the cursor over its contents, each item will be highlighted in turn. My third finger will stop when the menu item "Save" is highlighted, and again I'll use my pointer finger to click the trackpad button. The structure of the interface determines that my motions will follow a type-point-click sequence, and that at each step in that sequence my attention will be directed towards the single point of interaction that the interface affords. As I've argued, this kind of determination on the part of the interface will preclude embodied modes of interaction. But with different mappings from the input devices to programs—e.g., mappings that would subvert the inherent sequentiality of WIMP—the interface acquires new affordances. That is, it solicits new modes of activity from the user.

11 These devices vary from one model of laptop computer to the next, e.g., many laptops substitute a trackpoint for a trackpad, and the number and type of trackpad buttons may also vary. For the purposes of this example, I will assume a trackpad with a single trackpad button.

Suppose that some piece of sound synthesis software is written, and that it's written expressly to be used without graphical or textual feedback from the computer screen. To minimize unnecessary distractions in performance, the computer screen could be entirely dimmed. The cursor, then, is done away with altogether. Interestingly enough, it's in doing away with the cursor that entirely new interactional possibilities for the keyboard and trackpad become apparent. We see that the keyboard does in fact afford multiple points of interaction, and that these points might be engaged concurrently; i.e., the interface affords chording, the formation of composite events from distributed points of interaction, an affordance that the blinking text cursor—along with the accumulated usage history of QWERTY technologies—had somehow hidden from view. We also see that the trackpad affords continuous input with two degrees of freedom, an affordance that was not apparent when trackpad usage was bound up with the task of directing the cursor to discrete points on the computer screen. These misuses of the keyboard and trackpad would seem to circumvent the impediments to embodied activity that characterize the WIMP paradigm: singularity and sequentiality. Could this amount to an interface that affords embodied modes of interaction? The short answer is, I think, perhaps.

Mappings from keyboard events to software could be arbitrarily complex, or as simple as the mapping from piano keyboard to hammer and string (one sound event per key event). Either way, mappings from trackpad input to software could afford the continuous modification of the sound events thus triggered by the keyboard, and it's in the continuity of these modifications that the inherent sequentiality of pointing and clicking would be circumvented. We may, then, have the beginnings of an expressive instrument, even, perhaps, of an embodied performance practice. What's interesting about this example is that we have not changed the physical structure of the interface; we continue to use the same keyboard and trackpad that serve as the input devices in the WIMP model. What we have changed, however, is the potential for interaction that the interface affords, and the example shows that these affordances are immanent to the map from input devices to programs; i.e., in requiring that we substitute a new map for the WIMP map, we construct a new model of performance.

The asymmetry of "handedness" would likely determine that, because of the fine granularity of action required of keyboard input, chording actions would be performed by the dominant hand, while continuous modificatory actions at the trackpad would be performed by the nondominant hand.12 To situate the hands in optimal position, we might turn the base of the laptop at a 30-45° angle to the standard typing position. At the same time, we would almost certainly push the (blank) screen to as flat a position as possible, to put it out of the way of the hands.

12 The role of bimanual asymmetry in interface design is discussed in 4.4.

Of course, these new affordances would need to be learned. And they would need to be learned in spite of the activities the laptop has previously afforded in everyday use. This is not an insurmountable task, especially given that users of general purpose computers are accustomed to learning new patterns of interaction with each new piece of software. But when I suggested that this reconfigured laptop would perhaps afford embodied modes of interaction, I did so out of a hesitation as regards the physical structure of the interface. That is, while the affordances of the interface have been fundamentally altered by new mappings from hardware to software, there nonetheless remains some physical property of the interface that would seem to be opposed to the development of an embodied performance practice. This may be an issue of the limited potential for resistance in the keyboard's pushbutton mechanism, of the limited surface area of the trackpad, of the trackpad's proximity to the keyboard, of the arrangement of keys not being conducive to chording, and so on. Or, it may simply be an issue of the instrument's failure to be properly indicative of use (a topic I will discuss in Chapter 4). Whatever the explanation, and while it's entirely feasible that the performer could develop a timely and multimodal mode of interaction with this new interface, there seems a reasonable possibility that the instrument will not be engaging over a sustained period of practice. And this possibility provides enough incentive to turn attention towards the design of special purpose devices.

I've been concerned in this section with outlining the ways in which the standard interaction model of the computer-as-it-comes precludes embodied activity, and to leave unanswered the question as to whether this general purpose device might, under certain circumstance, afford embodied modes of interaction, to a certain, limited degree. One of the hazards of design is the weight of convention on current practice, a force that often goes entirely unnoticed in design practice. It seems to me that it's this very force—and the widespread failure to notice it—that has led to numerous music softwares that buy unwittingly into the model of interaction that is implicit to the WIMP paradigm. In doing so, these softwares also buy unwittingly into a model of performance that places abstract reasoning prior to action; a model that inevitably leads to a disembodied mode of interaction. An enactive, embodied agent-based model of interaction, then, will need to arrive at an alternative interactional paradigm to that of the computer-as-it-comes. One of the main objectives of this study is to outline a sketch of one such alternative.

2.5 Functional and Realizational Interfaces

Something in the world forces us to think. This something is an object, not of recognition, but of a fundamental encounter.
— Gilles Deleuze, Difference and Repetition

Andrew Feenberg draws a distinction between a "primary" and a "secondary instrumentalization," which respectively consist in "the functional constitution of technical objects and subjects," and "the realization of the constituted objects and subjects in actual technical networks and devices (Feenberg 1999: 202)."13 In terms of the implementation of interfaces, the core difference between the primary and the secondary instrumentalization lies in the way that the task domain is structured. The functional interface (primary instrumentalization) serves a predetermined function; it is structured around a finite set of interactions which are known in advance of the task's execution. The well-designed functional interface conceals the specific mechanics of the task, and presents the user with possibilities for action that draw on familiar and often rehearsed patterns of experience and use. The realizational interface (secondary instrumentalization), on the other hand, brings with it the possibility of continuously realizing new encounters and uses, and, in the process, of redetermining the relationship between technical objects and their human subjects. The realizational domain encompasses the contexts of meaning and signification in which human and medium are embedded, and is conducive to dynamic and indeterminate forms of interaction. In short, realization is a form of play.

13 In Feenberg's scheme, primary and secondary instrumentalization respectively correspond to "essentialist" and "constructivist" orientations of human to medium (Feenberg 1999, 2000).

While Feenberg correlates the secondary instrumentalization with a broadly socialist utopian project, he is nonetheless careful to point out that the primary instrumentalization still has its uses. There are a great many task environments in which it makes sense to facilitate, as transparently as possible, the accomplishment of the task, and in which the representational correspondence of the interface to the world—i.e., the correlation between the system of interface metaphors and the system of real-world objects and operations for which those metaphors stand—should, in the interest of maximizing the potential for continued existence, be static. Landing an airplane, for example, presents a situation in which human agency is best served by an immutable function-relation between the elements of the interface and the range of possible outcomes that the interface represents.
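The shape of the remapping described in the previous section, with keyboard chords forming composite events from distributed points of interaction and the trackpad's two continuous degrees of freedom concurrently modifying the sound events thus triggered, might be sketched as follows. Everything here is hypothetical: both the event layer and the synthesis layer are stubbed out, since no actual input or audio API is assumed.

```python
# Hypothetical sketch of a WIMP-subverting laptop mapping: keyboard chords
# trigger sound events; trackpad position continuously modifies them.
# No real input or synthesis API is assumed; both layers are stubbed.

class ChordInstrument:
    def __init__(self):
        self.held = set()   # keys currently down: distributed points of interaction
        self.active = []    # sound events in progress

    def key_down(self, key):
        self.held.add(key)
        # A composite event: the sounding result depends on the whole chord,
        # not on any single key considered in sequence. The frequency rule
        # here is arbitrary, purely for illustration.
        event = {"chord": frozenset(self.held), "freq": 110.0 * len(self.held)}
        self.active.append(event)
        return event

    def key_up(self, key):
        self.held.discard(key)

    def trackpad_move(self, x, y):
        # Two continuous degrees of freedom (0.0-1.0) modify every active
        # event. Modification happens concurrently with, not after, the
        # triggering gestures, circumventing the point-and-click sequence.
        for event in self.active:
            event["brightness"] = x
            event["amplitude"] = y
```

The point of the sketch is the map itself: sound depends on the whole chord rather than on a sequence of discrete clicks, and trackpad modification is concurrent with, rather than subsequent to, the triggering gestures.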

Efficiency is key to the functionalist approach. In terms of meeting the various constraints and demands of the task environment, it's of no use to have the user waste time on the parsing of a complex metaphorical system, and it's of no use to involve her in forms of play. Functionalism aims to minimize the cognitive load, and the cognitive effort is at its optimal minimum when the representations have a directly recognizable corollary in the user's prior experience of the world. To that end, the well-designed functionalist interface is comprised of representations that are immediately familiar to the user. Indeed, the ideal functionalist interface would have the user convinced that it consists of no representations at all. That is, as the task environment obtains its coherence through the system of representations that comprise the interface, the user comes to conceptualize the task directly in terms of what is represented. At that moment the task is conflated with the metaphorical domain in which it is represented; the representations cease to be denotative, and instead become the intrinsic elements of the task itself, and the interface effectively disappears in use; it becomes equipment.14

This situates the user in an interesting position. She is immersed in what would appear to be the im-mediacy of the task, but the medium is still very much present; it takes on an artificial transparency through its very leveraging of the user's experience, and continues to be constitutive of the structural relation of technical object and human subject. And while the interface is evidently not at all transparent to the task domain, the more it seems to be transparent, the more effectively it corresponds to the ideal of functionalist efficiency. In leveraging the user's experience of the world, the interface directs her towards a set of predetermined expectations as regards performance. It minimizes the cognitive demand and, at the same time, defines an interactional context in which significance—at least ideally—is invariable. In contradistinction to the domain of realization, then, the functionalist domain does not encompass the contexts of meaning and signification in which human and medium are embedded, and is not conducive to dynamic and indeterminate forms of interaction.

14 In Heidegger's terminology, the tool becomes "equipment" at the moment of its disappearance in use.

Functionalism has become a standard metric in the evaluation of the successes and shortcomings of computer interfaces. The idea of leveraging experience in order to minimize the strain that the interface places on the user's cognitive apparatus is a hallmark of "user-centered design" (Norman 1986, 1999, Norman and Draper 1986), and the extent to which the interface disappears from the user's attention constitutes the key criteria for the success of such approaches. The model of computer interface design known as "direct manipulation" (Norman, Holland, and Hutchins 1986)15—the model in which the user drags graphical representations of files into graphical representations of folders, among other things—already has the aim of the usage enterprise built into the blanketing term. That is, things work best when the user believes that, rather than manipulating symbolic abstractions, she is in fact working directly with the objects of the task domain.

15 For an implementation guide to the "direct manipulation" model of computer interface design, see "The Macintosh Human Interface Guidelines" (http://developer.apple.com/documentation/mac/HIGuidelines/HIGuidelines-2.html, accessed March 20, 2006).

It's entirely possible that the functionalist approach is optimally effective across a broad range of routine computational task environments. These are tasks in which the activity is better served by invariable representations, and in which the degree of efficiency with which the task may be accomplished is inversely proportional to the amount of user attention that is consumed by the interface. But there is a danger, with functionalism becoming something of a de facto standard in interaction design, that the functionalist approach is adopted in task environments where it is not well-suited, i.e., in task environments where the task-at-hand is better served by a realizational approach. In much the same way that it makes little sense to employ dynamic and indeterminate forms of interaction when landing an airplane, it makes little sense to do so when balancing a computerized bank account or uploading a file to a server. In thinking about designing interfaces for musical performance, we are dealing with such a task environment.

Where Donald Norman and other key figures in "user-centered design" champion the disappearance of the interface, the realizational approach would suggest that the interface offers some form of resistance to the user, i.e., that it should be irrevocably present. At first glance, this would seem to be at odds with the notion of flow. One of the key aspects of this paradigmatically embodied form of activity is its immediacy, and it would seem self-evident that the more the medium obtrudes in use, the less im-mediate the activity. It's at this point that it's useful to draw a distinction between embodied action and enaction. The sense of im-mediacy experienced when the agent is immersed in the act of hammering—the sense that the hammer is not a distinct object, but an extension of the agent's sensorimotor mechanism—is indicative of that agent's embodiment in action. But once the agent has acquired a sufficient degree of performative competence at hammering, the task ceases to present her with cognitive challenges. Hammering may be immediate and immersive, but it is not necessarily engaging. While the sense of embodiment may be enhanced, or even optimal, when the agent successfully responds to cognitive challenges, such cognitive challenges are not prerequisite to embodied activity. And it is in this that the hammer is not a realizational interface, and hammering is not enactive. To return to Francisco Varela's formulation, enaction involves the "bringing forth of a world." The cognitive dimension is central to the process, and it is precisely where enaction and realization coincide.

This raises an obvious question: if performance with conventional acoustic musical instruments is enactive, exactly how is the potential for realization embedded in the instrumental interface? Or, how is it that, say, a violin is substantively different to a hammer? The short answer is in the way in which the musician's intentionality is coupled to the instrument's specific and immanent kinds of resistance: weight, haptic resistance, and so on. As the musician transmits kinetic energy into the mechanism, the instrument responds with proportionate energy, energy that is experienced by the musician as sound. There is a "push and pull" between musician and instrument. Over a sustained period of time, the musician adapts her bodily dispositions to the ways in which the instrument resists, i.e., to the instrument's dynamical responsiveness. It's important to note that these adaptations, as much as they are determined by the resistance offered by the instrument, are also determined by the musician's intentionality. It's because the musician sets out to realize something—to actively participate in embodied practices of signification—that her adaptation follows a unique trajectory, and the cognitive dimension continues to be central to the process of adaptation.

At one level, it would seem meaningless to talk of functional and realizational interfaces, and instead to view the entire process as a matter of the agent's intentionality; functionalism would correspond to a "functional attitude," and realization would likewise correspond to a "realizational attitude." But this view does not consider the specific dynamic properties of resistance that are embodied in the interface; it presumes a neutrality of the interface to human intentionality, and ignores the constitutive role that the interface plays in the emergence of intentional and behavioral patterns. The hammer, then, like the violin, offers resistance to the agent. An agent could very well set about developing a musical performance practice with a hammer, carefully adapting her bodily dispositions to its dynamic properties of resistance over a period of many years of thoughtful rehearsal. But it's likely that, at some point, she will either abandon the instrument for a medium that offers greater potential for realization, or she will make modifications to the instrument that would better serve that realizational potential. But this still doesn't provide a satisfactory explanation of how the potential for realization is somehow embodied in one interface but not another.
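One way to make the notion of dynamical resistance concrete is a toy simulation: a single damped mass-spring element, driven by a performer's input force, whose response is proportionate to the energy put into it. This is a deliberately crude, hypothetical sketch with arbitrary parameter values; the physical-modeling approaches discussed in this section involve networks of such excitor-resonator elements running at audio rate.

```python
# Toy resonator: a damped mass-spring element driven by an input force.
# A crude, hypothetical stand-in for the excitor-resonator networks used
# in physical-modeling synthesis; all parameter values are arbitrary.

def simulate(forces, mass=1.0, stiffness=100.0, damping=0.5, dt=0.001):
    """Semi-implicit Euler integration. Returns the displacement over time:
    what the performer experiences is a function of how the mechanism
    pushes back against the energy transmitted into it."""
    x, v = 0.0, 0.0
    out = []
    for f in forces:
        a = (f - stiffness * x - damping * v) / mass  # spring and damper resist
        v += a * dt
        x += v * dt
        out.append(x)
    return out

# A brief "pluck": force applied for 10 steps, then released. The element
# continues to move after the input stops.
soft = simulate([0.5] * 10 + [0.0] * 90)
hard = simulate([5.0] * 10 + [0.0] * 90)
```

Because the system is linear, ten times the input force yields ten times the displacement, and the element keeps ringing after the input stops: a minimal analogue of the "push and pull" described above, and of a response proportionate to the performer's energy.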

To return to Feenberg's specification, both technical objects and subjects—i.e., artifacts and humans—are constituted through an ongoing process of mutual specification and determination. The hammer has been constituted to serve a largely predetermined functional agenda: hammering. As such, it is advantageous that the hammer, considered as interface, presents minimal cognitive demands on the agent. Although music has its obvious functional uses in late capitalist society, the model of musical performance that is of specific interest to the present study is realizational. It assumes open-ended, fluid, and at least partly indeterminate processes of signification, and as such requires the ongoing cognitive involvement of the musician. A majority of conventional acoustic musical instruments have been constituted in such a way that the dynamic properties of their resistance are sufficiently complex, and at the same time sufficiently coherent, that they coincide optimally with the musician's intentionality. In requiring that the musician's ongoing cognitive involvement is central to the process of adaptation to the instrument's dynamics, the potential for realization—for embodied forms of signification, and for the "bringing forth of a world"—is effectively maximized.

Approaches to digital musical instrument design that set out to model the dynamics of conventional acoustic instruments by and large circumvent the pitfalls of de facto functionalism. In the simulation of the various networks of excitors and resonators that constitute the physical mechanisms of acoustic instruments, in the integration of force feedback within the controller apparatus, in the carefully considered mapping of the parameters of those synthesis primitives to tactile controllers, and so on, an interface is constituted that comes close to the realizational potential of the real world instrument that it models. But the main focus of this study is to outline a foundation for the design of digital musical instruments that is more general than the physical modeling of existing instruments. And while there is much to be learned through analyzing the dynamical properties of conventional instruments, the basic idea is nonetheless to arrive at a practice that fully engages the new prospects for performance that are indigenous to computing media. As I have attempted to show, the discourse of functionalism is implicit to the discourse of conventional human-computer interaction design, and this can only be an impediment to arriving at technologies that maximize the potential for realization. This calls for an alternative discourse, and alternative approaches to design. This is why I have considered it important to distinguish between functional and realizational modes of interaction. In the specific case of musical performance, this means interfaces that embody the prospect of enaction.

2.6 Conclusion

To reiterate my key criteria from Chapter 1: embodied activity is situated, timely, multimodal, and engaging. This is not just a matter of action, perception, and cognition, but rather a matter of the various and complex dependencies between action, perception, and cognition. Or, in a purely enactivist sense, of the inseparability of action, perception, and cognition. The sense of embodiment over a history of interactions within a phenomenal domain emerges at the point where these various constraints intersect.

What I have set out to show in this chapter is the various ways in which the computer-as-it-comes is a far from ideal medium in terms of meeting the criteria of embodied activity. The objectivist foundations of conventional HCI presume a strong separation between user and device, and situate the user squarely "outside" the interactional domain, where the locus of interaction is almost invariably unimodal. The WIMP model serves to enforce this separation, and at the same time to regulate the actions of the user in such a way that time is discretized into repeating units of sensing and acting. Further, the predominant notion of human-computer interaction design, which would aim to reduce the cognitive load on the agent and make the interface disappear from use, presumes a model of activity that is anything but engaging or challenging to the agent. When all or most of these criteria fail to be met, there is, in my view, no possibility for the kind of interactive and circular processes of emergence that are characteristic of enaction.

To arrive at an enactive model of musical interaction, then, we will need to systematically rethink the world models that are embedded in the interface to the computer-as-it-comes. Overcoming the disconnect that the computer-as-it-comes enforces between human and instrument will require elaborating an alternative world model, a model of embodied cognition, and then looking at the ways in which such a model could be materialized in an instrument that would necessarily be something other than the-computer-as-it-comes. This will be my task for the remainder of the essay.

3 Enaction

The body is our general medium for having a world.
— Maurice Merleau-Ponty, The Phenomenology of Perception

3.1 Two Persistent Dualisms

In Chapter 2, I suggested that it would make little sense, when examining an interactional context with a view to enactive process, to draw hard dividing lines between action and cognition, or between body and mind. I also suggested that it makes little sense to discuss agent and environment in isolation, and instead stressed the inseparability of one from the other. But these hard dividing lines persist in our language, and therefore also in any provisional description of the elements and processes of enaction. But in any attempt to describe such interactions, particularly when attempting to discern the adaptive process that sees a complex set of ever-more refined skills, dispositions and behaviors emerge over a history of interactions, our descriptions inevitably land squarely at the boundary between agent and environment. And so in the same way that we insert a hard dividing line between body and mind, we tacitly delineate a neat separation between body and world. As long as the body is opposed to both mind and world, any notion of "direct experience" will ultimately lead us back to a primary disconnect. Or rather, it will lead us to two disconnects: between mind and body, and between body and world. Disconnection would seem to be the order of the day.

This presents a problem. The specific variety of experience that I've set out to describe—this paradigmatically embodied, immersive and engaged experience—is fundamentally about activities that are always in a state of becoming, and which are therefore not at all easy to define in dualist, abstract and objectivist terms. Enaction involves a temporality in which relations are constantly in flux, and in which new systems and structures continuously emerge and disappear in the midst of interactional unfolding; it involves "the processual transformation of the past into the future through the intermediary of transitional forms that in themselves have no permanent substance (Varela, Thompson, and Rosch 1991: 116)." The directness of experience resides in the "nowness" of the experiential present. On the face of it, it would seem that our language, permeated as it is by the inherent dualisms of Western philosophical and scientific discourse, makes it difficult to describe, much less defend, any notion of "direct experience." But while philosophical language may be geared in such a way that describing experience necessarily involves dualist, abstract and objectivist terms, it does not necessarily follow that "direct experience"—however that may be defined—does not factor among those varieties of human experience for which we may or may not already have an adequate terminology. It is a variety of experience that comes prior to description, and prior to any clear determination of the subject, or of those objects and opportunities for action that make up that (transitional) subject's environment.

In attempting to define "direct experience," then, we encounter a paradox. Direct experience implies a provisional and temporary state of being that is always and necessarily resistant to ontological reduction. I would even go so far as to say that the "nowness" of the lived present is that which makes direct experience, by definition, preontological. But as soon as we attempt to describe the systems and structures of direct experience, we introduce ontological categories. It's in this that we see the intrinsic paradox of the description: there can be no notion of that which is direct without casting experience in abstract terms. This is likely to be the source of some confusion. And given that one of the primary motivations behind the present study is to outline a philosophical foundation for design, it will not help if the key philosophical concepts are poorly defined or potentially misleading. Fortunately, questions such as these are not without precedent; there is a branch of philosophy that has dealt systematically with direct experience, and it has done so within the context of a well-defined dualist discourse. In the transcendental phenomenology of Husserl,1 the existential phenomenology of Heidegger and Merleau-Ponty, and in the latter day
reworking of both European and Buddhist phenomenology2 in enactive cognitive science and so-called postphenomenology,3 the apparent paradox of a dualistic description of unreflective behaviour is dealt with comprehensively. Phenomenology, in its various manifestations, is a vast and complex field, and it's beyond the scope of this essay to cover any of its myriad branches of inquiry in any significant manner. However, there are two key concepts, from two quite different moments in the phenomenological tradition, which are particularly useful to the model of interaction that I am attempting to describe. Double embodiment and structural coupling—both of which terms already point to a fundamental dualism prior to their elaboration—respectively address the mind/body and body/world problems in direct experience. In outlining them here, I hope to clear up any confusion as to how the dualism that resides in any description of embodied action is substantively different from the disembodied dualism that lies at the heart of the computationalist perspective. This should bring us to a point where, after having established a disconnect in our descriptions, we come to see how that disconnect ceases to exist in the flux of embodied action, and in the experiential merging of self and world. I should note that I am not attempting to construct a new theory of the mind/body problem here, or even to weigh into the debate. Rather, the objective is pragmatic: to outline some core theoretical issues with a view to opening up a space for new digital musical instrument design scenarios.

1 Although Husserl does not figure very significantly in this study, I mention him because he is acknowledged as the founding figure of European phenomenology, and had a direct influence on the thinking of both Heidegger and Merleau-Ponty.

2 The philosophy of Nagarjuna, for example, and of the Madhyamika tradition in Buddhist thought, figures significantly in Varela, Rosch and Thompson's outline of "codependent arising," and its implications for subjectivity (Varela, Thompson, and Rosch 1991).

3 Postphenomenology is a term introduced by, and most often associated with, philosopher Don Ihde (Ihde 1983, 1990, 1991, 1993, 2002).

3.2 Double Embodiment

As long as the body is defined in terms of existence in-itself, it functions uniformly like a mechanism, and as long as the mind is defined in terms of pure existence for-itself, it knows only objects arrayed before it. — Maurice Merleau-Ponty, The Phenomenology of Perception

In his analysis of tool use in Being and Time (Heidegger [1927] 1962), Heidegger draws a famous distinction between the ready-to-hand and the present-at-hand. The ready-to-hand indicates an essentially pragmatic relation between user and tool. It is when the tool disappears, i.e., when it has the status of equipment, that the user engages the task environment via the ready-to-hand. The relation, then, is not about a human subject and an “object” of perception. Rather, it is about that object’s “withdrawal” into the experiential unity of the actional context:
The peculiarity of what is proximally ready-to-hand is that, it must, as it were, withdraw in order to be ready-to-hand quite authentically. That with which our everyday dealings proximally dwell is not the tools themselves. On the contrary, that with which we concern ourselves primarily is the work. (Heidegger [1927] 1962: 99)

The ready-to-hand implies an engaged and embodied flow of activity. The human is caught up in what Hubert Dreyfus has called “absorbed coping” (Dreyfus 1993: 27). It’s only when this flow of activity is disturbed by some kind of technological breakdown—i.e., only “if it breaks or slips from grasp or mars the wood,” or “if there is a nail to be driven and the hammer cannot be found” (Winograd and Flores 1986: 36)—that the apparently seamless continuity between user and tool is broken. In the moment of breaking down the tool becomes un-ready-to-hand, or, in Heidegger’s more often used term, present-at-hand:

Anything which is un-ready-to-hand … is disturbing to us, and enables us to see the obstinacy of that with which we must concern ourselves in the first instance before we do anything else. With this obstinacy, the presence-at-hand of the ready-to-hand makes itself known in a new way as the Being of that which lies before us and calls for our attending to it. (Heidegger [1927] 1962: 102)

The hammer appears as an object of consciousness; it acquires “hammerness.” Prior to the technological breakdown, the hammer is invisibly folded into the continuum of direct experience. It has no objectness in itself, but rather disappears into the purposefulness of action. The moment of its acquiring the status of object coincides with a disturbance to the accomplishment of the purpose for which the activity, in the first instance, was undertaken:

When an assignment has been disturbed—when something is unusable for some purpose—then the assignment becomes explicit. (Heidegger [1927] 1962: 105)

Hubert Dreyfus recasts Heidegger’s distinction between the ready-to-hand and the present-at-hand in psychological terms. He suggests that it is only when purposeful activity is disturbed that “a conscious subject with self-referential mental states directed toward determinate objects with properties gradually emerges” (Dreyfus 1991: 71). That is, direct, immediate experience is supplanted by abstract and reflective experience when the tool user is necessitated by a breakdown to perceive the tool in abstract terms, and to reflect on the context in which action and intention is embedded. There is a back-and-forth in experience, then, between direct and abstract modes of engaging the world. That both modes are experienced by the same body points to a fundamental duality of embodied experience, or a double embodiment.4

4 I borrow the term “double embodiment” from Varela, Thompson and Rosch’s The Embodied Mind (Varela, Thompson, and Rosch 1991), who in turn base their coinage on Merleau-Ponty’s notion of embodiment: “We hold with Merleau-Ponty that Western scientific culture requires that we see our bodies both as physical structures and as lived, experiential structures—in short, as both ‘outer’ and ‘inner,’ biological and phenomenological. These two sides of embodiment are obviously not opposed. Instead, we continuously circulate back and forth between them. Merleau-Ponty recognized that we cannot understand this circulation without a detailed investigation of its fundamental axis, namely, the embodiment of knowledge, cognition, and experience.” (Varela, Thompson, and Rosch 1991: xv-xvi)

At first glance, it would seem contradictory to speak of abstract reflection as a subset of embodied experience. Further, abstract reflection would seem to be more or less identical in function to the disembodied reasoning of the computationalist model of cognition that I outlined in Chapter 2. There are two critical points here in arriving at a fairly subtle, and inherently paradoxical, distinction. First, the computationalist model of cognition does not account for unreflective experience. According to the computationalist perspective, all activity is mediated by internal representations of the task domain, and by reasoning about potential courses of action. It does not, then, satisfy the criteria of embodied activity that I laid out in Chapter 1. Second, an enactive model of cognition does not dismiss the reflective state of disembodied reason. Rather, such a state of affairs arises only when the flow of unreflective activity is interrupted. To the extent that abstract reflection forms part of lived experience—at the moment of a technological breakdown, for example—the experience of disembodiment is quite literally embodied by the reflective subject. With double embodiment, then, the body must necessarily “contain” cognition—not by locating cognitive process entirely within the mechanisms of the body as lived, but by encompassing it within the lived experience of the doubly embodied agent at large in the world. This seemingly paradoxical state of affairs is captured in Merleau-Ponty’s concept of the “practical cogito” (Merleau-Ponty [1945] 2004), an idea that, in a single turn of phrase, encompasses both direct action and abstract reflection.

For Merleau-Ponty, as for Heidegger, the phenomenological project is in the first instance concerned with reversing the Cartesian axiom, with the substitution of practical understanding for abstract understanding, and with the placement of an “I can” prior to the “I think” (Merleau-Ponty [1945] 2004: 137). Enaction does admit a mind/body dualism, then: it “encompasses both the body as a lived, experiential structure and the body as the context or milieu of cognitive mechanisms” (Varela, Thompson, and Rosch 1991: xv). But “these two sides of embodiment are obviously not opposed”; instead, “we continuously circulate back and forth between them” (Varela, Thompson, and Rosch 1991: xvi). Indeed, it is through this circulating back and forth, through what Varela et al. have termed “a fundamental circularity” (Varela, Thompson, and Rosch 1991), that perceptual, actional and cognitive skills develop, hand in hand. The crucial factor in addressing the apparent contradiction between direct action and abstract reflection is to situate both within the context of the unfolding of activity and cognitive skill in a temporal context:

There is, indeed, a contradiction, as long as we operate within being, but the contradiction disappears…if we operate in time, and if we manage to understand time as the measure of being. (Merleau-Ponty [1945] 2004: 330)

Embodied being, then, encompasses both reflective and unreflective experience. And in the unfolding of being that conforms to the enactive model of cognition, the moment in which the agent becomes subjectively conscious of her body, and of her body’s objective relations to the objects arrayed before it, is only ever timely and transitory. At the moment that activity resumes, the body recedes into the background, and its objects withdraw into the immediacy of the task.

As I argued in Chapter 2, the computer-as-it-comes precludes embodied forms of activity. It does not allow for a motility that is situated, engaging, and multimodal. Rather, it keeps the user in a state of disconnection from the tool, a disconnect that is reinforced by the symbolic representationalist underpinnings of conventional computer interfaces.5 But that form of direct experience that Heidegger termed the ready-to-hand—a notion that is more or less synonymous with the notion of embodied activity that I outlined in Chapter 1, and is our natural way of galvanizing tools and working within our everyday environments—is missing from the conventional interactional paradigms with the computer-as-it-comes. What I have endeavored to show here is that this disconnect is a factor in experience, and that it should not be discounted. I suggest, rather, that with a view to designing enactive instruments, attention should be directed at maximizing the potential for fully engaged and direct experience.

5 Winograd and Flores present an extensive analysis of the conventional metaphors of computer science in relation to a Heideggerean ontology in Understanding Computers and Cognition (Winograd and Flores 1986). While some authors have suggested that we should explicitly factor the Heideggerean breakdown into our music interface models (Di Scipio 1997, Hamman 1997, 1999), they also place emphasis on non-real-time music production (composition), rather than the processes of real-time music production (performance) with which I am specifically concerned.

As with the hammer, or with any other tool, we can expect that breakdowns will happen in the course of everyday practice. Such breakdowns are essential, for example, to the incremental adaptive process of learning to play a conventional acoustic instrument.6 My focus, then, when turning to issues of design, will not be directed at engineering breakdowns, but rather at engineering the potential for the desired kind of breakdowns. In terms of the technical implementation, the measure will be resistance.

6 Later in the chapter (3.5) I outline this adaptive process in detail with specific reference to the role of breakdowns.

3.3 Structural Coupling

The world is inseparable from the subject, but from a subject which is nothing but a project of the world, and the subject is inseparable from the world, but from a world which the subject itself projects. — Maurice Merleau-Ponty, The Phenomenology of Perception

Although I’ve already suggested that double embodiment and structural coupling address, respectively, the mind/body and body/world dualisms, it would be more accurate to say that both double embodiment and structural coupling address the mind/body/world continuum, with an emphasis on different processes.

The world obviously figures in the double embodiment analysis: it is the context in which action is embedded. In much the same way, the mind figures in structural coupling: it is the locus of cognitive emergence over a history of interactions between body and world. But where the emphasis in double embodiment is on the oscillatory nature of mental engagement in an interactional context, the emphasis in structural coupling is on the circular processes of causation and specification that pertain between the agent and the environment. More specifically, structural coupling draws a dividing line between body and world in description and schematization—i.e., it enforces a separation—in order to demonstrate the inseparability of one from the other in the unfolding of a coextensive interactional milieu.

In early formulations (Maturana and Varela 1980, 1987), the concept of structural coupling was applied to evolutionary biology. It presented an analysis of the interactions between an organism and its environment (where the environment may include other organisms), with a view to their mutual adaptation and coevolution. More specifically, it addressed the circular and reciprocal nature of these interactions. The coupling between organism and environment is “structural” because, as the organism and the environment exchange matter and energy, their respective structures, and hence the structure of their interactions, are changed as a function of the exchange. The process is captured neatly in Maturana and Varela’s definition of an autopoietic machine:

An autopoietic machine is a machine organized (defined as a unity) as a network of processes of production (transformation and destruction) of components that produce the components which: (i) through their interactions and transformations continuously regenerate and realize the network of processes (relations) that produced them; and (ii) constitute it (the machine) as a concrete unity in the space in which they (the components) exist by specifying the topological domain of its realization as such a network. (Maturana and Varela 1980: 78-79)

Over a history of exchanges between organism and environment, there is an increasing regularization of structure—a continuous realization of “the network of processes”—such that both organism and environment are more viably adapted to productive exchange, and such that those exchanges strengthen the conditions for continued interaction.

Structural coupling is a key component of the enactivist model of cognition; it is the very mechanism by which cognitive properties emerge:

Question 1: What is cognition? Answer: Enaction: A history of structural coupling that brings forth a world. (Varela, Thompson, and Rosch 1991: 206)

The world that is brought forth, or enacted, by the agent traverses the divide between agent and environment. In contrast to the computationalist subject—who reasons about an external world in an internal domain of symbolic representation—the enactive subject actively realizes the world through the connection of the nervous system to the sensory and motor surfaces which, in turn, connect the embodied agent to the environment within the course of action. The fully developed notion of structural coupling, then, in Varela, Rosch and Thompson’s formulation, emphasizes the inseparability of agent and environment in embodied cognition, but at the same time locates the points at which agent and environment intersect, and offers an explanation as to how repetitive contacts at these points of intersection can lead to incrementally more complex states of functioning on the part of the cognitive system. In other words, structural coupling implies physical constraints and feedback, and these play a critical role in the emergence of embodied practices and habits.

The contingencies and specificities of the agent’s embodiment form one such constraint, and it is a constraint that is in an ongoing state of transformation as the agent acquires and develops motor skills, or finds herself in new or changing environments with new or changing actional priorities. Physical constraints also exist within the environment, and these forces act upon the agent’s body within the course of activity. There is a certain push and pull of physical forces between agent and environment that constitutes a critical aspect of their structural coupling, and this is where the “hard dividing line” that we may draw between them must necessarily be qualified. The dividing line is rather more pliable. This push and pull between agent and environment has a dynamic contour, a quality that is tidily encapsulated in a schematization by Hillel Chiel and Randall Beer (Figure 3.1).

Figure 3.1. Interactions between the nervous system, the body (sensorimotor surfaces), and the environment (from Chiel and Beer (1997)).

Chiel and Beer’s commentary:

The nervous system (NS) is embedded within a body, which in turn is embedded within the environment. The nervous system, the body, and the environment are each rich, complicated, highly structured dynamical systems, which are coupled to one another, and adaptive behavior emerges from the interactions of all three systems.

In Chiel and Beer’s diagram, the dividing lines between body and environment, and between nervous system and body, are clearly distinct, but they are not rigid. The “body” consists of sensory inputs and motor outputs, and contains the nervous system, which is connected to the sensorimotor surface through the same dynamical “push-pull” patterns that connect the body to the environment. The push and pull between each of the components in the interactional domain is indicated by projecting triangular regions. It’s clear that a “push” on one side of the body-environment divide results in a proportionate “pull” on the other, and vice versa. There is, then, a fluid complementarity between environment, body, and nervous system, of which Chiel and Beer’s diagram provides an instantaneous snapshot.

To capture the properly dynamical nature of this complementarity, the diagram would need to be animated. We would then see the projecting triangular regions extend and contract in regular (though not necessarily periodic) oscillatory patterns, and these motions would provide a view of the continuous balancing of energies between agent and environment as the play of physically constrained action unfolds over time. What we see is a transfer function—a map—from agent to environment and back again. Beer has suggested that when embodied agent and environment are coupled through interaction, they form a nonautonomous dynamical system (Beer 1996, 1997) that, from one interaction to the next, may exhibit linear, nonlinear, or even random behavior. These kinds of exchanges may be more or less stable in terms of the impact of environmental dynamics on agent dynamics, and vice versa. And they may demand more or less of the agent’s cognitive resources, depending on the potential complexity of balancing the intentionality of the agent with the environmental contingencies. Although it doesn’t form an explicit part of Varela and Maturana’s original formulation, the dynamical systems approach provides a potentially useful way of both understanding and schematizing structural coupling. It’s a perspective that has also been adopted by a handful of cognitive scientists as an explanatory mechanism for the emergence of cognitive structures through interactional dynamics (Hutchins 1995, Thelen 1994). I will return to this point in my outline of implementational models in Chapter 4.
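Beer's suggestion that coupled agent and environment form a single nonautonomous dynamical system can be illustrated with a minimal sketch. The following code is my own illustration, not Chiel and Beer's model: the equations, coefficients, and variable names are invented for the example. All that "coupling" means here is that each system's rate of change depends on the other's state.

```python
import math

def step(agent, env, dt=0.01):
    # The agent relaxes toward what it senses of the environment ("pull"),
    # while its activity simultaneously perturbs the environment ("push").
    d_agent = -agent + math.tanh(2.0 * env)  # sensing: env state drives agent
    d_env = -0.5 * env + 0.8 * agent         # acting: agent state drives env
    return agent + dt * d_agent, env + dt * d_env

agent, env = 0.0, 1.0
for _ in range(5000):            # a history of interaction, one step at a time
    agent, env = step(agent, env)

print(round(agent, 3), round(env, 3))  # both settle into a mutually specified state
```

Deleting either coupling term changes the behavior of both trajectories, which is the point of the schematization: neither system's trajectory is a self-contained property of an autonomous system.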

There are two fundamental and seemingly contradictory points to viewing interactions between an embodied agent and its environment as a process of structural coupling: 1. to emphasize the inseparability of agent and environment, and 2. to locate the points at which agent and environment intersect, i.e., their bounding surfaces. The danger with the analytic part of this formulation is that, as soon as we’ve drawn the dividing line between agent and environment, it’s rather easy to view them in isolation—on one side, the agent; on the other, the world—and to understand their respective behaviors as self-contained properties of autonomous systems. This lands us, more or less, back within the computationalist model of rationally guided action. We will see a disconnect in schematizations of both the computationalist and the enactive models of action; it’s precisely the point at which the mechanics of the agent-environment connection need to be described. But what distinguishes the enactive model from the computationalist model is the formation of a larger unity between agent and world through dynamical processes of embodied interaction and adaptation. These processes are characterized by crossings of the divide, by the “push and pull” between coupled physical systems, and by a form of experience that, rather than being lived through a world of abstract inner contemplation, is lived directly at the points where the sensorimotor system coincides with the environment in which it is embedded. Although we can delineate the boundary between agent and environment in an abstract diagram of their interactional milieu, such a diagram will not capture the experiential aspect of embodied interaction. The agent does not feel herself to be separate from the world in which she is acting but, rather, is intimately folded into its dynamics and processes. The body is not “as it in fact is,” as a thing in objective space, but rather constitutes “a system of possible actions, a virtual body with its phenomenal ‘place’ defined by its task and situation” (Merleau-Ponty [1945] 2004: 250).

The “bringing forth of a world”—that is, of an organismic continuity between agent and environment—amounts to the moment at which the original severance, or disconnect, ceases to factor in the agent’s experience. As Thelen and Smith point out (Thelen 1994), the emergence of cognitive, perceptual and actional abilities constitutes the teleological dimension of structural coupling. I would argue that, when the five criteria of embodied activity (Chapter 1) are met, a structurally coupled system is inevitably formed.7 To this extent (and in keeping with Varela, Thompson and Rosch’s formulation), structural coupling implies enaction. Structural coupling between performer and instrument will therefore, as a matter of definition, be key to the model of enactive musical performance that I am proposing, and an essential criterion in design.

7 To be more precise, the first four criteria of embodied activity would form a structurally coupled system, and the fifth—embodiment is an emergent phenomenon—would come for free.

3.4 Towards an Enactive Model of Interaction

The key theoretical components of the essay have now been presented. The leap from theory to implementation is almost always a shaky endeavor, and the models that I present here may serve as a provisional and necessarily speculative bridging of the gap between theory and praxis. But before turning to issues of the design and implementation of enactive digital musical instruments, it may prove useful to outline the various models of interaction that I’ve discussed to this point in the form of diagrams, with a view to distinguishing their various implications for the development of human cognition and action. The underlying rationale, then, is to arrive at a candidate model of enactive interaction, with the intention of holding this model in view when shifting the focus to implementation. To that end, the diagrams focus specifically on human-computer interaction, but remain both general and nonspecific in terms of hardware and software implementation details (i.e., the interface). It’s the dynamics of the various models of interaction between human and computer that form the key concern.

There is a basic model of human-computer interaction (figure 3.2) that can be taken to hold for all subsequent models. The human performs actions at the inputs to the computer which cause changes to the state of the computer’s programs. In turn, the computer transmits output signals representing the state of its programs which are perceived by the human.

Figure 3.2. The basic model of the human-computer interaction loop.

S represents the map from the state of the computer’s output devices to the human’s sensory inputs, and M represents the map from the human’s motor activities to the state of the computer’s input devices. Together, the input and output devices constitute the interface to the programs running on the computer. The basic model is, however, incomplete. The human perceives and acts, and therefore demonstrates intentionality. But there is nothing to link perception to action. That is, although a cognitive dimension is implied, the model does not account for it. In fact, the usefulness of the model lies solely in specifying the basic mechanics of human-computer interaction, and as these mechanics can be assumed to be unchanging for all subsequent models,8 it can also be assumed that the subsequent models will be distinguished solely by cognitive considerations. For present purposes, this means the map between perception and action.

8 To say that the basic mechanics is unchanging is not to say that the interfaces will be identical. The basic mechanics can be taken to mean the maps from output to perception, and from action to input. Different interfaces will result in different map dynamics, and these dynamics will in turn carry different sets of implications for cognition, perception and action.

To make the step from the basic model to the conventional model of human-computer interaction, we need only insert human reasoning between perceiving and acting (figure 3.3).

Figure 3.3. The basic model extended to include the model of human activity in conventional HCI. Human actions follow after inner reasoning about sensory inputs, resulting in a sequential chain of actions, and a segmentation of the flow of time (see Chapter 2).
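The mechanics of the basic and conventional models (figures 3.2 and 3.3) can be restated as a short program. This is a sketch of my own, not the author's: S and M are trivial identity maps here, and the "program" and "reason" functions are invented stand-ins. What matters is the structure: an explicit reasoning stage sits between perceiving and acting, serializing the loop into discrete perceive/reason/act cycles.

```python
def S(output_state):         # map: computer output devices -> human sensation
    return output_state

def M(gesture):              # map: human motor activity -> computer inputs
    return gesture

def program(state, inp):     # the computer's program: a simple state update
    return state + inp

def reason(sensed, goal):    # the inserted reasoning stage of figure 3.3
    return max(-3, min(3, goal - sensed))  # a deliberate, bounded correction

state, goal = 0, 10
for _ in range(4):           # each pass is one perceive -> reason -> act cycle
    sensed = S(state)                    # perception
    gesture = reason(sensed, goal)       # reasoning (the source of time delay)
    state = program(state, M(gesture))   # action at the computer's inputs
print(state)  # -> 10
```

Note that the goal is reached only across a chain of discrete cycles, each of which must wait on the reasoning step; that delay is exactly what the enactive model later collapses.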

” 72 . input devices are ordinarily monomodal and geared to a single focal point of motor activity (from one moment to the next. and it’s interesting to note the upside-down symmetry on either side of the human/computer divide.” “Programs.” respectively as “Input. the duration of which is simply as long as it takes to perform the necessary mental computations. and output devices are ordinarily visuocentric and geared to a single focal display point (the cursor).” and “Action.” “Reasoning. we have what I have termed “the computer-as-it-comes. There is therefore an inevitable time delay between perception and action.segmentation of the flow of time (see Chapter 2. This model is paradigmatic of what I have termed “the computer-as-it-comes. it can be assumed that the input and output devices of conventional HCI serve to reinforce the computationalist ontology from which conventional HCI derives.4).” We now have a schematization of the Cartesian subject in the midst of interaction. 9 In the spirit of mechanistic philosophy.9 The conventional model presumes that the human reasons about her interactions with the computer in an inner world of mental abstraction. To this extent. Although they are not detailed in figure 3.” It can effectively be guaranteed that interactions with the computer-as-it-comes will be disembodied. either the mouse or the keyboard). When these factors combine in the form of a device.” and “Output. we could even relabel “Perception.3. at least according to the minimal criteria I set down for embodied activity in Chapter 1.

In Chapter 2, I drew a distinction between functional and realizational interfaces. The distinction rests on the manner in which the interface elicits particular varieties of action and thought from the human user. While the terminology places explicit emphasis on the interface and how it is constituted, the immediate concern lies with the implications of the interface for the emergence of cognitive, perceptual and actional patterns.

In schematizing the respective interactional paradigms of the functional and realizational interfaces, I have added a further cognitive dimension to the human side of the computer-as-it-comes model, while the computer side has remained unchanged. In the diagram of the functional model of interaction (figure 3.4), the added dimension is labelled “Knowledge.” This knowledge can be considered offline with regard to activity. That is, it’s an abstract quantity that exists prior to interactions with the computer, and while it directly informs the ways in which the human subject perceives and reasons, it’s “accessed,” rather than “constituted,” within the course of action. It can also be assumed that there are no real-time constraints on the accessing of this knowledge, and that this aspect reinforces the sense, in user experience, that the knowledge being galvanized is offline.

Figure 3.4. The human-computer interaction loop with the functional interface (see 2.5). The human’s knowledge is leveraged by the abstractions that comprise the computer’s interface, and this knowledge is galvanized to guide perception and reasoning, leading to appropriate action.

I noted in Chapter 2 that functionalism is something of a standard in conventional interaction design. The functional interface is deterministic, i.e., the goal of the task-at-hand is known in advance, and the interface is designed to lead to the accomplishment of this goal while placing minimal cognitive demands on the human. Through leveraging existing user knowledge, the task domain and its end goals are made as transparent as possible, thereby minimizing the cognitive load. While the approach has a great many advantages for routine activities with computers, it is not advantageous to activities that are dynamic or nondeterministic by nature.

In figure 3.5, “Knowledge” is relabelled “Realization,” and the links between “Realization” and “Perception,” and “Realization” and “Reasoning,” are now bidirectional. Because the term “knowledge” implies a fixed state of knowing, it is substituted in the diagram by the more dynamic and fluid “realization.” The key difference between the realizational and the functional interface lies in the cognitive demands they place on the human. Whereas human knowledge can be considered static in functional interactions, it is dynamic in realizational interactions. The realizational interface is nondeterministic, i.e., it brings with it a continuing potential for new encounters and uses, and human knowledge continues to expand over a history of interactions.

Figure 3.5. The human-computer interaction loop with the realizational interface (see 2.5).

The term “knowledge” implies a static corpus of known facts, and it’s precisely this corpus of “knowns” on which the functional interface draws. The realizational interface, on the other hand, offers resistance to the user, deliberately prompting her to new modes of thinking about the task domain, and thus opens the possibility for the on-going generation of new meanings and modes of thought. Hence the substitution of the more dynamic and fluid term “realization.” By introducing resistance to the interface—a resistance that requires the human to fully engage in the activity—the shift is effected from a static and deterministic model of activity to one that is dynamic and nondeterministic. In figure 3.5, then, an important step has been taken towards the enactive model. Nonetheless, a reasoning stage still intervenes between the perceiving and acting stages. While realization is offline to the activity, it still requires that the human commit continuous and significant cognitive resources to the task. According to the criteria of embodied activity, then, the model represents a disembodied mode of interaction.

I have defined embodied activity as a state of being that consists in a merging of action and awareness, experienced as flow. That is to say, there is a seamless continuity between perceiving and acting. In figure 3.6, the boundaries between perception, reasoning and action are collapsed, and the continuity between perceiving and acting is indicated by the label “Perceptually Guided Action.”

Figure 3.6. Embodied Interaction. The perceiving/reasoning/acting sequence has been collapsed into a fully integrated model of activity, labelled here as “Perceptually Guided Action.” This corresponds to the flow of embodied activity (see 1.2), and to Heidegger’s ready-to-hand (see 3.2).

In this model there is a merging of action and awareness, and the sense of disconnect between human and computer ceases to factor in experience. Perception and action constitute a unity. The model of activity corresponds to Heidegger’s ready-to-hand, or, in Hubert Dreyfus’ paraphrase, “absorbed coping,” or, in the rubric that I’ve used throughout the essay, embodied action. This is the first of the schematizations in which the human is represented as a unity, and it can be assumed that the experience of “oneness” involves the loss of any sense of disconnect with the computer. I’ve argued that such a mode of activity is precluded by the computer-as-it-comes, and that this has proven a major stumbling block in arriving at designs for digital musical instruments that allow for embodied modes of interaction.

the standard model of human-computer interaction (figure 3.2), there is no explicit focus on conscious mechanisms. Indeed, the distinguishing aspect of the ready-to-hand is that it is an unconscious, unreflective mode of behavior.

In Chapter 2, I suggested that what distinguishes embodied action from enaction is the realizational dimension. That is, while the sense of embodiment may be optimal when cognitive challenges are placed upon the human agent, such challenges are not prerequisite to embodiment. Cognitive realization is, however, prerequisite to enaction. To make the step from embodied action to enaction, then, “Realization” is connected to “Perceptually Guided Action” through a bidirectional path (figure 3.7). Human and computer are structurally coupled systems (see 3.3).

[Figure 3.7: Enaction — the human (S, M) and the computer (input, programs, output) are joined by “Perceptually Guided Action,” with “Realization” connected to the loop through a bidirectional path.]

Enaction implies an embodied model of interaction with

a view to cognitive and actional realization. There’s a symmetry between the enactive model and that of the realizational interface (figure 3.5): both include a realizational dimension that is tied, through reciprocal patterns of determination, to perception and action. And in both instances, realization is tightly correlated to the resistance that the interface offers to the human user, and to the cognitive challenges this resistance presents. But where the realizational interface solicits a mode of activity that is disembodied and offline, the enactive interface solicits time-constrained improvised responses that are embodied and online. Where the realizational interface is concerned with engineering a representational breakdown—i.e. deliberately causing a reappraisal of the representations that comprise the interface, an activity that necessarily involves reasoning, and is therefore disembodied and offline—the enactive interface is concerned with soliciting new responses without recourse to inner representations.10 In the enactivist view, cognition is an embodied phenomenon. It arises through physical interactions, and in turn shapes the trajectory of future interactions. That is, the interface is encountered directly rather than abstractly. Another way to view this is as the difference between, in Elizabeth Preston’s terminology, “representational and non-representational intentionality” (Preston 1988).

10 There are continuing disagreements among cognitive scientists and philosophers of mind as to whether “inner representations” play a part in direct experience. Although I take no position in the debate, for the purposes of the present study I assume that inner representations play no part in direct experience, as this makes it easier to distinguish between direct and abstract experience.

If we were to stick with the idea that humans are

storing the contents of their environment as inner representations at all times, then we could potentially draw the distinction between abstract and direct experience in terms of objective and deictic intentionality. Deictic representations were discussed in Chapter 2, but I will reiterate here. According to Philip Agre, “a deictic ontology … can be defined only in indexical and functional terms,” that is, in relations to an agent’s spatial location, social position, or current and typical goals or projects (Agre 1997: 242). With deictic intentionality, then, we do not relate to an object in terms of its objectness, but in terms of the role it plays in our activities. And it is because the object is so directly folded into the actional midst that we encounter it directly rather than abstractly.

In the enactive model, human activity is embodied and online, in real time and real space, stemming from forces that are directly registered through the body, and at the same time determining the emergent contour of the body’s unfolding patterns and trajectories. Realization is an incremental process of cognitive regularization and awareness.

The enactive model of interaction represents the ideal performative outcome of the class of digital musical instruments that I am setting out to define and describe in this study. Before turning to design, however, it’s important to note that while the enactive model of interaction represents an idealized “way of being” in the performative moment, it does not represent the sum total of the performance practice. That performance practice, in keeping with Merleau-Ponty’s theory of “double embodiment,” would also, in addition to the enactive model of interaction (figure 3.7), at various moments involve embodied action

(figure 3.6), and offline realization (figure 3.5). Each of these modalities would constitute different ways of engaging the same instrument, and the human performer would routinely cross the lines that distinguish one modality from the next. There will be “breakdowns,” particularly in the learning stage, which shift awareness to the “objectness” of the instrument. Additionally, it cannot be assumed that the instrument will provide endless novelty to the performer, particularly as, over the course of practice, she becomes more finely adapted to the instrumental dynamics; it’s likely that the act of playing proceeds without a great deal of reflective thought.

In everyday embodied practices, it’s not unusual for these experiential modalities to be engaged simultaneously. For example, a violinist breaks a string in the middle of performance, drawing the focus of her attention to the objectness of the instrument. The instrument will become present-at-hand; it will be encountered through a representational intentionality. At the same time, however, she continues playing on the remaining three strings. With the greater portion of available cognitive resources allocated to the instrumental breakdown, the continued playing becomes ready-to-hand. At such moments—again, to borrow terminology from Heidegger—the instrument effectively disappears from use in the midst of embodied activity. We see then a coincidence of the present-at-hand and the ready-to-hand, as the intentionality of the performer is divided across different components of the same instrument. That the same human is able to divide the instantaneous allocation of cognitive resources into representational and nonrepresentational subcomponents—i.e., multi-tasking—is nothing extraordinary for a practiced, doubly embodied performer. It is also something that happens as a matter of course in the

development of any form of embodied practice. In the particular case of what I have termed enactive digital musical instruments, it can be assumed that if the instrumental implementation engenders suitable conditions for the enactive model of interaction, the other modalities—embodied action and offline realization—will invariably follow, and therefore need not factor in design. The practical implication for instrument design, then, is that the enactive model is the only one that need be kept in view.

We can condense the enactive, embodied action, and offline realization models into a single integrated model, which I have termed “enactive performance practice” (figure 3.8). The model encompasses the interdependencies between perception, action and cognitive unfolding within the circumscribed interactional domain of instrumental practice. In using the umbrella term “intentionality,” it encompasses both representational and nonrepresentational intentional modes.

3.5 The Discontinuous Unfolding of Skill Acquisition

In Merleau-Ponty’s phenomenology, human intentionality, in the broadest sense of the term, is fundamentally concerned with the body’s manner of relating to objects in the course of purposive activity.

[Figure 3.8: Enactive performance practice — the human body and the instrument are joined in a circular loop, with COGNITION unfolding over Time; I and R label the two directions of the loop.]

Figure 3.8. Enactive performance practice. Human body and instrument are unities, and cognitive abilities emerge over time through the continuous and embodied circular interactions between them. I represents the map from human intentionality to the instrument, while R represents the map from the instrument’s reactions back to the human.

The human acts purposefully through her body, exemplifying an intentionality. Her bodily actions are transduced by the instrument and lead to a reaction. The instrumental reactions are perceived by the human, and these perceptions, as they are registered in the body, modulate her intentionality, and thus her ongoing reactions and bodily dispositions. As these cognitive abilities develop, there is an incremental regularization of the performative patterns of the body, and of the dynamics of the body-instrument interactions. While the enactive performance practice model is too general to be useful in design, it does serve to encapsulate all the key facets of the interaction paradigm I’ve set out to describe. The process could be schematized as a bidirectional exchange, but we get closer to
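The circular character of this loop can be illustrated with a minimal simulation. The sketch below is an interpretive analogy rather than anything proposed in the study itself: the instrument’s fixed dynamics, the adaptation rate, and the error-driven update rule are all assumptions, standing in loosely for the maps I and R and the incremental regularization described above.

```python
import random

def simulate_practice(steps=2000, adapt_rate=0.05, seed=1):
    """Toy sketch of the circular human-instrument loop (cf. figure 3.8).

    I: an intended action is sent to the instrument.
    R: the instrument's reaction is perceived and fed back, nudging the
    performer's internal model -- an incremental 'regularization'.
    All parameters here are illustrative assumptions.
    """
    rng = random.Random(seed)
    instrument_response = 0.7   # assumed fixed dynamics of the instrument
    internal_model = 0.0        # the performer's evolving expectation of the reaction
    errors = []
    for _ in range(steps):
        intention = rng.uniform(-1.0, 1.0)             # purposive bodily action
        reaction = instrument_response * intention     # instrument transduces the action
        predicted = internal_model * intention         # what the body 'expects'
        error = reaction - predicted                   # perceived discrepancy, fed back
        internal_model += adapt_rate * error * intention  # incremental adaptation
        errors.append(abs(error))
    return errors

errors = simulate_practice()
print(errors[0], errors[-1])  # the discrepancy shrinks over sustained 'practice'
```

Run over many iterations, the perceived discrepancy between intended and actual reaction shrinks: a crude stand-in for the regularization of the body’s performative patterns, and for nothing more.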

the flux of the performance experience if the interactions are viewed as circular and continuous. Although cognition and the body are indicated as distinct entities in figure 3.8, this is solely for the purposes of clarity. It should be kept in mind that cognition is an embodied phenomenon, realized at the connections between the nervous system, the sensorimotor surfaces, and the environment.11 The cognitive dimension is not independent of these interactions, but rather is folded into them through realization. Over time, cognitive abilities continue to develop, as the body continues to adapt to the dynamics of the interactional domain.

11 In this essay, the environment can be taken to comprise the instrument, and an idealized physical space in which the instrument’s outputs might be optimally perceived by the human performer. In real practice, the environment may include any manner of physical spaces, humans, other animals, etc. While such features of the environment will inevitably play a part in the emergence and formation of performer intentionality, it’s beyond the scope of this study to factor them into consideration.

Enactive performance practice as I’ve outlined it here is consistent with Merleau-Ponty’s notion of the intentional arc (see Chapter 2). The “arc” metaphor is interesting, as it implies a continuity in the acquisition of perceptual, actional and cognitive skills, a continuity that is also implied in the unbroken trajectory of cognitive unfolding in figure 3.8. As long as enactive performance practice—and also the intentional arc—can be said to encompass representational and nonrepresentational modes, this doesn’t present a problem. At the same time, however, the model does not accurately reflect the ways in which the modes of bodily relation to an instrument are transformed over the course of cognitive

unfolding. That is, it does not account for the intrinsically discontinuous back-and-forth between the present-at-hand and the ready-to-hand that characterizes the acquisition of skill. This is an especially important point when considering the acquisition of realizational skills, such as learning to play a musical instrument. Before moving on to issues of implementation, then, it’s worth considering the ways in which human bodily ways of being are transformed within the process of acquiring a specific skill. I’ll do this by drawing out some correspondences between two texts: Hubert Dreyfus’ “The Current Relevance of Merleau-Ponty’s Phenomenology of Embodiment” (Dreyfus 1996), and David Sudnow’s Ways of the Hand (Sudnow 2001).

Dreyfus sets out in his article to “lay out more fully than Merleau-Ponty does, how our relation to the world is transformed as we acquire a skill (Dreyfus 1996:6).”12

12 The numbering system in citations of Dreyfus’ article refers to the paragraph numbers of the online text.

He does this by dividing the temporal unfolding of skill acquisition into five distinct stages—“Novice,” “Advanced beginner,” “Competence,” “Proficient,” and “Expertise”—where each stage is characterized by specific bodily ways of relating to the task environment in question. Dreyfus assumes “the case of an adult acquiring a skill by instruction (Dreyfus 1996:6),” and illustrates his argument with two examples: learning to drive a car, and learning to play chess. In the discussion that follows, I will borrow from Dreyfus’ decomposition of the intentional arc into five distinct stages, but will illustrate the argument with an

example that is more immediately pertinent to the present study: learning to improvise with a musical instrument. Sudnow’s Ways of the Hand—a detailed first person “production account” of the gradual acquisition of skill as a jazz pianist—is in this regard an ideal candidate.

Dreyfus’ “Novice” stage begins with the reduction of the task environment into explicit representations of the elements of which the environment is composed:

Normally, the instruction process begins with the instructor decomposing the task environment into context-free features which the beginner can recognize without benefit of experience in the task domain. The beginner is then given rules for determining actions on the basis of these features, like a computer following a program. (Dreyfus 1996:7)

That the features of the environment are “context-free” implies that the focus of activity is directed towards connecting the body to the instrument—i.e., establishing a “grip”—in the proper place and with the proper alignments, but without any explicit regard as to how these alignments will eventually fold into the context of embodied, time-constrained performance. For Sudnow, the features of the task environment were chords, and the proper alignments were the voicing of those chords:

In early lessons with my new teacher the topic was chord construction, playing a chord’s tones in nicely distributed ways, or voicing. (Sudnow 2001:12)

The proper “place” of the chords was determined by the specific configuration of piano keys that the hand would need to engage. It’s interesting to note the

“substantial initial awkwardness” that Sudnow describes in the complex of lookings and graspings that characterize this stage:

I would find a particular chord, then look back to the keyboard—only to find the visual and manual hold hadn’t yet been well established. I had to take up the chord again in terms of its constitution, build it up from the scratch of its broken parts, find the individual notes again, groping to put each finger into a good spot, arranging the individual fingers a bit to find a way for the hand to feel comfortable. And, having gained a hold on the chord, getting a good grasp, I’d let it go. (Sudnow 2001:12)

The mode of engagement here is clearly that of the present-at-hand. The chord, then—the initial “context-free” feature of the environment—is itself decomposed into individual features. Each note of the chord is mentally associated with an individual finger before the hand gains a hold on the chord as a whole. And this decomposition demands an on-going coordination between an abstract mental image of the task at hand and the accomplishment of the task. As Sudnow notes, “lots of searching and looking are first required (Sudnow 2001:12).”

In Dreyfus’ taxonomy, the “Advanced beginner” stage is characterized by the emergence of a degree of contextual recognition:

As the novice gains experience actually coping with real situations, he begins to note, or an instructor points out, perspicuous examples of meaningful additional aspects of the situation. After seeing a sufficient number of examples, the student learns to recognize them. Instructional maxims now can refer to these new situational aspects, recognized on the basis of experience, as well as to the

objectively defined non-situational features recognizable by the novice. (Dreyfus 1996:10)

The “situational aspects” here point to an initial emergence of gestalts, i.e. of the tendency to regard coordinated actions—such as the playing of a chord—not as the combined motions of individual fingers, but as a single, integrated motion of the hands:

As my hands began to form constellations, the scope of my looking correspondingly grasped the chord as a whole, seeing not its note-for-noteness but its configuration against the broader visual field of the terrain. (Sudnow 2001:13)

It’s important to note, however, that such gestalts remain limited to isolated and non-time-pressured events. The perceptual recognition of places and alignments is beginning to occur at a higher level of scale, but this recognition is neither situated (in the sense that one place and alignment might lead to a next place and alignment, or that it might be solicited by some other pressing constraint in the environment, or both) nor timely (in the sense that the transition from one place and alignment to a next must satisfy timing constraints in the broader context of a performance). The context that the performer is beginning to glimpse, then, remains offline.

It is at the next stage of skill acquisition that such factors enter the equation. Dreyfus’ designation for the third stage of skill acquisition—“Competence”—is potentially misleading. It would perhaps be more accurate to say that competence emerges towards the end of the third stage, where the stage as a whole is characterized by a gradually increasing capacity for dealing with the

online aspects of performance. The beginning of the third stage is marked, however, by anything but a sense of performative competence. Rather, the disparity between the level of skill accomplished thus far and a newly gained understanding of the larger context of performance—i.e. its online aspects—leads to a sense of frustration:

With more experience, the number of potentially relevant elements of a real-world situation that the learner is able to recognize becomes overwhelming. At this point, since a sense of what is important in any particular situation is missing, performance becomes nerve-wracking and exhausting, and the student might wonder how anybody ever masters the skill. (Dreyfus 1996:13)

Interestingly enough, Sudnow’s first public performance took place at precisely this stage in his development. This frustration is borne specifically of the body’s inability to adequately respond to the seemingly overwhelming online demands of performance. It’s worth quoting his account in full:

The music wasn’t mine, for I was up there trying to do this jazz I’d practiced nearly all day. It was going on all around me. I was in the midst of a music the way a lost newcomer finds himself suddenly in the midst of a Mexico City traffic circle, with no humor in the situation: there were friends I’d invited to join me, and the musicians I’d begun to know. Between the chord-changing beat of my left hand at more or less regular intervals according to the chart, the melodic movements of the right, and the rather more smoothly managed and securely pulsing background of the bass player and drummer, there obtained the most alienative relations. I was on a bucking bronco of my own body’s doings, situated in the midst of these surrounding affairs. (Sudnow 2001:33)

The gap between motor intentionality and motor ability led to a music that “was literally out of hand (Sudnow 2001:35).” It also led to Sudnow shying away from further public performances for a period of several years.

Dreyfus notes that the performer normally responds to the newly discovered enormity of the task at hand by adopting a “hierarchical perspective,” and by deciding upon a route that “determines which elements of the situation are to be treated as important and which ones can be ignored (Dreyfus 1996:14).” In short, the task is again reduced to individual components. But unlike the concrete components of activity that constitute the “context-free” features of the “Novice” stage, the components of the “Competence” stage are rather more context-bound:

The competent performer thus seeks new rules and reasoning procedures to decide upon a plan or perspective. But these rules are not as easily come by as the rules given beginners in texts and lectures. The problem is that there are a vast number of different situations that the learner may encounter, many differing from each other in subtle, nuanced, ways. There are, in fact, more situations than can be named or precisely defined so no one can prepare for the learner a list of what to do in each possible situation. Competent performers, therefore, have to decide for themselves what plan to choose without being sure that it will be appropriate in the particular situation. (Dreyfus 1996:15)

For Sudnow, the plan was to work towards a “melodic intentionality” by extending in practice his acquired embodied knowledge of isolated chords to patterned sequences of chords, as well as sequences comprised of the individual

notes that those chords contain. Not coincidentally, this plan was decided upon without input from his teacher, or guidance from “texts and lectures.” At first, and for some time, this was a largely conceptual process:

I’d think: “major triad on the second note of the scale,” then “diminished on the third and a repeat for the next,” doing hosts of calculating and guidance operations of this sort in the course of play. (Sudnow 2001:43)

And in due course, gestalts began to emerge at the level of the sequence, rather than appearing solely at the level of the event:

A small sequence of notes was played, then a next followed, now again. As the abilities of my hand developed, I found myself for the first time coming into position to begin to do such melodic work with respect to these courses. (Sudnow 2001:43)

The emergence of these gestalts is more or less equivalent to what Sudnow describes as “the emergence of a melodic intentionality”:

…an express aiming for sounds… was dependent in my experience upon the acquisition of facilities that made it possible. Motivated so predominantly toward the rapid course, frustrated in my attempts to reproduce recorded passages, I had left dormant whatever skills for melodic construction I may have had, and it wasn’t as though in my prior work I had been trying and failing to make coherent note-to-note melodies. The simplest sorts of melody-making entailed a note-to-note intentionality that had been extraordinarily deemphasized by virtue of the isolated ways in which I’d been learning.

It’s precisely in this emerging capacity to form fully articulated phrases that the performer achieves a degree of competence. Though not yet a native speaker of

the language, there is nonetheless a fledgling facility for forming coherent sentences.

Dreyfus’ characterization of the “Proficient” stage is particularly interesting in terms of the Heideggerean opposition between the present-at-hand and the ready-to-hand:

Suppose that events are experienced with involvement as the learner practices his skill, and that, as the result of both positive and negative experiences, responses are either strengthened or inhibited. Should this happen, the performer’s theory of the skill, as represented by rules and principles will gradually be replaced by situational discriminations accompanied by associated responses. Proficiency seems to develop if, and only if, experience is assimilated in this atheoretical way and intuitive behavior replaces reasoned responses. (Dreyfus 1996:20)

These “situational discriminations” and “intuitive behavior” point explicitly to the mode of “absorbed coping” that is definitive of the ready-to-hand. And it’s precisely in the ready-to-hand that “experience is assimilated”; i.e. it is embodied by the experiencing subject. With an increase in embodied skill, then, there is also an increase in the ratio of ready-to-hand to present-at-hand modes of engagement:

As the brain of the performer acquires the ability to discriminate between a variety of situations entered into with concern and involvement, plans are intuitively evoked and certain aspects stand out as important without the learner standing back and choosing those plans or deciding to adopt that perspective. Action becomes easier and less stressful as the learner simply sees what needs to be achieved rather than deciding, by a calculative procedure, which of several

possible alternatives should be selected. There is less doubt that what one is trying to accomplish is appropriate when the goal is simply obvious rather than the winner of a complex competition. In fact, at the moment of involved intuitive response there can be no doubt, since doubt comes only with detached evaluation of performance. (Dreyfus 1996:21)

The “Proficient” stage is, however, still comprised of a generous quota of moments characterized by a mode of “detached evaluation”, i.e. the present-at-hand. And it’s interesting to note the way in which this can directly conflict with “intuitive behavior”:

No sooner did I try to latch onto a piece of good-sounding jazz that would seem just to come out in the midst of my improvisations, than it would be undermined, as, when one first gets the knack of a complex skill like riding a bicycle or skiing, the bicycle seemed to do the riding by itself, then several revolutions of the pedals occur, you try to keep it up, the very first attempt to sustain an easeful management undercuts it, and it disintegrates. You struggle to stay balanced, keep failing, and the bicycle seems to go off on its own. Yet there’s no question but that the hang of it was glimpsed, and essence of the experience was tasted with a “this is it” feeling, like a revelation. (Sudnow 2001:76)

What we see is the paradigmatic Heideggerean “breakdown”, or, more specifically, the catalyst that effects the shift from a ready-to-hand to a present-at-hand mode of perceiving the task environment. The occurrence of such breakdowns is directly related to the number and type of skills the performer has managed to assimilate in the course of interactions with the environment up to the moment in question. That is, to the number and type of skills the performer has not managed to assimilate:

The proficient performer simply has not yet had enough experience with the wide variety of possible responses to each of the situations he or she can now discriminate to have rendered the best response automatic. For this reason, the proficient performer, seeing the goal and the important features of the situation, must still decide what to do. To decide, he falls back on detached, rule-based determination of actions. (Dreyfus 1996:22)

What distinguishes the “Proficient” stage from the “Competent” stage is a shift to a yet higher level of articulational scale, from the level of the individual phrase or sentence to the level of, perhaps, a discussion or argument. Sudnow also uses a linguistic analogy:

From a virtual hodgepodge of phonemes and approximate paralinguistics, a sentence structure was slowly taking form, themes starting to achieve some cogent management, sayings now being attempted, courses of action were being sustained that faded and disintegrated into stammerings and stutterings, connectives yet to become integrally part of the process. (Sudnow 2001:56)

It’s these “connectives”—“a way of making the best of things continuously (Sudnow 2001:59)”—that gradually fall into place over the course of sustained practice. What distinguishes the “Proficient” stage from the “Expertise” stage, however, is the continuity of the discourse—a continuity that, in the case of proficiency, is rendered discontinuous by the intrusion of breakdowns. But at the same time, with this falling into place, and with the embodiment of ever more refined responses to the dynamical contingencies of the environment, the occurrence of breakdowns—that is, the solicitation of self-conscious thought, and the catalyst of “stammerings and stutterings”—becomes increasingly seldom.

I’ve already suggested that a capacity for continuous intuitive interactional response to environmental dynamics is definitive of what Dreyfus describes as the “Expertise” stage. The lessons learned from breakdowns during the “Proficient” stage, then, have enabled the expert performer to respond to the same conditions from which those breakdowns emerged in a timely and unselfconscious manner. But Dreyfus also points to a greater refinement to these responses than there is to the variety of responses that are typical during the “Proficient” stage:

The expert not only knows what needs to be achieved, but also knows how to achieve the goal, based on mature and practiced situational discrimination. A more subtle and refined discrimination ability is what distinguishes the expert from the proficient performer. (Dreyfus 1996:25)

More specifically, he suggests that discriminating ability and a continuity of response are necessarily linked criteria of expertise:

With enough experience with a variety of situations, all seen from the same perspective but requiring different tactical decisions, the proficient performer gradually decomposes this class of situations into subclasses, each of which share the same decision, single action, or tactic. This allows the immediate intuitive response to each situation which is characteristic of expertise, with further discrimination among situations all seen as similar with respect to plan or perspective distinguishing those situations requiring one action from those demanding another. (Dreyfus 1996:25)

Actions are perceptually guided, the performer is immersed in the activity, and the “I think” is supplanted by an “I can”:

pacings. Certain right notes played in certain right ways appeared just to get done. though my fingers went to places to which I didn’t feel I’d specifically taken them. unlike others I’d seen. intensities. In light of the apparent discontinuities of skill acquisition. a saying particularly said in all of its detail: its pitches. in suitable performance circumstances. accentings—a saying said just so. in which cognitive unfolding is indicated as continuous over time. 96 . in a little strip of play that’d go by before I got a good look at it.9. seemingly because of something I was doing. In figure 3. I could hear a bit of that language being well spoken. enable the experience of flow. (Sudnow 2001:78) At this point in the discontinuous unfolding of skill acquisition.I’d see a stretch of melody suddenly appear. durations. the temporal dimension is segmented into discrete blocks corresponding to Dreyfus’ five stages of skill acquisition. in fact for the very first time.8. (Sudnow 2001:76) With the refinement of dispositional abilities. there also emerges a parallel refinement of articulational fluency: I could hear it. the performer embodies perceptual. it may be worth revising the diagram of figure 3. actional and cognitive capacities that. could recognize that I’d done a saying in that language.

[Figure 3.9: The temporal axis is segmented into five blocks—1. Novice; 2. Advanced beginner; 3. Competence; 4. Proficient; 5. Expertise—above the circular loop joining HUMAN BODY and INSTRUMENT through the maps I and R, with SKILL unfolding over Time.]

Figure 3.9. A detailed view of enactive performance practice, encompassing the discontinuous unfolding of skill acquisition. I represents the map from human intentionality to the instrument, while R represents the map from the instrument’s reactions back to the human. “Skill” is indicative of cognitive, motor and perceptual skills, as well as the capacity for coordination among the three components in both reflective and unreflective behavior.

“Skill” replaces “Cognition” in this diagram, where “skill” can be said to encompass cognitive, motor and perceptual skills. It is also indicative of the developing capacity for coordination between all three. A more accurate model yet might indicate the changing nature of human body/instrument relations over each of the five stages of skill acquisition, but as it stands, the diagram of the continuous and circular human/instrument interaction loop is sufficiently general to be applicable at each of the stages.
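The discontinuous shape of this account can also be sketched in code. The following is a hypothetical illustration only: the breakdown threshold, the adaptation rate, and the division of practice into five equal blocks are all assumptions, loosely mirroring the staged diagram rather than modelling Dreyfus’ stages in any strict sense.

```python
import random

def practice_with_breakdowns(steps=5000, adapt_rate=0.02, threshold=0.25, seed=7):
    """Toy sketch of discontinuous skill acquisition (cf. figure 3.9).

    The loop is a circular body-instrument coupling; whenever the perceived
    discrepancy exceeds a threshold we count a 'breakdown' (a shift from
    ready-to-hand to present-at-hand). All parameters are assumptions.
    """
    rng = random.Random(seed)
    instrument_response = 0.9     # assumed fixed instrument dynamics
    skill = 0.0                   # embodied expectation of the instrument's reaction
    block_size = steps // 5       # five blocks, loosely echoing the five stages
    breakdowns_per_block = []
    count = 0
    for step in range(steps):
        intention = rng.uniform(-1.0, 1.0)
        reaction = instrument_response * intention
        error = reaction - skill * intention       # perceived discrepancy
        if abs(error) > threshold:                 # breakdown: instrument present-at-hand
            count += 1
        skill += adapt_rate * error * intention    # incremental assimilation
        if (step + 1) % block_size == 0:
            breakdowns_per_block.append(count)
            count = 0
    return breakdowns_per_block

blocks = practice_with_breakdowns()
print(blocks)  # breakdowns cluster early, then become increasingly seldom
```

Under these assumptions, breakdowns cluster in the earliest block of practice and fall away thereafter, which is the qualitative shape of the account given above; a less idealized model would, as the text notes, also change the nature of the body/instrument relation itself from block to block.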

Sudnow’s account in Ways of the Hand is representative of what I have termed an enactive performance practice. But there is nothing particularly extraordinary about the way in which his skills were acquired. Given an able body (and therefore an innate capacity for perception, action and cognition), an intentionality (e.g. to become an improvising jazz pianist, to produce coherent sequences of notes, etc.), and a sufficiently responsive instrument (e.g. a piano), any human subject might follow an analogous course. In Sudnow’s case, these three prerequisites to enactive performance practice came for free. But my argument has been that in the case of performance with digital musical instruments, something fundamental is missing; i.e. a sufficiently responsive instrument. A sufficient responsiveness is synonymous with what I have referred to as resistance. And it’s precisely the kind of resistance that an instrument affords to the intentioned, embodied agent that will determine whether or not that instrument has the kind of immanent potential that would lead to an enactive performance practice. Kinds of instrumental resistance, then, will be a major focus when the discussion turns to issues of implementation in Chapter 4.

3.6 Conclusion

I began this chapter with a discussion of the inevitable paradox in any description of direct experience. The model of enactive performance practice—an attempt at such a description—brings the discussion squarely back to this fundamental, instinctive, and largely unreflective way in which humans, through the agency of their bodies, relate to the world. This raises the question: if unreflective behavior is so fundamental to human experience, why go to the trouble of detailing so many of its particularities? Why not let that which will happen as a matter of course, happen as a matter of course? Both Heidegger and Merleau-Ponty viewed their work as opposed to the mechanistic underpinnings of canonical Western philosophy. In their respective analyses of mundane, everyday, unreflective activity, there is an agenda to replace the Cartesian model of subjectivity with that of the embodied agent at large in the world. I suggested, earlier in the chapter, that a reversal of the Cartesian axiom constitutes the first concern of the phenomenological project. The mechanistic and the phenomenological discourses, then, are fundamentally at odds. And to the extent that technical discourse continues to hinge on the discourse of mechanistic philosophy, it also continues to be resistant to phenomenology. My concern, then, has been with outlining a model of human experience and activity that serves as an alternative to the model routinely adopted by technical designers, i.e. that of the perpetually disembodied Cartesian subject. If it is in fact possible to design and build digital musical instruments that allow for enactive processes to be realized, then we will have done nothing other than arrive right back at the most fundamental form of human agency.


4 Implementation

4.1 Kinds of Resistance

There are two key assumptions that underlie the enactive model of interaction: 1. that human activity and behavior have rich, structured dynamics, and 2. that the kinds of resistance that objects offer to humans in the course of activity are key to the on-going dynamical structuring of interactional patterns. In the previous chapter, I was concerned with describing the interactional patterns of an enactive performance practice with a view to the implications of those patterns for cognition; focus was directed at the dynamics of human activity and behavior. In this chapter, focus is directed at the kinds of resistance that a candidate digital musical instrument might offer to a human performer in the midst of performative activity. The underlying concern, then, shifts from theory to implementation.

I have suggested previously in the essay that conventional acoustic instruments, because of the resistance they offer to the performer, serve as useful examples of technical objects that embody the potential for enaction. But in the huge diversity of mechanisms that we see across the range of acoustic instruments, there is a proportionate diversity in kinds of resistance. The physical feedback to the performer that arises in the encounter between bow and string, for example, is of a different kind to that which comes of the projection of breath into a length of tubing. We can assume, then, that in much the same way that the contingencies of human embodiment play a determining role in the dynamical emergence of performative patterns, so too do the contingencies of instrumental embodiment. This makes the task of arriving at a universal template for the design of enactive musical instruments a profoundly complex, if not obviously impractical undertaking. In the various models of interaction that I schematized in the previous chapter, the maps from human motor function to computer input devices, and from computer output devices to human sensory input, are non-specific in terms of the particular sensorimotor mechanisms that are activated in the course of interaction—the models are intended to be as general and universal as possible. But as soon as we move from interaction diagrams to real world implementations, a higher degree of specificity is required. If, for example, a candidate model for an enactive digital musical instrument were to remain general, there would need to be an account of the myriad ways in which human energy might be transduced as signals at the computer inputs. In the context of the present study, rather than attempting to compile a comprehensive catalogue of implementational possibilities, I will focus on one particular real world implementation: a digital musical instrument that also happens to represent my first serious attempt at engaging the essay’s key theoretical issues in the form of an actual device. This device, as with any musical instrument, offers unique kinds of resistance to the performer. The final component of the study sets out, then, to detail the instrument’s implementational specifics, with a view to the various ways in which its indigenous and particular kinds of resistance may or may not lend themselves to the development of an enactive performance practice.

Standard human-computer interaction models partition the computer into three distinct layers: input devices, programs and output devices. This is the model I employed in the interaction diagrams of Chapter 3, and I will stick with that model here.1 It is at the level of hardware, after all, that the performer actually physically engages the instrument. It would seem likely, then, that the greater portion of attention would be directed towards input and output devices, i.e. hardware, when the core concern is how the candidate instrument is resistant to the human performer. But as I pointed out in Chapter 1, digital instruments constitute a special class of musical devices: their sonic behavior is not immanent in their material embodiment, but rather must be programmed. The dynamical behavior and resistance of the instrument is to a large degree encapsulated in its programs, and so, in the pages that follow, while hardware certainly constitutes more than a passing concern, I will direct a significant amount of attention to issues of software. This is particularly true because software components may at any time in the future need to be integrated into different implementational contexts, such as a new hardware framework. In persisting with the standard division between hardware and software, I hope also to demonstrate the utility of keeping the two layers separate in the design process: I nonetheless hope to make it apparent that in maintaining a loose coupling between hardware and software components, the potential for reusing those components is increased. In that case, any one particular software framework brings with it a certain modest degree of generality. And to the extent that the framework continues to evolve across distinct implementations, we may also see the beginnings of—if not a universal approach to the design of enactive digital instruments—one that is at least suitably general and robust.

1 For an interesting counter example to this approach, see Cook (2004), where hardware and software may in a certain variety of cases be inextricable.

4.2 Mr. Feely: Hardware Overview

A device that goes under the name of Mr. Feely represents my first attempt at the implementation of an enactive digital musical instrument (figure 4.1).

Figure 4.1. Mr. Feely.

Mr. Feely’s computational nucleus resides on a miniature x86 compatible motherboard, running the Linux 2.6 kernel, with patches applied for low latency audio throughput and for granting scheduling priority to real-time audio threads. One of the design goals was to create a silent instrument with no moving parts inside the enclosure. For that reason, the operating system resides on flash memory, and a specific motherboard/chipset combination was chosen because of its capacity for fanless operation. Eight channel audio A/D and D/A hardware, MIDI A/D and D/A boards, and power conversion modules are located in the same enclosure as the motherboard.

Integration and Instrumentality

Sukandar Kartadinata has used the term “integrated electronic instruments” to denote a class of devices characterized by an encompassing approach to their material realization (Kartadinata 2003). “Encompassing” is used here in its most literal sense: all of the components of which the instrument is comprised—the input devices, the output devices, and the internal circuitry—are encompassed within a single physical entity. This is in sharp contrast to the sprawl of individual devices and cables that characterizes “the often lab-like stage setups built around general purpose computers” (Kartadinata 2003:180). Kartadinata notes that total integration is not ubiquitous among conventional acoustic instruments—e.g. the bow is a distinct physical entity from the body of the violin—but “total integration” is not really the point of an integrated approach. Rather, emphasis is placed on the coherence of the instrument; that is, on how the material embodiment affords a performative encounter with a unity.

Integration and coherence of the instrumental embodiment were important factors in the design of Mr. Feely. From the outset, I had in mind that it was of critical importance that the instrument should have an instrumentality. This is suggestive of two different interpretations, both of which figured in my approach to design, and both of which factor in the perceived coherence of the instrument to the performer: 1. that the instrument in its material embodiment should be indicative of a specific purpose, and 2. that the instrument should have the feel of a musical instrument. It may appear redundant to suggest that an instrument should be instrumental, but it seemed to me a useful way of distinguishing the project from those in which the instrument comprises a general purpose, off-the-shelf computer (with or without an attendant array of peripheral input devices). Figure 4.2 shows Mr. Feely in the playing position. The instrument is designed to rest in the lap of the performer; because of the instrument’s weight, it is secured on a stand.

This aspect of the design is tied in—in the most literal sense—with the aim that the instrument should feel like a musical instrument. The playing position ensures that there is constant physical contact between performer and instrument, as weight is transferred from the upper body to the thighs. In the act of playing, the contact with the instrumental body is intensified by hand actions at the control surface. The sense of the instrument’s physically “being there,” then, is proportional to the amplitude of the human’s motor energy output.

Figure 4.2. Mr. Feely in the playing position.

But there is another aspect to this “being there,” and this is tied in with the way in which the instrument is indicative of its use. The control surface is situated at the performer’s centre of gravity, and it is angled (with respect to the performer) in such a way that it presents itself optimally to the hands, and occupies the focal ground in the field of vision. It’s not just that the instrument is “there,” but—to paraphrase Michael Hamman—that it is “so very there” that the opportunity for action, for physically engaging the controls, makes itself more than readily apparent.

An important aspect of the instrument’s instrumentality, then, is that the interface is devoid of representational abstractions. In keeping with Rodney Brooks’ dictum that “the world is its own best model” (Brooks 1991), I avoided any graphical representations of the sound or its generating mechanisms at the interface, giving preference to the performer’s perceptions of the sound itself, and the cross-coupling of these perceptions with the tactile and visual engagement of the instrument and its input devices. This means that the instrument is intended to be nothing but a musical instrument, and that it therefore need not accommodate the multiple representational paradigms required of a multiplicity of possible usages. Mr. Feely is a special purpose device. The way in which Mr. Feely’s interface is different to that of the computer-as-it-comes—a general purpose device—is equivalent, then, to the “considerable difference between using the real world as a metaphor for interaction and using it as a medium for interaction” (Dourish 2001:101).

Control Surface

Three classes of input device are used on Mr. Feely’s control surface: knobs, buttons and joysticks. The control surface is partitioned into distinct regions (figure 4.3), which are distinguished by the points in the audio synthesis system to which they are linked. I will detail the specific functional behaviors and mapping strategies used to connect the input devices to the audio system in 4.3; although this unavoidably touches on software issues, the functional layout of the panel is a hardware concern. It is, however, worth noting the control surface’s basic partitioning scheme in this section.

Figure 4.3. Mr. Feely: Control surface partitioning scheme. The labelled regions are: Display & Patch Control, Joysticks, Variants, Mute Buttons, Global Volume, Power On/Off, Channel Section, and Global Section.

Of the eight distinct regions that comprise the control surface, four would ordinarily be utilized only between periods of performative activity: those labelled “Display & Patch Control,” “Mute Buttons,” “Global Volume,” and “Power On/Off” in figure 4.3. The Display and Patch Control section is described under Visual Display below; the functions of the other three sections are self-explanatory. The four remaining control surface regions—labelled “Channel Section,” “Joysticks,” “Global Section,” and “Variants” in figure 4.3—indicate the areas in which activity is focused during performance.

The Channel Section is partitioned into five discrete channels of three knobs and one button each; these are respectively mapped to five discrete audio synthesis networks in the software system. The Joystick Section is comprised of two x-y joysticks, one of which springs back to its centre position when not in use. These joysticks are considered freely assignable to any and multiple input points in the discrete synthesis channels or the global processing network. The Global Section is divided into two subsections, which respectively comprise nine knobs, and three knobs combined with three buttons. These controllers are mapped to a global audio processing network, and in certain cases to points in the five discrete channels. Signals from each of the five discrete synthesis channels are passed as inputs to this processing network. The Variants Section is comprised of six backlit buttons. These buttons are used to switch between pre-stored variants in the synthesis network; when one of these buttons is toggled on, all other buttons will be in their off state. These variants may differ by synthesis parameter settings, by mapping functions, or by synthesis network topologies.
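The switching behavior of the Variants Section can be sketched in a few lines. The following Python model is purely illustrative—a toy rendering of the radio-button behavior described above, not code from the instrument itself:

```python
# Toy model of the Variants Section: six backlit buttons, where toggling one
# button on puts every other button in its off state.

class VariantsSection:
    def __init__(self, n_buttons=6):
        self.n = n_buttons
        self.active = None                 # index of the lit button, if any

    def toggle_on(self, i):
        self.active = i                    # every other button falls to "off"

    def states(self):
        # backlight state of each button, in panel order
        return [i == self.active for i in range(self.n)]

v = VariantsSection()
v.toggle_on(2)
v.toggle_on(4)   # selecting variant 4 switches variant 2 off automatically
```

What each variant actually recalls—parameter settings, mapping functions, or a different network topology—is a software matter, taken up in the sections that follow.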

then. sufficiently 111 . but rather learns through performing. reinforces the relationship between specific functional regions and specific functional behaviors to both the visual and tactile senses. in fact they should be optimally adaptable. The performer. in itself. how things work in practice. by partitioning the control surface into functional regions. then the instrument is not. as a whole. by employing a static functional structure across different patches—that is. from a base set of functional correspondences. the user quickly adapts to the relationship between a cluster of controls and clusterings of associated behavioral patterns at the instrument’s output. The emphasis is placed on motor memory as opposed to the conscious storing of data. it would seem that the performer has rather a lot to remember during performance. properly or sufficiently indicative of its use.With all these individual input devices and multiple mapping systems. This means that motor patterns do not need to be relearned from scratch from one patch to the next. The layout of the control panel is designed to facilitate this learning process. and the underlying software system is designed in such a way that motor memory should be transferable and adaptable across varying audio subsystem implementations. The physical layout of the control surface. Secondly. This is not. across even radically divergent implementations of the sound generating subsystem. however. then. Firstly. is not required to store a catalogue of controller functions and mappings in conscious memory. The control surface is still. And if the performer is required to store such data in conscious memory. across varying implementations of the underlying audio synthesis networks—the patterning of the instrument’s behavior remains relatively constant.

It does not. except as a notification mechanism in the case of such a breakdown. I discussed the cost to the nonvisual senses of the visuocentric approach to interaction as typified by the computer-as-it-comes. and to monitor data in the case of “breakdowns” (e. It was my aim that the degree of resistance should be neither so minimal that the interface would become quickly transparent to motor memory and activity. but with a view to rendering the interface as free of abstraction as possible. it would remain beyond grasp. CPU overload. therefore.). program exceptions. Audio Display An important aspect of the “feel” of many conventional acoustic instruments is the haptic feedback to the performer from the instrument’s vibrating body as it radiates sonic energy. This is something that I tried to avoid in the design of Mr. electronic 112 . however. which is used to navigate a patch bank between performances. The display is not intended to be used during performance. it may be directed to the guidance of motor activities. Unlike conventional acoustic instruments. even after a significant amount of practice.g. make any demands on the performer’s attention. or so great that. etc. memory errors. not only with a view to minimizing the cognitive demands of visual attention. Visual Display In chapter 2. and to the extent that vision is required for the performance task. Feely. It proved useful.complex and multifaceted as to offer resistance to learning. to integrate a character display with the control surface.

The effect varies with the character of the sound. and the type and number of reflective and absorbtive material in proximity to the loudspeaker. system). This issue was taken into consideration in the design of Mr. the type of floor surface. its frequency and loudness. in the case of the great majority of acoustic instruments. could not practically be integrated with the body of the instrument.instruments require the use of amplifiers and loudspeakers in order to propagate sound in space. but unfortunately. electronic instruments are lacking in the haptic vibrational feedback that is characteristic of their acoustic counterparts. it was outweighed by other constraints: 1. This speaker placement has one other advantage: the location of the point source of the sound—which. the radiation of vibrational energy can be felt through the feet and. Feely.A. in deciding upon an amplifier/loudspeaker system. the torso. by careful positioning of the amplifier/loudspeaker in performance. By placing the amplifier/loudspeaker on the floor. This limited the options among available technologies. as close as is practical to the body of the instrument.g. that the loudspeaker should have a wide radiation pattern. through a P. Except in the case that the amplifier/loudspeaker system is built into the instrumental body. Nonetheless. that the amplifier be powerful enough for the instrument to be used without further amplification (e. because of its size and weight. and resulted in the choice of a combined amplifier/loudspeaker system that. it’s possible to go a certain way towards the “feel” of a conventional instrument. to a lesser extent. The perceptual localisation of the origin of the sound is an important indicator of the 113 . is the instrument’s body—is as close as is practical to the body of the instrument. and 2.

Secondly. Whereas the computer-as-it-comes would situate the user’s attention in a world of metaphorical abstraction and would provide no guarantee of 114 . then. and sequential (as opposed to parallel) mode of interaction that is idiosyncratic to the computer-as-it-comes. Feely offers resistance to the performer without having paid due attention to software.instrument’s phenomenal presence. motocentric rather than visuocentric. Nonetheless. both for the performer. that the hardware interface to Mr. This means that the performer engages an instrument that has a functional coherence to its material embodiment as well as a tangible physical presence in performance. singular (as opposed to distributed). Firstly. and for the emergence of an enactive performance practice. The interface is. and the audience. and to point to some implications for embodiment. it may be useful to recap on the key aspects of the hardware implementation. These factors contribute to the potential for an encounter with the instrument that is engaging (one of the five criteria of embodied activity from Chapter 1). and encompasses multiple distributed points of interaction. the instrument is integrated and instrumental. it also avoids the associated costs of that model for interaction. representation-hungry. At the same time. This stands in contrast to the visuocentric. Feely avoids the interface model of the computer-as-it-comes. fellow performers. then. the instrumental interface affords distributed motor activities without the burden of representational abstractions. Summary It would be premature to evaluate the ways in which Mr.

multimodal..4). we touch on issues of adaptation and cognition. that embodied activity be situated. Feely affords embodied modes of interaction. with the resistance that it offers to the performer. Such issues are tied in with the instrument’s behavior.meeting timing constraints (see 2.e. 115 . As a piece of hardware. encourages the parallel distribution of the activity across distinct sensorimotor modalities (touch and proprioception. Mr. specifically. and the unique dynamical patterning of thought and activity that comes of that resistance. hearing. These factors again correspond to certain of the five criteria of embodiment. When the focus is shifted from the instantaneous aspects of embodied activity to embodiment as an emergent phenomenon. This brings the discussion around to the implementation of the instrument’s sonic behavior in software.. But to get from interaction to realization—i. Feely situates the user’s attention directly within the activity. and timely.e. Mr. vision). to the emergence of an enactive performance practice—the instrument will be required to offer resistance to the performer through the medium of sound. i. and—because of the distributed and multiply parallel nature of the performative mode—offers a reasonable chance that the real-time constraints of musical performance might be met.

4.3 Mr. Feely: Software Overview

Mr. Feely’s software system is written in the SuperCollider programming language.2 The language was chosen for three main reasons: 1. it is object-oriented; 2. it is easily extensible with user-defined modules, primitives, and plug-ins; and 3. it is mature and offers a rich set of built-in features. As the main focus of my work has been directed at the creation of a system that would allow for dynamical behaviors, much of the task of programming has involved the incremental development of a framework—an integrated library of extensions to the language—that augments the base audio synthesis architecture with modules that allow for complex dynamical mappings between system entities. The implementational possibilities of these extensions to the language will comprise the main focus of this and the next section. First, however, it will be useful to describe the base architecture on which the framework is built.

SuperCollider Server Architecture

The SuperCollider audio synthesis engine passes signals between nodes on a server, where those nodes represent instances of user-defined synthesis and processing functions. A sample signal flow diagram would look familiar to anybody who has worked with modular synthesis systems (figure 4.4).

2 http://www.audiosynth.com

Figure 4.4. SuperCollider synthesis server: Signal flow. Signals pass between nodes, and from the nodes to the sound output.

A node on the synthesis server may contain parameter slots. For example, a node that represents an oscillator function may contain slots for frequency, phase and amplitude parameters. The values of a parameter slot may be set by sending messages to the node to which the slot belongs, or by mapping the parameter slot to the output of a bus (figure 4.5).

Figure 4.5. Writing values to a node’s parameter slots by 1. sending a message, and 2. mapping the slot to the output of a bus.
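The two ways of writing to a parameter slot can be modeled outside SuperCollider. The following Python sketch is purely illustrative—the class and method names (`Node`, `set`, `map`) are my own shorthand for the behavior just described, not the SuperCollider API:

```python
# Toy model of a server node: a slot holds the last message-set value,
# unless it has been mapped to a bus, in which case it tracks the bus.

class Bus:
    def __init__(self):
        self.value = 0.0

class Node:
    def __init__(self, **slots):
        self.slots = dict(slots)     # parameter slots, e.g. freq, phase, amp
        self.maps = {}               # slot name -> Bus it is mapped to

    def set(self, **kwargs):         # 1. set a slot by sending a message
        self.slots.update(kwargs)

    def map(self, slot, bus):        # 2. map a slot to the output of a bus
        self.maps[slot] = bus

    def read(self, slot):
        if slot in self.maps:        # a mapped slot follows its bus
            return self.maps[slot].value
        return self.slots[slot]      # otherwise, the last message-set value

osc = Node(freq=440.0, phase=0.0, amp=0.1)   # an oscillator-like node
osc.set(freq=220.0)                          # message: slot now holds 220.0
lfo_bus = Bus()
osc.map('freq', lfo_bus)                     # mapping: slot now follows the bus
lfo_bus.value = 330.0
```

Once a slot is mapped, message-set values are shadowed by the bus; this is the property the mapping framework described below relies on.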

A bus is a virtual placeholder for a signal. It’s possible, for example, to tap an output signal from any node in the synthesis network and route it to a bus, from which the signal could be rerouted as an audio signal input to any other node, or mapped to a parameter slot belonging to any other node (figure 4.6). SuperCollider’s bussing architecture allows for the flexible routing of signals within the synthesis network.

Figure 4.6. Signal routing between parallel synthesis networks using busses. Bus 1 taps an output signal from a node in the first channel and routes it to the audio input of a node in the second channel. Bus 2 taps an output signal from a node in the second channel and maps it to a parameter slot of a node in the first channel.

This flexibility is exploited and extended in the extensions to the language that form the basis of Mr. Feely’s mapping framework.
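The routing of figure 4.6 can be sketched with plain functions standing in for nodes. The node behaviors and signal values here are invented for illustration only:

```python
# Toy model of figure 4.6: two parallel channels connected by two busses.

def ch1_node(audio_in, gain):
    return audio_in * gain       # channel 1 node: 'gain' is a parameter slot

def ch2_node(audio_in):
    return audio_in + 0.25       # channel 2 node: processes its audio input

src = 0.5                        # source signal entering channel 1
bus1 = ch1_node(src, 1.0)        # Bus 1: taps channel 1's output...
ch2_out = ch2_node(bus1)         # ...and routes it to channel 2's audio input
bus2 = ch2_out                   # Bus 2: taps channel 2's output...
ch1_out = ch1_node(src, bus2)    # ...and maps it to a parameter slot in channel 1
```

The same tapped signal can thus serve either as audio input or as a parameter value, depending only on where the bus is routed.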

Mapping Framework

The mapping framework that I have developed for Mr. Feely is primarily concerned with providing a flexible and intuitive mechanism for routing signals between components of the audio synthesis network, and for defining functional mappings between them. A functional mapping can be taken to mean the transfer function from the output of one component to the input of another; that is, a function that is applied to the signal such that the signal’s characteristics are transformed between output at the source component and input at the receiver component. The mapping framework consists of a hierarchical library of such functions encapsulated within discrete software objects. The objects that perform these transformations comprise the mapping layer. As I noted in the previous section, any signal within the audio synthesis network may be routed to a bus, and rerouted from that bus to any other point in the network. In Mr. Feely’s mapping framework, the functional transformation of the signal takes place between the bus and the signal’s destination. The mapping layer allows for the flexibility to route the signal at a single bus to multiple destinations with multiple functional mappings (figure 4.7). This is an example of a “one-to-many” (Wanderley 2001) mapping model. The behavior of the instrument as a whole is in large part determined by these functions and their various mappings and routings within the audio synthesis network.
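The one-to-many model of figure 4.7 reduces to a simple shape in code. In this illustrative Python sketch, the transfer functions and slot names are invented; the point is only that one bus fans out through several transformations:

```python
# Illustrative one-to-many mapping: one bus, three destinations, each
# reached through its own transfer function in the mapping layer.

def x(v): return v * 2.0          # e.g. scaling
def y(v): return v + 1.0          # e.g. offsetting
def z(v): return max(0.0, v)      # e.g. clipping below zero

# the mapping layer: (destination slot, transfer function) pairs
mapping_layer = [('freq', x), ('amp', y), ('phase', z)]

bus_signal = -0.5                 # current value of the signal at the bus
slots = {slot: fn(bus_signal) for slot, fn in mapping_layer}
```

A single gesture at the bus thus arrives at each destination with a different character.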

Figure 4.7. The signal at a bus is split into three signals. Each signal is subject to a functional transformation (those transformations denoted here as x, y and z) between the bus and its respective parameter slot destination. The transformed signals are routed to three different parameter slots, effecting a one-to-many mapping. The software objects that perform these transformations comprise the mapping layer.

The mapping framework also allows for the “cross-coupling” (Hunt, Wanderley, and Paradis 2003) of bus signals, or “many-to-one” (Wanderley 2001) mappings (figure 4.8).

Figure 4.8. The signals at two busses are subject to functional transformations (x and y). The transformed signals are summed, or “cross-coupled,” resulting in a mapping from multiple signal sources to a single parameter slot.
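The converse, many-to-one case of figure 4.8 can be sketched just as briefly; again the functions and values are invented for illustration:

```python
# Illustrative many-to-one ("cross-coupled") mapping: two busses, two
# transfer functions, one destination parameter slot.

def x(v): return v * 0.5          # transform applied to the signal at bus 1
def y(v): return v * 2.0          # transform applied to the signal at bus 2

bus1, bus2 = 0.4, 0.1             # current values at the two busses
slot_value = x(bus1) + y(bus2)    # transformed signals are summed into one slot
```

Two independent sources—two knobs, say, or a knob and an internal signal—thereby converge on a single parameter.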

Additionally, the mapping framework allows for what I have termed “function-parameter” mappings, where the output of one functional mapping may be mapped into a parameter slot in another (figure 4.9). For example, function x in figure 4.9 might scale the output of the signal at BUS 1 into the range [1, 10], while function y might multiply the output value of the signal at BUS 2 by the value of an argument, where that argument is set at a parameter slot. When the output of x is mapped into the parameter slot that corresponds to the multiplicand argument of y, the signal at BUS 2 is multiplied by the scaled signal at BUS 1. This is a simple example, but it makes clear the kinds of complex interdependencies between system components that “function-parameter” mappings allow.

Figure 4.9. The signals at two busses are subject to functional transformations (x and y). The output of function x is mapped into a parameter slot in function y. The output of the dependent function y is then mapped to a parameter slot in an audio synthesis network component.
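The worked example from the text translates directly into a hedged Python sketch—x scales BUS 1 into [1, 10], and that output fills the multiplicand slot of y:

```python
# Illustrative "function-parameter" mapping, following the example in the
# text. The numeric bus values are invented.

def x(v):
    return 1.0 + 9.0 * v            # scale a [0, 1] signal into [1, 10]

def make_y(multiplicand):
    # y multiplies its input signal by an argument held in a parameter slot
    return lambda v: v * multiplicand

bus1, bus2 = 0.5, 0.2
y = make_y(x(bus1))                 # output of x fills y's parameter slot
out = y(bus2)                       # BUS 2 multiplied by the scaled BUS 1
```

Because y's behavior now depends on the live output of x, the two mappings are no longer independent—the interdependency that the text identifies as characteristic of this mapping type.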

Mr. Feely’s hardware controls are connected to the audio synthesis network through busses (figure 4.10). Analog signals are read by an analog-to-digital converter (ADC) and written to a bus in the audio synthesis network. The signal at the bus may be treated as though it were any other signal, and can be routed to any point in the synthesis network.

Figure 4.10. The map from hardware to software. The signal arriving at the ADC is written to a bus, from which it may be routed, through functional transformations (x, y and z), to points in the synthesis network.

While all busses in the audio synthesis system are instances of a single class of bus, and therefore have identical implementations, they are nonetheless classified as having either local or global scope. In this scheme, the scope of a bus corresponds to the function of the input device as defined by the partitioning of Mr. Feely’s control surface into functional regions. Busses that are placeholders for signals arriving from Mr. Feely’s hardware controls are accorded either local or global scope, depending on the particular input device to which they are connected. All busses that are placeholders for signals routed from audio signals have global scope.

Channel Section controllers, for example, are connected to busses that have local scope within each of the five discrete audio synthesis network channels, while Global Section controllers are connected to busses that have global scope (figure 4.11). The scope of the Channel Section busses is local, i.e. they may only be routed to the corresponding audio synthesis network channels.

Figure 4.11. Local and global scope of busses. Busses L1.1-3 and L2.1-3 are connected to Channel Section controllers 1 and 2 on Mr. Feely’s control panel; busses G1 and G2 have global scope. The output of the audio synthesis channels is summed and sent to a global processing network.
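The scoping rule can be stated as a one-line routing check. The class below is an illustrative Python model of the rule as I read it from the text, not part of the actual framework:

```python
# Sketch of local vs. global bus scope: all busses share one implementation,
# but local busses may only be routed within their owning channel.

class ScopedBus:
    def __init__(self, scope, channel=None):
        self.scope = scope            # 'local' or 'global'
        self.channel = channel        # owning channel, for local busses

    def can_route_to(self, destination):
        # destination: a channel number, or 'global' for the processing network
        if self.scope == 'global':
            return True               # global network or any discrete channel
        return destination == self.channel

L1_1 = ScopedBus('local', channel=1)  # a Channel Section bus for channel 1
G1 = ScopedBus('global')              # a Global Section bus
```

The implementations are identical either way; scope is a constraint on routing, not a difference in the bus itself.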

Global busses G1 and G2 are connected to Global Section controllers on Mr. Feely’s control panel. The scope of these busses is global, i.e., they may be routed to the global processing network, or to any of the discrete audio synthesis channels.

Busses have a special status in the mapping framework. They are placeholders for signals that originate both outside and inside the audio synthesis network, and therefore represent the points at which human action and internal mechanism coincide. It was a deliberate design choice to accord busses this dual role, as a transparency to the source of signals within the system effectively blurs the implementational boundary between human and instrumental behaviors. That is to say, signals are treated as equivalent whether their origins are external or internal to the system, and this equivalency of signals implies that all signal flow networks are formed at the same level of structure.

To this point, the simple mapping schemes I have illustrated have not demonstrated models of dynamical behavior. The only difference, then, between the mapping scheme of figure 4.11 and that of a linear summing mixer is that the bussing architecture in the figure shows the possibility of a flexible routing of control signals to individual parameter slots in the various mixer channels. The dynamical behavior of the system as a whole would, nonetheless, appear to be relatively flat. The “push-and-pull” of dynamical forces that is key to the instrument’s resistance, however, is encapsulated in the structure and behavior of a single integrated signal flow network. Consider a system, for example, where the outputs from

two discrete audio synthesis networks are routed to global busses, and then back to parameter slots within the discrete networks (figure 4.12). The output of channel 1, after undergoing functional transformation, is used to regulate the internal behavior of channel 2, and vice versa.

Figure 4.12. Discrete audio synthesis networks are coupled to form an interacting composite network. Global busses A1 and A2 serve as placeholders for the output signals of channels 1 and 2. These signals are transformed by functions, x and y, and the continuous outputs of those functions are routed to parameter slots in the discrete channels.
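The coupling of figure 4.12 can be caricatured in a few lines: each channel’s output, passed through a mapping function, sets a parameter of the other channel on the next tick. The oscillator-like “channels,” the choice of mapping functions, and all constants here are placeholder assumptions, a sketch of the topology rather than of the actual synthesis code.

```python
import math

# Hypothetical sketch of figure 4.12: two channels cross-regulate one
# another via global busses A1/A2 and mapping functions x and y.

def x(a2):                         # mapping function on the signal at A2
    return 0.5 + 0.5 * abs(a2)    # scale into the range [0.5, 1.0]

def y(a1):                         # mapping function on the signal at A1
    return 0.5 + 0.5 * abs(a1)

def run(steps=100):
    p1 = p2 = 1.0                  # parameter slots fed by y and x
    out = []
    for n in range(steps):
        # each channel is a toy oscillator whose amplitude parameter is
        # regulated by the other channel's (transformed) output
        ch1 = p1 * math.sin(0.11 * n)
        ch2 = p2 * math.sin(0.07 * n)
        p1, p2 = y(ch2), x(ch1)    # A1 <- ch1, A2 <- ch2, then remap
        out.append((ch1, ch2))
    return out

trace = run()
```

Even in this toy form, neither channel’s behavior can be predicted without reference to the other: the regulation is a property of the composite network’s topology, not of either channel alone.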

In this example, the output of channel 1 is routed back to a parameter slot in channel 2, and vice versa, rather than being summed (as in figure 4.11). The structure of the network—i.e., its topology—creates a coupling between the two discrete audio synthesis networks. The way in which the bussed signals act as regulatory mechanisms in the respective synthesis networks is defined by the mapping functions, indicated in figure 4.12 as x and y. These functions might encapsulate any number of behaviors. They might, for example, map the audio signal unaltered into the parameter slot; scale the audio signal to an effective range; track the signal’s frequency or amplitude characteristics, scale it to an effective range and map the resulting signal to the slot; and so on. Any of these choices would create the possibility for complex behavioral dependencies between the two synthesis networks, and at the same time, the possibility for nonlinear dynamical behaviors in the composite (coupled) system. The output signals from the two channels, then, could be used to regulate one another’s behavior; where they had previously formed uncoupled autonomous systems, they now form coupled nonautonomous systems.

Summary

From the perspective of either of the discrete networks in figure 4.12, behavioral patterns are determined in part by signals that originate outside the network, i.e., internal behavior is nonautonomous. From the perspective of a human observer, however, the composite network (comprised of the two interacting subnetworks) could be said to be autonomous, as it operates, and exhibits behavior, without human intervention. This presents an interesting design

problem: we want the instrument to have rich, structured dynamics, but at the same time, we want those dynamics to emerge in the coupling of the instrument to a human performer, such that when there is a “push-and-pull” of physical forces at the hardware layer, the instrument responds and resists with proportionately rich and varied sonic behavior. So, then, although we could engineer a system that exhibits dynamical behavior without human involvement, the kind of system that is more compelling with a view to enactive performance practice would be one that, rather than exhibiting autonomous dynamical behavior, embodies the potential for dynamical behavior when coupled to a human performer. This does not rule out the kind of model encapsulated in figure 4.12; in fact, this model forms the basis of the first usage example I will outline in the next section. It does, however, call for calibration of the system—a “tuning” of the system’s dynamical responsiveness—when human action enters the equation. But the question remains as to how one might go about calibrating the system in such a way that it requires human action. I’ll take up this issue by outlining two specific usage examples.

In summary, the mapping framework allows for the creation of complex interdependencies between system components. And these interdependencies are key to the “push-and-pull” dynamics that define the instrument’s kinds of resistance.

4.4 Mr. Feely: Usage Examples

Overview

In this section I outline two examples of Mr. Feely in use. I have chosen these specific examples because of their differences: because they point to different kinds of resistance, to different modes of embodied activity, and to different realizational potentialities. The two usage examples are interesting, then, because their differences illustrate the ways in which diverse implementations might highlight distinct facets of a single basic concern: enactive performance practice. At the present writing, the first model is in an early stage of development, while the second is relatively mature.

Example 1: Pushing the envelope

Figure 4.13 illustrates an extension of the interacting composite network of figure 4.12. As in figure 4.12, the output of channel 1 is mapped via a global bus to a parameter slot in channel 2, and vice versa, in such a way that the two discrete audio synthesis networks regulate one another’s behavior in a manner determined by the output of the functions x and y. The example in figure 4.13 departs from that of figure 4.12, however, through the addition of two local busses, L1.1 and L2.1.

Figure 4.13. Functional covariance. Local busses L1.1 and L2.1 are placeholders for signals from Mr. Feely’s Channel Section. These signals are mapped to parameter slots of mapping functions internal to the composite audio synthesis network. This is an instance of “function-parameter” mapping, where the output of function a serves as a continuous input, or argument, to function x, and the output of function b serves as a continuous input to function y.

The local busses L1.1 and L2.1 provide the effective point of access to the system for human action. Rather than being mapped to parameter slots in the nodes that comprise the synthesis network, these busses are mapped to parameter slots of mapping functions that are internal to the system, i.e., they

represent “function-parameter” mappings. The functional mappings from the local busses L1.1 and L2.1—the busses that are shown in figure 4.13—are key to the dynamical responsiveness of this particular network, and it’s their role that I will focus on here.

The mappings illustrated in figure 4.13 represent just a partial view of the entire system. This network of mappings forms the basis of a performance scenario I’ve developed for Mr. Feely that goes under the working title “pushing the envelope,” which utilizes five discrete audio synthesis networks and assigns three local busses to each network, corresponding to the five channels of three knobs that comprise Mr. Feely’s Channel Section. The two busses per channel that are not shown in figure 4.13 are mapped to various parameter nodes in the respective discrete audio synthesis networks. These mappings vary across different implementations of the basic system, but in all instances map into continuous ranges as suitable to the synthesis parameter in question. The way in which the output signals of the coupled channels regulate one another’s behavior, then, is largely determined by the functional mapping from the local busses to the parameter nodes of the global busses, and is covariant with human action.

In the “pushing the envelope” model, the functions x and y (figure 4.13) represent composite functions: amplitude followers (on the signals at A2 and A1 respectively) modulated by the output of a logistic mapping function:

xn+1 = µxn(1 - xn)
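The behavior of the logistic map is easy to verify in a few lines. The sketch below iterates the map for a “flat” and a dynamically rich value of µ, and scales a normalized activity rate into a rich stretch of µ, a stand-in for the calibration of the functions a and b; the helper names and the specific endpoints are my own illustrative choices.

```python
def logistic_orbit(mu, x0=0.5, n=200):
    """Iterate x_{n+1} = mu * x_n * (1 - x_n) and return the orbit."""
    xs = [x0]
    for _ in range(n):
        xs.append(mu * xs[-1] * (1.0 - xs[-1]))
    return xs

# Below the period-doubling cascade the orbit settles onto the fixed
# point 1 - 1/mu ("flat" dynamics); deep into the cascade it is chaotic.
flat = logistic_orbit(2.9)
rich = logistic_orbit(3.87)

def activity_to_mu(rate, lo=2.9, hi=3.87):
    """Scale a normalized [0, 1] activity rate into a rich range of mu
    (a hypothetical stand-in for the calibration described in the text)."""
    rate = min(max(rate, 0.0), 1.0)
    return lo + (hi - lo) * rate
```

Plotting `flat` against `rich` makes the contrast plain: the first orbit converges to a single value, while the second wanders without settling, even though both come from the same one-line function.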

which results in the system as a whole having response 131 . the greater the amplitude of the resulting signal. that the more “active” the activity. the corresponding knobs in Mr. already embody the potential for complex dynamical behavior.1 and L1.The outputs of x and y are connected as level controls at the output stage of channels 1 and 2 respectively. The functions a and b in figure 4. The mapping functions x and y. where the dynamical contour of the modulated signals derived from A2 and A1 may be more or less chaotic or “flat” depending on the assignment of a constant value to µ. effecting a coupling between the two channels. The effective ranges of a and b are scaled to a dynamically rich range in µ (between 2.13 represent the slope (rate of change) of the signals at busses L1. It represents a simple nonlinear system. 1]). then. and is entirely unstable when µ is greater than 3. The parameter slots in the mapping functions x and y (figure 4. the response of which becomes increasingly chaotic when the value of µ is greater than 3. a range that encompasses the discontinuous transition from flat to chaotic dynamics through successive period doublings). essentially.9 and 3.2 respectively. As the outputs of a and b are effectively plugged into µ.13) represent the variable µ in the logistic mapping function. This means.e.87. the dynamical contour of the outputs of x and y are directly proportional to the rate of performer activity. Feely’s Channel Section— that are connected to the bus. The amplitude of this function’s output will vary proportionately with the rate of performer activity at the hardware controls—i. The logistic mapping function is interesting because the trajectory of its orbit varies with different values of the variable µ. This creates for a potentially very interesting mapping.87 (assuming values of x in the range [-1.

characteristics that vary dynamically with the “push-and-pull” of human motor actions. For example, an increase in the rate of left-right knob “twiddling” with respect to time (figure 4.14) will result in a proportionate increase in the “degree of chaos” in the outputs of functions x and y.

Figure 4.14. Left-right knob manipulation with respect to time.

In practice, the “pushing the envelope” model has certain interesting implications for performance. Firstly, without performer action, the response of the system is dynamically flat; to that extent, it requires a performer. Secondly, because of the way the system is calibrated—specifically the “tuning” of the logistic map variable µ in relation to the rate of change of motor activity—the system doesn’t just require the performer, it requires a considerable investment of performative energy: considerable physical effort on the part of the performer to elicit dynamically rich responses from the software system. Thirdly, the behavior of the system as a whole is far from

transparent at first use, and in fact demands significant experimentation before certain consistent patterns and responses begin to reveal themselves. The complexity of the system’s dynamical responsiveness is effectively guaranteed by the interdependencies of the five discrete audio synthesis networks, as encapsulated in the functional mappings from outputs in one channel to parameter nodes in another. The key implication of these interdependencies is that performative actions directed toward a single channel of controls will have consequences beyond the scope of the discrete audio synthesis network to which those controls are connected. That is to say, although the performer may place the focus of activity at any one moment within a specific channel—and the human anatomical constraint of two-handedness tends to determine this kind of pattern in performance—the effects of that activity will nonetheless be felt throughout the composite network comprised of all five channels, and particularly on the way that dynamical changes propagate through the composite network.

In my experience thus far with this system, I’ve found that it’s not possible to get an overall conceptual grasp on its range of behavior. Nonetheless, certain recurrent patterns of motor activity have begun to emerge, and these patterns are yielding varieties of sonic responsiveness that, at the same time that they continue to be more closely aligned to certain expectations, also continue to yield new and often surprising dynamical contours.

Example 2: Surfing the fractal wave (at the end of history)

In certain respects, there are parallels in the dynamics of the “pushing the envelope” network to the dynamics of many conventional acoustic instruments.

When there is no input of human energy, the instrument’s response is “flat.” And when human energy is transmitted to the system, the system’s dynamical responsiveness is proportionate to the amplitude of that energy. There is, then, a particular way in which the model requires the performer: it requires a “pushing”—a directed expenditure of kinetic energy—to actualize the dynamic potential that is immanent to the network.

The model I outline in this section—“surfing the fractal wave (at the end of history)”3—embodies an altogether different kind of resistance and affords an altogether different variety of motor activity. Where performance with conventional acoustic instruments ordinarily requires a “pushing” of kinetic energy into the instrumental mechanism in order to set things in motion, in the “surfing the fractal wave” model, things are already in motion in the instrumental mechanism. The mode of performance, then, is more concerned with giving dynamical shape and contour to these motions, i.e., an “absorbed coping” that is about the timely navigation of energy flows in the environment, rather than the directed transmission of energy flows that originate in the body. Hence the distinction between “surfing” and “pushing” analogies.

3 The name is borrowed from the title of a 1997 Terence McKenna lecture (http://www.abrupt.org/LOGOS/tm970423.html). My appropriation, however, has very little to do with McKenna’s original intention.

Patterns of motor activity in “surfing the fractal wave” are designed around the asymmetry of “handedness” (Guiard 1987), i.e., dominant and non-dominant

hands are afforded independent sub-tasks, but they cooperate in the accomplishment of the larger task that those sub-tasks comprise. Kabbash, Buxton and Sellen describe three characteristic ways in which the two hands are asymmetrically dependent in select everyday tasks:

1. The left hand sets the frame of reference for action of the right. For example, the left hand brings the painter’s palette in and out of range, while the right hand holds the brush and does the fine strokes onto the canvas.

2. The sequence of motion is left then right. For example, the left hand grips the paper, then the right starts to write with the pen.

3. The granularity of action of the left hand is coarser than that of the right. For example, in hammering a nail, the left hand holds the nail while the right does the hammering. (Kabbash, Buxton, and Sellen 1994:418)

Each of these examples could be viewed as aspects of a single embodied tendency, a tendency that is self-reinforcing across a wide range of activities and over repeated performances. Kabbash et al. advocate the design of human-computer interfaces that exploit the habitual ways in which humans tend to use their hands in skillful activity. The “surfing the fractal wave” model heads in this direction. Figure 4.15 represents a partial view of the “surfing the fractal wave” network model.

Figure 4.15. “Surfing the fractal wave” network model. The x and y outputs of a joystick with global scope (JSX, JSY) are mapped to parameter slots of a chaotic sequencer function (SEQ). The sequencer sends a stream of timed triggers to parameters in each of five discrete audio synthesis networks (for clarity, only two are shown). Local busses (C1.1-3 and C2.1-3) read signals from the knobs in Mr. Feely’s Channel Section. These controls “filter” the results of the mapping from the sequencer stream to each of the discrete audio synthesis networks. Some feedback networks, mapping functions and audio synthesis network schemata have been omitted for clarity. Joystick manipulations are always performed by the left hand. Knob manipulations are in most instances performed by the right hand.

The diagram divides the network space into left hand and right hand regions. In performance, the pads of the left hand fingers tend to “ride” the joystick, where certain gestural patterns emerge in response to the dynamical properties of the “function-parameter” mappings of the global busses JSX and JSY (placeholders for continuous signals from the x and y axes of the joystick, respectively) into the output of a chaotic sequencer (SEQ).4 The sequencer is calibrated in such a way that its output is more or less stable when the values of the mapping functions a and b are close to the centre of their effective ranges. In practice this means that when the joystick is in its centre position (the resting position for a “spring-back” style joystick), the sequencer clock outputs a steady stream of pulses, at a medium tempo, with a regular and stable amplitude pattern (figure 4.16).

Figure 4.16. Sequencer pulse stream when the joystick is in centre (“resting”) position.

4 The “chaotic” sequencer function is not technically chaotic (in mathematical terms). The designation can be taken to be qualitative.
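The qualitative behavior described here and on the following pages — a steady pulse stream at the joystick’s sprung centre, faster and louder pulses as JSX increases, and “nested” or irregular pulses as JSY increases — can be caricatured as follows. Everything in the sketch (names, constants, and the use of a seeded random generator as a stand-in for the internal finite state machine) is my own assumption, not the actual sequencer.

```python
import random

# Hypothetical sketch of the sequencer's qualitative behavior. jsx and jsy
# are joystick axes normalized so that 0.0 is the sprung centre position.

def pulse_stream(jsx, jsy, n=16, seed=0):
    """Return n-or-more (time, amplitude) pulses. jsx raises clock rate and
    base amplitude; jsy raises the probability of 'nested' extra pulses and
    of irregular amplitudes drawn from a stand-in state machine."""
    rng = random.Random(seed)
    period = 0.5 / (1.0 + jsx)        # left-to-right: faster clock...
    amp = 0.5 * (1.0 + jsx)          # ...and louder pulses
    t, pulses = 0.0, []
    for _ in range(n):
        a = amp
        if rng.random() < jsy:        # irregular amplitude pattern
            a *= rng.choice([0.25, 0.5, 0.75, 1.0])
        pulses.append((round(t, 6), a))
        if rng.random() < jsy:        # a 'nested' extra pulse
            pulses.append((round(t + period / 2, 6), a * 0.8))
        t += period
    return pulses

steady = pulse_stream(0.0, 0.0)      # centre position: regular and stable
```

With the joystick centred, the stream is perfectly periodic with constant amplitude; any deviation in either axis perturbs rate, level, or both, which is the “feel” space the left hand navigates.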

An increase in the value of both of these parameters (corresponding to a bottom-to138 . The parameter slot to which a is mapped represents a multiplication argument for the sequencer’s clock frequency and base amplitude. the degree of pulse “nestedness.The mapping functions a and b determine.17). Time R JSX L SEQ Figure 4. In short. An increase in the signal at JSX. the probability that successive values read from an internal finite state machine are mapped to the amplitude of the pulse stream. this single variable determines two aspects of the sequencer’s behavior: 1. however.” and 2. The parameter slot to which the mapping function b is connected represents a chaotic variable in the sequencer function. then—corresponding to a left-to-right movement across the joystick’s x axis—results in an increase in the pulse stream’s frequency and amplitude (figure 4. that deviations in the x and y axes of the joystick result in more complex behaviors in the pulse stream. Sequencer pulse stream when there is a left-to-right movement across the joystick’s x axis.17.

18. Time R JSX L T JSY B SEQ Figure 4. The increase in the signal at JSY results in a greater 139 . The output of the pulse stream shows the trajectory towards a higher “degree of chaos” over time. where pulse “nestedness” implies a greater likelihood of frequency multiplication from one pulse to the next (and therefore a greater likelihood of extra pulses being “nested” into the pulse stream). and where the irregularly patterned output of the internal finite state machine incrementally encroaches on the otherwise linear behavior of the amplitude mapping in the mapping function a (corresponding to the left-to-right movement across the joystick’s x axis).17.top movement in the joystick’s y axis) results in an increase in the system’s entropy. and a bottom-to-top movement across the y axis. Figure 4.18 adds a bottom-to-top movement in the joystick’s y axis to the left-to-right movement in the x axis illustrated in figure 4. Sequencer pulse stream when there is a left-to-right movement across the joystick’s x axis.

likelihood of “nestedness” in the pulse stream, and a greater likelihood of irregularities in amplitude patterns.

The perceptual guiding of left-hand actions in “surfing the fractal wave” is more integrated than figure 4.18 would suggest. While the joystick operates across two degrees of freedom—the x and y axes—the performer does not break the activity down into separate movements in two dimensions (as figure 4.18 would indicate). Rather, the performer guides the left hand through singular trajectories across a two-dimensional space. And it’s in these motions that a “feel” develops for the sequencer’s stable and chaotic regions, for the transitions between them, and for the shift from greater-to-lesser and lesser-to-greater degrees of event density with respect to time. But these motor patterns constitute only one part of the coordinated left hand/right hand movements that amount to “surfing the fractal wave.” And while it’s useful to break the activity down into left and right hand sub-tasks, there can be no complete picture without considering how these sub-tasks coordinate and cooperate.

The output of the chaotic sequencer is mapped to parameters in each of the five discrete audio synthesis networks. Each of the five synthesis networks implements a resonator function, where the pulses that are mapped into each network serve as excitors. While each of these networks encapsulates different dynamical responses, there are strong symmetries between their behaviors, and between the kinds of responses that right hand actions might elicit from each of the networks. These resonators embody different resonance models (with different dynamical responses), but there are certain

perceptual constants from one network to the next. High level “percepts” are symmetrical across each of the five channels, where each of those percepts corresponds to the same bus number assignment in each channel. Figure 4.19 shows the mapping from local busses to two of the five discrete audio synthesis channels. That is, “Gate” corresponds to busses C1-5.1, “Width” corresponds to busses C1-5.2, and “Resonance” corresponds to busses C1-5.3. The symmetry holds at the level of hardware, where rows of knobs in Mr. Feely’s Channel Section correspond to rows of busses in the diagram.

Figure 4.19. Perceptual symmetries in the functional mapping from busses to the audio networks across distinct channels. Percepts (“Gate,” “Width,” “Resonance”) are assigned to corresponding busses across each channel.

This has the

effect of similar classes of response being elicited from corresponding knobs in each of the five channels of Mr. Feely’s Control Section. Of course, these “percepts” require a symmetry in terms of the effect of functional mappings into each of the discrete audio synthesis networks if their particular perceptual qualities are to be discerned and distinguished.

The “Gate” mechanism is functionally identical across all five channels: turning the corresponding knob from left to right has the effect of allowing a greater number of pulses to pass through a gated input to each resonator. It acts, then, as an event filter on the pulse stream, where no pulses are passed to the resonator system when the gate’s value is zero, all pulses are passed when the gate’s value is one, and each pulse in the stream has a 0.5 probability of passing when the gate’s value is 0.5.

The implementation of the “Width” mechanism varies slightly from one channel to the next, but its effect is symmetrical: turning the corresponding knob from left to right has the effect of “loosening the elasticity” of each resonator, i.e., a tighter “elasticity” (implemented as a shorter impulse response in the delay lines in the resonator’s filterbank) will result in shorter output events, whereas these events will take on longer durations (correlating to the perception of having a greater temporal width) as the resonator’s “elasticity” is slackened.

The “Resonance” mechanism is the most varied in terms of implementation across the five channels. It is tied in specifically to parameter nodes in the resonator that change the resonator’s dynamical responsiveness, i.e., the resonant frequencies, their bandwidths, and the ways in which the filters that

comprise the resonator’s internal filterbank interact. Across all five channels, turning the “Resonance” knob from left to right tends to shift the dynamical response of the resonator increasingly towards distortion, self-oscillation and nonlinear behavior.

In the breakdown of right hand and left hand tasks in “surfing the fractal wave,” there is a correspondence to each of the three characteristic behaviors of bimanual asymmetric action that Kabbash et al. point out. It’s worth addressing each point in turn:

1. The left hand sets the frame of reference for action of the right. In the “surfing the fractal wave” model, left hand movements give contour to the dynamical unfolding of the pulse stream, while the right hand acts as an event filter on the stream, and a modifier of the dynamical properties of the events that emerge from pulses hitting the resonator functions. The pulse stream, as it unfolds, is the frame of reference for the “picking” and “shaping” of discrete events that characterizes right hand actions.

2. The sequence of motion is left then right. This follows from the first point: the right hand modifies the event stream only after the left hand has given the stream its dynamical contour. But unlike Kabbash et al.’s corresponding example (“the left hand grips the paper, then the right starts to write with the pen”), the respective actions form a continuous interplay of complementary motions—as opposed to a sequence of isolated events—and the transference from left-handed to right-handed motions takes place at a much finer granularity of temporal scale.

3. The granularity of action of the left hand is coarser than that of the right. In “surfing the fractal wave” the left hand is designated to control the joystick. These joystick manipulations do not require the hand to reposition itself across discrete points on the control surface, and they do not require grasping, turning, or other finger motions that are performed at a fine granularity of scale. It’s interesting to note that in the act of playing, my left hand will often span the distance from the joystick to the top row of knobs in Mr. Feely’s Channel Section, leaving the little finger to move the joystick through the two dimensional plane while the thumb and pointer finger turn the knobs. But even this action is of a coarser granularity than the actions designated to the right hand: actions that involve a constant “hopping” between the fifteen knobs that comprise the Channel Section, and finely detailed turnings and twiddlings of those knobs. This is consistent with Kabbash et al.’s corresponding example: “the left hand brings the painter’s palette in and out of range, while the right hand holds the brush and does the fine strokes onto the canvas.”

I’ve found that in playing with the model, left hand activities do not seem to require any conscious attention, while the right hand activities demand on-going and focused attention. That the dominant hand should be at the centre of attention in the midst of bimanual action is not a point that Kabbash et al. discuss, but it seems that my experience of this phenomenon with “surfing the fractal wave” might also apply to other activities, such as those Kabbash et al. discuss.

The two key aspects to the model of activity in “surfing the fractal wave,” then, are the “surfing” aspect, and the engineering of the interface around habitual

embodied patterns of “handedness.” It’s these aspects—or, more specifically, the entangling of these aspects in the midst of performance—that give the model its idiosyncratic kind of resistance.

In contrast to the “pushing the envelope” model, in which events are initiated when the performer transmits kinetic energy to the instrumental mechanism, the “surfing the fractal wave” model is built around a persistent stream of events. And these events can go by very fast. Motor patterns, then, emerge not only in the interdependencies between the two hands, but in the coordination of the hands with respect to timing constraints. It’s been interesting to note that, over the period of time that I’ve worked with this model, and as my hands have become both better coordinated and more individually dexterous, I’ve had a better capacity to deal with the system’s unfolding in a timely manner, and this in turn has led to a higher level of detail and nuance in both the shaping of individual sounds at the event level, and in the elaboration of larger scale events, such as phrases and gestures. This seems to me indicative of the coevolution of sensory, motor and cognitive competencies that is definitive of enaction.

Summary

The conventional metaphors of computer science tend to regard computation as an inherently sequential process, as a function from input to output comprised of a series of discrete and causally related steps, where the desired outcome of the function is known in advance of its execution. This is at odds with the enactive model of interaction, where activity takes place across a network of interacting components, and where the behavior of those components, and

however. Rather than falling back on the “computation-as-calculation” model. a given property of the system. computation would be viewed as a process in which “the pieces of the model are persistent entities coupled together by their ongoing interactive behavior (Stein 1999:483). or so ungraspable that they forever remain beyond motor and cognitive capability. An enactive digital musical instrument. the kinds of resistance that the instrument affords to the human remains a matter of how the infrastructure is utilized. is adaptive and emergent with respect to the ongoing push-and-pull of interactional dynamics. will depend on a fundamentally different view of computation to that of conventional computer science. Structural coupling is not. however. to a kind of physical model.therefore of the network as a whole.e. Feely’s software system. And the “right” kinds of resistances—at least with a view to structural coupling. and to that extent it also allows for a structural coupling of performer and instrument.” This model of “computation-as-interaction” underlies the design of Mr. The examples I’ve outlined in this section point. The system allows for human action to be folded into the dynamical processes of interacting network components. i. I suggested in chapter 2 that while there is much to be learned from the physical modeling of conventional acoustic instruments. while the software system provides the required technical infrastructure. in that they embody networks of dynamical 146 . the focus of my work is directed more towards the development of instrumental behaviors that are indigenous to computing media. realization and enaction—will be those that are neither so transparent to human action that they demand little thought or effort. a matter of design.. then.

dependencies in which human action is resisted by forces that are immanent to the software network. In a physical model of a conventional instrument, essentially, the simulated physics of resistance will be—in some way or other—functionally related to physical descriptions of real world behavior. But in contrast to physical models of conventional instruments, the virtual physics of these systems is speculative: the models are not based on data from real world measurements, or on differential equations that describe well known physical systems. Rather, they are evolved interactively through experimentation with various mapping and calibration schemes.

In the approach I’ve taken, then, design choices as to “kinds of resistance”—i.e., classes of behavior—are effectively decoupled from audio synthesis implementations. So, while the components of the audio synthesis network certainly continue to play a critical role in the instrument’s behavior, the focus of development is shifted to the mapping framework, leaving the designer free to experiment with any manner of sound-producing and processing components. I’m suggesting that it’s through this shift that we see the potential arise for what I have called an “indigenous” computer music.

The design of these systems, then, takes a middle course between normative and speculative modes of interactivity, i.e., between that which is familiar and that which is other to every day phenomenal experience. I’ve found that the “right” kinds of resistances continue to be those that are resonant with phenomenal experience and past practices of embodiment. If the balance between these two poles is apposite, then the kinds of resistance that the systems afford will be sufficiently rich in dynamical potential that, over a sustained period of time, the performer will continue to realize new practices and new ways of encountering the instrument.
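Stein’s contrast between “computation-as-calculation” and “computation-as-interaction,” as it figures in the design described in this summary, can be caricatured in a few lines: rather than a function run once for a known result, persistent entities update one another for as long as the coupling continues, with outside action folded into the process. The sketch is entirely schematic; the class, coefficients, and update rule are my own stand-ins, not Mr. Feely’s implementation.

```python
# Hypothetical sketch of "computation-as-interaction": persistent entities
# coupled by ongoing mutual influence, with no final 'result' to return.

class Entity:
    def __init__(self, state):
        self.state = state
        self.peer = None

    def step(self, external=0.0):
        # the new state depends on the entity's own state, the coupled
        # peer's state, and any external (e.g. performer) input
        peer_state = self.peer.state if self.peer else 0.0
        self.state = 0.6 * self.state + 0.3 * peer_state + external

a, b = Entity(1.0), Entity(-1.0)
a.peer, b.peer = b, a
history = []
for tick in range(50):          # the 'program' is the ongoing coupling
    a.step(external=0.1)        # a steady trickle of outside action
    b.step()
    history.append((a.state, b.state))
```

There is no privileged output here; what the “computation” produces is the evolving trajectory of coupled states, which is the sense in which human action can be folded into, rather than supplied to, the process.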

4.5 Prospects

The two usage examples I’ve outlined in this chapter demonstrate just a small number of possible approaches to engineering the kinds of resistance that digital musical instruments might store in potentia. In my work with Mr. Feely, as both designer and performer, it seems I’m still just scratching at the surface of these matters, and that there are a great many implementational possibilities yet to be uncovered. It also seems that at a certain point, these “uncoverings” will necessarily require the development of patterns, in both design and performance, that are of a higher order than those I’ve outlined to this point.

For design, this will likely be a matter of evolving a body of general principles that might be employed such that design knowledge can be added to incrementally. At first glance, this may appear to contradict my observation at the beginning of this chapter that the task of arriving at a universal template for the design of enactive instruments may be ultimately impracticable. But the issue I’m raising here is more directly concerned with arriving at general principles that operate at a higher level of abstraction than purely implementational concerns. The concern, rather, would lie with the way in which models might be generated from a consistent but open-ended application of principles that emerge from the interaction between philosophical and technical problematics. Rather than persistently hopping back and forth between philosophical and technical discourses, there would exist an evolving metric for balancing the constraints of one against the other in an integrated framework. This would be a kind of meta-design. At this point in my work, I can’t say for certain how one would go about putting such a framework together. But problems such as these are not without precedent in the history of design.5

The development of higher order patterns in performance is also a matter of balancing opposing constraints. It’s interesting to consider, for example, how these models might be interleaved in the context of the same performance. As I’ve been careful to make clear, the two usage examples I’ve outlined in this chapter embody very different kinds of resistance, and therefore afford very different varieties of human action. And while the two usage examples I outlined in the previous section might involve a certain degree of multitasking in and of themselves, there is a higher order of multitasking that could potentially encompass both models simultaneously. This kind of multitasking is part and parcel of expert musicianship. In considering the merging of the two models into a single integrated model, however, it would seem that they are in fact so different in playing technique as to be incompatible. Multitasking must necessarily involve some degree of compatibility between the actional patterns that comprise the sub-tasks. Again, then, the issue comes back to design. In designing for multitasking, it may prove useful to have in store some metric of actional distance between the kinds of motor activities that different models afford, and it seems to me a potentially very productive avenue of investigation. At the same time that this may eventually lead to more complex and diverse sonic utterances, it may also lead to a heightened sense of flow—of performative embodiment. The balancing of these constraints may prove to be difficult. But again, such approaches are not without precedent in design.6

With or without these higher order design methods, the products of design will invariably afford opportunities for action that were at no point factored into the design process. This has certainly been the case with conventional acoustic instruments—and is perhaps definitive of so-called “extended” techniques—and there’s no reason to assume that the situation should be any different for digital musical instruments. It’s been interesting for me to note that, the more I play with the “surfing the fractal wave” model, the more I’m able to isolate certain quirks and glitches in the system.7 These kinds of discoveries constitute an important aspect of the learning process, not just because they can be assimilated into the accumulating motor and sonic vocabulary, but because in certain cases they can lead to entirely new avenues of investigation—avenues that would have remained closed had the system been insulated from random environmental inputs in the first instance. There is a stochastic element in enactive process, and this element is accounted for in the contingencies of environmental dynamics. The glitch, then, is simply folded into the enactive model of interaction. Its appearance or suppression in performance becomes a matter for human intentionality. Either choice will lead to the appropriate refinement of actional dispositions.

5 For example, see Alexander ([1964] 1997).
6 For example, see Wild, Johnson and Johnson (2004).
7 It’s also interesting to note that, at least to this point, the “pushing the envelope” model has yielded no such interesting anomalies.
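A “metric of actional distance” could take many forms, and I propose no formula here; the following sketch is purely illustrative, with the feature set, the numbers assigned to the two models, and the weights all invented for the sake of example. Each interaction model is profiled by a few coarse motor-activity features, and the distance between profiles is taken as a weighted Euclidean norm:

```python
# Purely illustrative sketch: no formula for "actional distance" is proposed
# in the text, so the features and weights below are invented for the sake
# of example. Each interaction model is described by a coarse profile of
# motor-activity demands, normalized to the range 0..1.

import math

# Hypothetical feature profiles:
# (continuous control, bimanual coupling, gesture rate, pressure use)
MODELS = {
    "surfing_the_fractal_wave": (0.9, 0.7, 0.3, 0.8),
    "pushing_the_envelope":     (0.2, 0.3, 0.9, 0.1),
}

WEIGHTS = (1.0, 1.0, 0.5, 0.5)  # assumed relative importance of each feature

def actional_distance(a, b, weights=WEIGHTS):
    """Weighted Euclidean distance between two motor-activity profiles."""
    return math.sqrt(sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b)))

d = actional_distance(MODELS["surfing_the_fractal_wave"],
                      MODELS["pushing_the_envelope"])
print(round(d, 3))  # larger values suggest less compatible playing techniques
```

On such a scheme, a large distance between the two profiles would quantify the intuition, noted above, that the two models may be too different in playing technique to be readily multitasked.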

5 Groundlessness

Whatever comes into being dependent on another
Is not identical to that thing.
Nor is it different from it.
Therefore it is neither nonexistent in time nor permanent.
— Nagarjuna, Mūlamadhyamakakārikā XVIII:10

The important thing is to understand life, each living individuality, not as a form, or a development of form, but as a complex relation between differential velocities, between deceleration and acceleration of particles.
— Gilles Deleuze, Spinoza: Practical Philosophy

The main thing is that you forget yourself.
— Barbara McClintock

The structure of our language typically leads us to characterizations of interaction that focus on one side or the other of the interactional loop: “Humans use technologies,” “technologies determine humans,” and so on. These are, of course, the unavoidable products of a subject/object syntax, and my writing in this essay has not been immune to the lopsided characterizations of interaction that such products embody. But despite the inevitable linguistic constraints, I’ve sought to describe the inherent circularity of the continuous interactional unfolding that is definitive of enactive process: a process that is not concerned with subjects and objects, but with relations, linkages, heterogeneity, and the dynamic momentum of the emergent system that arises in the relations and linkages between heterogeneous elements.

One of the more radical outcomes of Varela, Thompson, and Rosch’s outline of an enactive cognitive science is the model of subjectivity that necessarily follows from enactive process. It’s precisely because enactive process concerns “the processual transformation of the past into the future through the intermediary of transitional forms that in themselves have no permanent substance” (Varela, Thompson, and Rosch 1991:116) that enactive theory necessarily implies a “groundless” or “selfless” self; i.e. a self with “no permanent substance,” a “subjectless subjectivity” (Deleuze and Guattari 1987). This is the non-self that appears in the experience of flow—in an unselfconscious, active and embodied participation in the dynamical unfolding of real time and space—and it’s the same non-self that vanishes the moment that attention is turned inward, and perception is geared towards abstract contemplation of the objectness of things in the world.

In this essay, I have not dealt with the epistemological or ontological implications of an enactive approach to design in any significant manner. But to my mind (however that may now be defined), it’s precisely these implications that are most critical when thinking about design, or when implementing implementations. At various points throughout the essay, I’ve invoked Heidegger’s use of the term “equipment.” In Heidegger’s terminology, an equipment is a tool that presents itself to human perception and intentionality as something-in-order-to. It would be easy enough to arrive at the conclusion that, in designing a digital musical instrument, we are designing something-in-order-to-perform-music; i.e. that it affords a particular utility. While the statement is obviously true, it is not, I think, a conclusion.

An enactive approach to digital musical instrument design would necessarily account for the realizational potential of the instrument: a potential which would lead to an incremental unfolding of relationality, and which at the same time would serve as the measure of the instrument’s resistance. And the epistemological and ontological qualities that the instrument embodies necessarily imply an ethics. If an implementation might afford the potential to undermine essentialist ways of being—i.e. if the performative way of being that it brings about is concerned with the unfolding of relations rather than the ordering of things—then I would say that the implementation in question has utility. In this respect, computers have a significant potential. The concern for design, then, is directed towards designing an encounter. Or, it’s directed towards designing something-in-order-to-not-be-some-thing.

Bibliography

Agre, P. 1997. Computation and human experience. Cambridge, UK: Cambridge University Press.
———. 1995. “Computation and embodied agency.” Informatica 19 (4):527-535.
———. 1996. “Computational research on interaction and agency.” In P. Agre and S. Rosenschein, ed. Computational theories of interaction and agency. Cambridge, MA: MIT Press, pp.1-52.
———. 2002. “The practical logic of computer work.” In Computationalism: New directions. Cambridge, MA: MIT Press, pp.130-142.
Alexander, C. [1964] 1997. Notes on the synthesis of form. Cambridge, MA: Harvard University Press.
Anderson, C. 2005. “Dynamic networks of sonic interactions: An interview with Agostino Di Scipio.” Computer Music Journal 29 (3):11-28.
Arbib, M., and J.-S. Liaw. 1996. “Sensorimotor transformations in the worlds of frogs and robots.” In P. Agre and S. Rosenschein, ed. Computational theories of interaction and agency. Cambridge, MA: MIT Press, pp.53-79.
Ashby, W. R. [1952] 1960. Design for a brain: The origin of adaptive behaviour. New York: John Wiley & Sons.
———. [1956] 1965. An introduction to cybernetics. London: William Clowes and Sons.
Bahn, C., T. Hahn, and D. Trueman. 2001. “Physicality and feedback: A focus on the body in the performance of electronic music.” In Proceedings of the 2001 International Computer Music Conference. San Francisco, CA: International Computer Music Association, pp.44-51.
Bailey, D. 1992. Improvisation: Its nature and practice in music. New York: Da Capo Press.
Barbaras, R. 1999. “The movement of the living as the originary foundation of perceptual intentionality.” In J. Petitot, F. Varela, B. Pachoud, and J.-M. Roy, ed. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. Stanford, CA: Stanford University Press, pp.525-538.

Bateson, G. 1980. Mind and nature: A necessary unity. New York: Bantam Books.
Beer, R. 1990. Intelligence as adaptive behavior: An experiment in computational neuroethology. San Diego, CA: Academic Press Professional.
———. 1996. “A dynamical systems perspective on agent-environment interaction.” In P. Agre and S. Rosenschein, ed. Computational theories of interaction and agency. Cambridge, MA: MIT Press, pp.173-216.
———. 1997. “The dynamics of adaptive behavior: A research program.” Robotics and Autonomous Systems 20:257-289.
———. 2000. “Dynamical approaches to cognitive science.” Trends in Cognitive Sciences 4 (3):91-99.
———. 2004. “Autopoiesis and cognition in the game of life.” Artificial Life 10:309-326.
Blum, T. 1979. “Herbert Brün: Project sawdust.” Computer Music Journal 3 (1):6-7.
Bongers, B. 2000. “Physical interfaces in the electronic arts: Interaction theory and interfacing techniques for real-time performance.” In M. Wanderley and M. Battier, ed. Trends in gestural control of music. Paris: IRCAM, pp.41-70.
Borgo, D. 2005. Sync or swarm: Improvising music in a complex age. New York: Continuum.
Bourdieu, P. 1977. Outline of a theory of practice. R. Nice, trans. Cambridge, UK: Cambridge University Press.
———. 1990. The logic of practice. Stanford, CA: Stanford University Press.
———. 1991. Language and symbolic power. G. Raymond and M. Adamson, trans. Cambridge, MA: Harvard University Press.
———. 1993. The field of cultural production: Essays on art and literature. New York: Columbia University Press.
Brooks, R. 1991. “Intelligence without representation.” Artificial Intelligence 47:139-159.
———. 1991. “New approaches to robotics.” Science 253 (5025):1227-1232.

Brooks, R. 1992. “Artificial life and real robots.” In F. Varela and P. Bourgine, ed. Toward a practice of autonomous systems: Proceedings of the first European conference on artificial life. Cambridge, MA: MIT Press, pp.3-10.
Brün, H. 1969. “Infraudibles.” In H. Von Foerster and J. Beauchamp, ed. Music by computer. New York: John Wiley and Sons.
Bruner, J. 1986. Actual minds, possible worlds. Cambridge, MA: Harvard University Press.
———. 1990. Acts of meaning. Cambridge, MA: Harvard University Press.
Burzik, A. 2003. “Go with the flow.” The Strad:714-718.
Buxton, W. 1986. “There’s more to interaction than meets the eye: Some issues in manual input.” In D. Norman and S. Draper, ed. User centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Lawrence Erlbaum Associates, pp.319-337.
Capra, F. 1996. The web of life: A new scientific understanding of living systems. New York: Anchor Books.
Cascone, K. 2000. “The aesthetics of failure: ‘Post-digital’ tendencies in contemporary computer music.” Computer Music Journal 24 (4):12-18.
———. 2002. “Laptop music - counterfeiting aura in the age of infinite reproduction.” Parachute (107):56.
———. 2003. “Grain, sequence, system: Three levels of reception in the performance of laptop music.” Contemporary Music Review 22 (4):101-104.
Casati, R. 1999. “Formal structures in the phenomenology of motion.” In J. Petitot, F. Varela, B. Pachoud, and J.-M. Roy, ed. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. Stanford, CA: Stanford University Press, pp.372-384.
Casserley, L. 2001. “Plus ça change: Journeys, instruments and networks, 1966-2000.” Leonardo Music Journal 11:43-49.
Chabot, X. 1993. “To listen and to see: Making and using electronic instruments.” Leonardo Music Journal 3:11-16.

Chadabe, J. 1997. Electric sound: The past and promise of electronic music. Upper Saddle River, NJ: Prentice Hall.
———. 2002. “The limitations of mapping as a structural descriptive in electronic instruments.” In E. Brazil, ed. Proceedings of the 2002 New Interfaces for Musical Expression Conference, pp.197-201.
Chiel, H., and R. Beer. 1997. “The brain has a body: Adaptive behavior emerges from interactions of nervous system, body and environment.” Trends in Neurosciences 20 (12):553-557.
Choi, I. 1995. “A manifold interface for a high dimensional control space.” In Proceedings of the 1995 International Computer Music Conference. San Francisco, CA: International Computer Music Association, pp.385-392.
———. 2003. “A component model of gestural primitive throughput.” In Proceedings of the 2003 New Interfaces for Musical Expression Conference, pp.201-204.
Church, A. 1932. “A set of postulates for the foundation of logic.” Annals of Mathematics, second series 33:346-366.
———. 1936. “An unsolvable problem of elementary number theory.” American Journal of Mathematics 58:345-363.
Clancey, W. 1997. Situated cognition: On human knowledge and computer representations. Cambridge, UK: Cambridge University Press.
Clark, A. 1995. “Moving minds: Situating content in the service of real-time success.” Philosophical Perspectives 9:89-104.
———. 1997. Being there: Putting brain, body and world together again. Cambridge, MA: MIT Press.
———. 2003. Natural-born cyborgs: Minds, technologies, and the future of human intelligence. Oxford: Oxford University Press.
Clark, A., and D. Chalmers. 1998. “The extended mind.” Analysis 58 (1):7-19.
Clarke, E. 1993. “Generativity, mimesis and the human body in music performance.” Contemporary Music Review 9:207-220.
Collins, N. 2003. “Generative music and laptop performance.” Contemporary Music Review 22 (4):67-79.

Cook, P. 2001. “Principles for designing computer music controllers.” In Proceedings of the 2001 New Interfaces for Musical Expression Conference.
———. 2004. “Remutualizing the musical instrument: Co-design of synthesis algorithms and controllers.” Journal of New Music Research 33 (3):315-320.
Csikszentmihalyi, M. 1991. Flow: The psychology of optimal experience. New York: Harper Perennial.
Cull, P. 2000. “The circularity of living systems: The movement and direction of behavior.” Journal of Applied Systems Studies 1 (1):51-65.
Damasio, A. 1994. Descartes’ error: Emotion, reason and the human brain. New York: Putnam.
De Certeau, M. 1984. The practice of everyday life. Berkeley, CA: University of California Press.
Dedieu, E., and E. Mazer. 1992. “An approach to sensorimotor relevance.” In F. Varela and P. Bourgine, ed. Toward a practice of autonomous systems: Proceedings of the first European conference on artificial life. Cambridge, MA: MIT Press, pp.88-95.
Deleuze, G. 1988. Spinoza: Practical philosophy. R. Hurley, trans. San Francisco: City Lights.
———. 1990. Expressionism in philosophy: Spinoza. M. Joughin, trans. New York: Zone Books.
———. 1991. Bergsonism. H. Tomlinson and B. Habberjam, trans. New York: Zone Books.
———. [1968] 1994. Difference and repetition. P. Patton, trans. New York: Columbia University Press.
Deleuze, G., and F. Guattari. 1983. Anti-Oedipus: Capitalism and schizophrenia. Minneapolis: University of Minnesota Press.
———. 1987. A thousand plateaus: Capitalism and schizophrenia. B. Massumi, trans. Minneapolis: University of Minnesota Press.
Di Scipio, A. 1994. “Formal processes of timbre composition: Challenging the dualistic paradigm of computer music.” In Proceedings of the 1994 International Computer Music Conference. San Francisco, CA: International Computer Music Association, pp.202-208.

Di Scipio, A. 2000. “Ecological modeling of textural sound events by iterated nonlinear functions.” In Proceedings of the 2000 Colloquium on Musical Informatics, pp.33-36.
Dietrich, E. 1990. “Computationalism.” Social Epistemology 4:135-154.
Dourish, P. 1999. “Embodied interaction: Exploring the foundations of a new approach to HCI.” http://www.ics.uci.edu/~jpd/publications/misc/embodied.pdf.
———. 2001. Where the action is: Foundations of embodied interaction. Cambridge, MA: MIT Press.
Dreyfus, H. 1991. Being-in-the-world: A commentary on Heidegger’s ‘Being and Time, Division I’. Cambridge, MA: MIT Press.
———. 1992. What computers still can’t do: A critique of artificial reason. Cambridge, MA: MIT Press.
———. 1993. “Heidegger’s critique of the Husserl/Searle account of intentionality.” Social Research 60 (1):17-38.
———. 1996. “The current relevance of Merleau-Ponty’s phenomenology of embodiment.” Electronic Journal of Analytic Philosophy (4). http://ejap.louisiana.edu/EJAP/1996.spring/dreyfus.1996.spring.html.
Emmerson, S. 2000. “’Losing touch?’: The human performer and electronics.” In S. Emmerson, ed. Music, Electronic Media and Culture. Aldershot: Ashgate Publishing, pp.194-216.
Evens, A. 2005. Sound ideas: Music, machines, and experience. Minneapolis: University of Minnesota Press.
Feenberg, A. 1991. Critical theory of technology. Oxford: Oxford University Press.
———. 1997. “Interpreting music technology: From Heidegger to subversive rationalization.” Sonus 18 (1):63-80.
———. 1999. Questioning technology. London: Routledge.
———. 2000. “From essentialism to constructivism: Philosophy of technology at the crossroads.” In E. Higgs, A. Light and D. Strong, ed. Technology and the good life? Chicago: University of Chicago Press, pp.294-315.
———. 2002. Transforming technology. Oxford: Oxford University Press.
Fishkin, K., A. Gujar, B. Harrison, T. Moran, and R. Want. 2000. “Embodied user interfaces for really direct manipulation.” Communications of the ACM 43 (9):75-80.
Fitzmaurice, G., and W. Buxton. 1997. “An empirical evaluation of graspable user interfaces: Towards specialized, space-multiplexed input.” In Proceedings of the 1997 SIGCHI Conference on Human Factors in Computing Systems, pp.43-50.
Fodor, J. 1983. The modularity of mind. Cambridge, MA: MIT Press.
Gallagher, S. 2005. How the body shapes the mind. Oxford: Oxford University Press.
Garnett, G. 2001. “The aesthetics of interactive computer music.” Computer Music Journal 25 (1):21-33.
Gibson, J. 1977. “The theory of affordances.” In R. Shaw and J. Bransford, ed. Perceiving, acting, and knowing: Toward an ecological psychology. Hillsdale, NJ: Lawrence Erlbaum Associates.
———. 1979. The ecological approach to visual perception. New York: Houghton-Mifflin.
Gillespie, B. 1999. “Haptics.” In P. Cook, ed. Music, cognition and computerized sound. Cambridge, MA: MIT Press, pp.247-260.
———. 1999. “Haptics in manipulation.” In P. Cook, ed. Music, cognition and computerized sound. Cambridge, MA: MIT Press, pp.261-276.
Giunti, M. 1997. Computation, dynamics and cognition. Oxford: Oxford University Press.
Goudeseune, C. 2001. Composing with parameters for synthetic instruments. Ph.D. diss., University of Illinois at Urbana-Champaign.
Greenfield, A. 2006. Everyware: The dawning age of ubiquitous computing. Berkeley, CA: New Riders Press.
Guiard, Y. 1987. “Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model.” Journal of Motor Behavior 19 (4):486-517.
Gunther, E., and S. O’Modhrain. 2002. “Cutaneous grooves: Composing for the sense of touch.” In E. Brazil, ed. Proceedings of the 2002 New Interfaces for Musical Expression Conference, pp.37-43.
Hamman, M. 1997. “Interaction as composition: Toward the paralogical in computer music.” Sonus 18 (1):26-44.

Hamman, M. 1999. “From symbol to semiotic: Representation, signification, interpretation, and the composition of music interaction.” Journal of New Music Research 28 (2):90-104.
———. 1999. “Structure as performance: Cognitive musicology and the objectification of procedure.” In J. Tabor, ed. Otto Laske: Navigating new musical horizons. Westport: Greenwood Press.
———. 2000. “From technical to technological: Interpreting technology through composition.” http://www.shout.net/~mhamman/papers/montpelier_2000.pdf.
———. 2000. “Priming computer-assisted music composition through design of human/computer interaction.” In N. Mastorakis, ed. Mathematics and computers in modern science: Acoustics and music, biology and chemistry, business and economics. Athens: World Scientific Engineering Society, pp.159-174.
———. 2002. “From technical to technological: The imperative of technology in experimental music composition.” Perspectives of New Music 40 (1):92-120.
———. “The technical as aesthetic: Technology and the composition of music interaction.”
Haraway, D. 1991. “A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century.” In Simians, cyborgs and women: The reinvention of nature. New York: Routledge, pp.149-181.
Haugeland, J. 1985. Artificial intelligence: The very idea. Cambridge, MA: MIT Press.
———. 2002. “Authentic intentionality.” In M. Scheutz, ed. Computationalism: New directions. Cambridge, MA: MIT Press.
Hayles, N. K. 1999. How we became posthuman: Virtual bodies in cybernetics, literature and informatics. Chicago: University of Chicago Press.
Heidegger, M. [1927] 1962. Being and time. J. Macquarrie and E. Robinson, trans. London: SCM Press.
———. [1949] 1977. “The question concerning technology.” In The question concerning technology and other essays. New York: Harper and Row, pp.3-35.
———. 1988. Basic problems of phenomenology. A. Hofstadter, trans. Bloomington, IN: Indiana University Press.
Hendriks-Jansen, H. 1996. Catching ourselves in the act: Situated activity, interactive emergence, evolution, and human thought. Cambridge, MA: MIT Press.
Hinckley, K., R. Pausch, D. Proffitt, and N. Kassell. 1997. “Cooperative bimanual action.” In Proceedings of the 1997 SIGCHI Conference on Human Factors in Computing Systems, pp.27-34.
Holland, J. 1992. Adaptation in natural and artificial systems: An introductory analysis with applications to biology, control, and artificial intelligence. Cambridge, MA: MIT Press.
———. 1995. Hidden order: How adaptation builds complexity. Cambridge, MA: Perseus Books.
———. 1998. Emergence: From chaos to order. Cambridge, MA: Perseus Books.
Honing, H. 2003. “Some comments on the relation between music and motion.” Music Theory Online 9 (1). http://smt.ucsb.edu/mto/issues/mto.03.9.1/.
Horkheimer, M., and T. Adorno. 1972. Dialectic of enlightenment. New York: Continuum.
Horswill, I. 1996. “Analysis of adaptation and environment.” In P. Agre and S. Rosenschein, ed. Computational theories of interaction and agency. Cambridge, MA: MIT Press, pp.367-396.
Hunt, A. 1999. Radical user interfaces for real-time musical control. Ph.D. diss., University of York.
Hunt, A., and R. Kirk. 2000. “Towards a model for instrumental mapping in expert musical interaction.” In Proceedings of the 2000 International Computer Music Conference. San Francisco, CA: International Computer Music Association, pp.209-212.
Hunt, A., M. Wanderley, and M. Paradis. 2003. “The importance of parameter mapping in electronic instrument design.” Journal of New Music Research 32 (4):429-440.
Hutchins, E. 1995. Cognition in the wild. Cambridge, MA: MIT Press.
Iazzetta, F. 1996. “Formalization of computer music interaction through a semiotic approach.” Journal of New Music Research 25 (3):212-230.
———. 2000. “Meaning in musical gesture.” In M. Wanderley and M. Battier, ed. Trends in gestural control of music. Paris: IRCAM.
Ihde, D. 1983. Existential technics. Albany, NY: State University of New York Press.
———. 1990. Technology and the lifeworld: From garden to earth. Bloomington, IN: Indiana University Press.
———. 1991. Instrumental realism: The interface between philosophy of science and philosophy of technology. Bloomington, IN: Indiana University Press.
———. 1993. Philosophy of technology: An introduction. New York: Paragon House.
———. 2002. Bodies in technology. Minneapolis: University of Minnesota Press.

Hutchins, E., J. Hollan, and D. Norman. 1986. “Direct manipulation interfaces.” In D. Norman and S. Draper, ed. User centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Lawrence Erlbaum Associates.
Jackendoff, R. 1987. Consciousness and the computational mind. Cambridge, MA: MIT Press.
Jacob, R., L. Sibert, D. McFarlane, and M. Mullen. 1994. “Integrality and separability of input devices.” ACM Transactions on Computer-Human Interaction 1:3-26.
Jaeger, T. 2003. “The (anti-)laptop aesthetic.” Contemporary Music Review 22 (4).
Johnson, M. 1987. The body in the mind: The bodily basis of meaning, imagination, and reason. Chicago: University of Chicago Press.
Jordà, S. 2002. “FMOL: Toward user-friendly, sophisticated new musical instruments.” Computer Music Journal 26 (3):23-39.
———. 2002. “Improvising with computers: A personal survey (1989-2001).” Journal of New Music Research 31 (1):1-10.
———. 2003. “Interactive music systems for everyone: Exploring visual feedback as a way for creating more intuitive, efficient and learnable instruments.” In Proceedings of the 2003 Stockholm Music Acoustics Conference.
Jordan, S. 2003. “The embodiment of intentionality.” In W. Tschacher, ed. Dynamical systems approach to cognition: Concepts and empirical paradigms based on self-organization.
Kabbash, P., W. Buxton, and A. Sellen. 1994. “Two handed input in a compound task.” In Proceedings of the 1994 SIGCHI Conference on Human Factors in Computing Systems, pp.417-423.
Karmiloff-Smith, A. 1992. Beyond modularity: A developmental perspective on cognitive science. Cambridge, MA: MIT Press.
Kartadinata, S. 2003. “The gluiph: A nucleus for integrated instruments.” In Proceedings of the 2003 New Interfaces for Musical Expression Conference, pp.180-183.
Kauffman, S. 1993. The origins of order: Self-organization and selection in evolution. Oxford: Oxford University Press.
Krefeld, V. 1990. “The Hand in the Web: An interview with Michel Waisvisz.” Computer Music Journal 14 (2):28-33.
Lakoff, G., and M. Johnson. 1980. Metaphors we live by. Chicago: University of Chicago Press.
———. 1999. Philosophy in the flesh: The embodied mind and its challenge to western thought. New York: Basic Books.
Lakoff, G., and R. Núñez. 2000. Where mathematics comes from: How the embodied mind brings mathematics into being. New York: Basic Books.
Lansky, P. 1990. “A view from the bus: When machines make music.” Perspectives of New Music 28 (2):102-110.
Laske, O. 1991. “Toward an epistemology of composition.” Interface 20:235-269.
Latour, B. 1993. We have never been modern. Cambridge, MA: Harvard University Press.
Leppert, R. 1993. The sight of sound: Music, representation, and the history of the body. Berkeley, CA: University of California Press.
Lidov, D. 1987. “Mind and body in music.” Semiotica 66 (1-3):69-97.
Loren, L., and E. Dietrich. 1998. “What it means to be ‘situated’.” Cybernetics and Systems 29:751-777.
Maes, P. 1992. “Learning behavior networks from experience.” In F. Varela and P. Bourgine, ed. Toward a practice of autonomous systems: Proceedings of the first European conference on artificial life. Cambridge, MA: MIT Press, pp.48-57.
———. 1996. “Exploiting patterns of interaction to achieve reactive behavior.” In P. Agre and S. Rosenschein, ed. Computational theories of interaction and agency. Cambridge, MA: MIT Press, pp.483-514.
Maturana, H., and F. Varela. 1980. Autopoiesis and cognition: The realization of the living. Dordrecht, Holland: D. Reidel Publishing Company.
———. 1987. The tree of knowledge: The biological roots of human understanding. Boston: New Science Library.
Merleau-Ponty, M. 1968. The visible and the invisible. A. Lingis, trans. Evanston, IL: Northwestern University Press.
———. [1945] 2004. The phenomenology of perception. London: Routledge.
Minsky, M. 1986. The society of mind. New York: Simon and Schuster.
Mumma, G. 1967. “Creative aspects of live electronic music technology.” In Proceedings of the 33rd National Convention of the American Audio Engineering Society. New York: American Audio Engineering Society.
———. 1974. “Live electronic music.” In J. Appleton and R. Perera, eds. The Development and Practice of Electronic Music. Englewood Cliffs, NJ: Prentice Hall.
———. “Notes on cybersonics: Artificial intelligence in live musical performance.” London: Guildhall Music and Drama Annual.
Nardi, B., ed. 1996. Context and consciousness: Activity theory and human-computer interaction. Cambridge, MA: MIT Press.
Nardi, B., and V. O’Day. 1999. Information ecologies: Using technology with heart. Cambridge, MA: MIT Press.
Ng, K. 2002. “Interactive gesture music performance interface.” In Proceedings of the 2002 New Interfaces for Musical Expression Conference.
Noë, A. 2004. Action in perception. Cambridge, MA: MIT Press.
Norman, D. 1986. “Cognitive engineering.” In D. Norman and S. Draper, ed. User centered system design: New perspectives on human-computer interaction. Hillsdale, NJ: Lawrence Erlbaum Associates.
———. 1988. The design of everyday things. New York: Basic Books.
———. 1999. The invisible computer: Why good products can fail, the personal computer is so complex, and information appliances are the solution. Cambridge, MA: MIT Press.
———. 1999. “Affordances, conventions and design.” Interactions 6 (3):38-43.
Ostertag, B. 2002. “Human bodies, computer music.” Leonardo Music Journal 12:11-14.
Pacherie, E. 1999. “’Leibhaftigkeit’ and representational theories of perception.” In J. Petitot, F. Varela, B. Pachoud, and J.-M. Roy, ed. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. Stanford, CA: Stanford University Press, pp.148-160.
Pachoud, B. 1999. “The teleological dimension of perceptual and motor intentionality.” In J. Petitot, F. Varela, B. Pachoud, and J.-M. Roy, ed. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. Stanford, CA: Stanford University Press, pp.196-219.
Paradiso, J. 1997. “Electronic music: New ways to play.” IEEE Spectrum 34 (12):18-30.
———. 2003. “Current trends in electronic music interfaces.” Journal of New Music Research 32 (4):345-349.
Pask, G. 1962. An approach to cybernetics. New York, NY: Harper.
Pask, G., and S. Curran. 1982. Micro man: Computers and the evolution of consciousness. New York: Macmillan.
Perkis, T. 1996. “Bringing digital music to life.” Computer Music Journal 20 (2):28-32.
Petitot, J., F. Varela, B. Pachoud, and J.-M. Roy. 1999. “Beyond the gap: An introduction to naturalizing phenomenology.” In J. Petitot, F. Varela, B. Pachoud, and J.-M. Roy, ed. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. Stanford, CA: Stanford University Press, pp.1-80.
Pfeifer, R., and P. Verschure. 1992. “Distributed adaptive control: A paradigm for designing autonomous agents.” In F. Varela and P. Bourgine, ed. Toward a practice of autonomous systems: Proceedings of the first European conference on artificial life. Cambridge, MA: MIT Press, pp.21-30.
Pinker, S. 1997. How the mind works. New York: W. W. Norton.
Prem, E. 1997. “Epistemic autonomy in models of living systems.” In P. Husbands and I. Harvey, ed. Fourth European conference on artificial life. Cambridge, MA: MIT Press, pp.2-9.
Preston, B. 1993. “Heidegger and artificial intelligence.” Philosophy and Phenomenological Research 53 (1):43-69.
———. Representational and non-representational intentionality: Husserl, Heidegger, and artificial intelligence. Ph.D. diss., Boston University.
Prévost, E. 1995. No sound is innocent. Essex: Copula.
Varela. B. 1988. S.

The next generation. 1992. Rosenschein. J. Kaelbling. CA: William Kaufmann Inc. “Improvisation with George Lewis. “Saving intentional phenomena: Intentionality. J. representation and symbol. ———. Petitot. ———.” Contemporary Music Review 22 (4):11-22. ed.541-596. 1994.” In. “Effort and expression. Cambridge. S. “Computationalism . MA: MIT Press.-M. ed. Roy. San Francisco.1-21. CA: Stanford University Press. “Some remarks on musical instrument design at STEIM. 2002. Schloss. San Francisco. and L.” In Computationalism: New directions. Varela.” In B. 1989. ed. CA: International Computer Music Association. Gombrecht and K. J. W. Roads. T. pp.” Contemporary Music Review 6 (1):3-17. 1991.. CA: International Computer Music Association. 1985. “A situated view of representation and control. 168 . Proceedings of the 1992 International Computer Music Conference. S. C. Riethmüller. Pachoud. pp. 1999. Stanford. pp.Reddell. Strange. L.” In Proceedings of the 1989 International Computer Music Conference.-M. Cambridge. MA: MIT Press. Agre and S. “Using contemporary technology in live performance: The dilemma of the performer. Pfeiffer. “Laptopia: The spatial poetics of networked laptop performance. R.257-259. 2001. ———. 1993. Computational theories of interaction and agency.” Journal of New Music Research 32 (3):239-242. “The matter of music is sound and body-motion. Rosenschein. 2003. pp.” Journal of New Music Research 31 (2):119-129. Stanford. M.” In P.111-147.” In H. CA: Stanford University Press. P. Sapir. Roy and F. 2002. Cambridge. A. Scheutz. Rowe. Materialities of communication. MA: MIT Press. “Active music representations. pp. Los Altos. U. Machine musicianship. Ryan. J. A. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. pp. MA: MIT Press. Cambridge. 1996.148-156. ed. 2003.414-416. Interactive music systems: Machine listening and composing.” In A. “Gestural control of digital audio environments.

J. Scheutz. P. 1992. J. Shove.” Computer Music Journal 16 (3). Philadelphia: John Benjamins. C.” Computer Science Education 8 (2):118-129. Suchman. MA: MIT Press. Sudnow. L. A. C. C. Cambridge. “An alternative to a standard taxonomy for electronics and computer instruments. pp. Sheets-Johnstone. M. B. 2001. 1999. On the origin of objects. “Creating sustained tones with the cicada’s rapid sequential buckling mechanism. “Intentionality naturalized?” In B. W. Roy and F.. pp. L. 2001.K.” Cybernetics and Systems 30 (6):473-507. Small.23-58.: Cambridge University Press. “The foundations of computing.Shannon. and J. 1998.” In J.” In M. Cambridge. Computationalism: New directions. Cambridge. H. MA: MIT Press. Smith. Cambridge.” In. Rink. 2003. Smith. Stuart. MA: MIT Press. Cambridge. UK: Cambridge University Press. C. pp. 2002. “What we’ve swept under the rug: Radically rethinking CS1. CA: Stanford University Press. Ways of the hand: A rewritten account. “Challenging the computational metaphor: Implications for how we think. ed. “The object of performance: Aural performativity in contemporary laptop music. Musicking: The meanings of performing and listening. ———. Smyth. ed. NH: Wesleyan University Press.-M. The mathematical theory of communication. D. IL: University of Illinois Press. “Musical motion and performance: Theoretical and empirical perspectives. 2002. The primacy of movement. Stein. Urbana. L. ———. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. “Plans and situated actions: The problem of human-machine communication.55-83. MA: MIT Press. Petitot. 2000. The sciences of the artificial. 1996. 1999. ed.” Contemporary Music Review 22 (4):59-65. 1949.” Paper read at Proceedings of the 2002 New Interfaces for Musical Expression Conference. D. Smith. T. 1987. Spiegel.83-110. Cambridge. Stanford. Simon. 1998. 1995. A. U. Varela. Hanover. Pachoud. 169 .

———.69-99. D. 2000. “Time scale dynamics and the development of an embodied cognition. “Grounded in the world: Developmental origins of the embodied mind. ed. and L. ed.” Journal of the Acoustical Society of America 91 (6):3540-3550. and P. Cambridge. Princeton. Tschacher. pp. pp.” In R. “The dynamics of dynamics: A model of musical expression. J. J. Princeton University. ed. Stanford. B.” Journal of New Music Research 31 (2):119-129.. Music.” In W. Petitot.” In B. ———. Pessoa. “Cognition in action: The interplay of attention and bimanual coordination dynamics. Cook. J. Cambridge. pp. Proceedings of the 2002 New Interfaces for Musical Expression Conference. 2003. 1999. ed. Port and T. embodiment. Dynamical systems approach to cognition: Concepts and empirical paradigms based on self-organization. Reinventing the violin. Todd. 2000. Varela. Thelen..” In E. “BoSSA: The deconstructed violin reconstructed. 2003. Knapp.D. “Multimodal interaction in music using the electromyogram and relative position sensing. Temprado.93-132. E. ed. embodiment and coordination dynamics. MA: MIT Press. P. Mind as motion: Explorations in the dynamics of cognition. S. Brazil. E. 1999. ———.43-48. 1992. MA: MIT Press. and coordination dynamics. Pachoud.” In W. CA: Stanford University Press. Thompson. Todes. 1999. 2002.Tanaka. Cambridge..” Music Perception 17 (1):115-126. Trueman. D. pp. Van Gelder. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. 2001. ———. Dynamical systems approach to cognition: Concepts and empirical paradigms based on self-organization.-M. “Motion in music: A neurobiological perspective. A. Ph. Trueman. 1995.. NJ. MA: MIT Press. Roy and F.” In Proceedings of the 2000 170 . M. 1994. C. “Alternative voices for electronic sound: Spherical speakers and sensor-speaker arrays (SenSAs). Noë. MA: MIT Press. Singapore: World Scientific Publishing Company.17-44.161-195. pp. J. 
A dynamic systems approach to the development of cognition and action. Bahn. A. and R. “Perceptual completion: A case study in phenomenology and cognitive science. Tschacher. Body and world Cambridge. N.

Turing. Series 2 42:230-265.248-251. Albany. “Emerging frameworks for tangible user interfaces. F. “The resonance of the cubicle: Laptop performance in post-digital musics. and R. J. pp. T.. Ishii. ed. M. CA: Stanford University Press. Amsterdam: Elsevier (North Holland).” In M. during and after breakdowns. Roy and F. 2000. with an application to the Entscheidungsproblem. Ogilvy.. 1984. Addison-Wesley.” Mind LIX (236):433-460. 1998. “Designing musical cyberinstruments with body and soul in mind.. Turkle. ———. ———. J. and H. 1999. ed. 2001. Stanford. “On the choice of mappings based on geometric properties.” In B. NY: State University of New York Press. A. and P. ———.International Computer Music Conference.” In J. ed. Varela.97-109. “Making it concrete: Before. D. 2003. New York: North Holland. ———. 1936. Pachoud. 1950. pp. 2004. T. Varela. New York: Simon and Schuster.” Contemporary Music Review 22 (4):81-92. Revisioning Philosophy. Vertegaal. pp. Depalle. Turner. Ungvary. San Francisco. 171 . T. Autopoiesis: A theory of living organization. “Describing the logic of the living: The adequacy and limitations of the idea of autopoiesis. 1979. 1992. pp.” Behavioral and Brain Sciences 21:615-628. Van Gelder. Human-computer interaction in the new millenium. Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science. B. pp.579601.266-314. Principles of biological autonomy. ed. M. Wanderley. Carroll. Ullmer. Petitot. “The specious present: A neurophenomenology of time consciousness. “Computing machinery and intelligence.” In J.-M.” Journal of New Music Research 19 (3):245-255. CA: International Computer Music Association. M. The second self: Computers and the human spirit.” Proceedings of the London Mathematical Society.87-91. “The dynamical hypothesis in cognitive science. Zeleny. S. 1980.36-48. pp.” In Proceedings of the 2004 New Interfaces for Musical Expression Conference. “On computable numbers. Van Nort.

1991.. 1958. and E. “Problems and prospects for intimate musical control of computers.” Scientific American 265 (3):94104. ed. MA: MIT Press. MA: MIT Press. and E. pp. F. M. 1988. 2005. Thompson. Ubiquitous computing #1 and #2. Rosch.” In F. Toward a practice of autonomous systems: Proceedings of the first European conference on artificial life. Weiser. Paris. Brown. 1992. F. Varela. 2003. Weinberg. Palo Alto. University Paris 6. 2001. Varela and P. and S. Weiser. and T. Performer-instrument interaction: applications to gestural control of sound synthesis. CA: Xerox PARC. “The connection between AI and biology in the study of behavior.. J. “The world is not a desktop. and M. Thompson. “Sound particles and microsonic materialism. M. Whitelaw. G.. Wright. “Radical embodiment: Neural dynamics and consciousness. 1997. 2001. New Haven. Cambridge. The embodied mind: Cognitive science and human experience. 2001. Verplank. B.. Smithers. “Why interaction is more important than algorithms.” Contemporary Music Review 22 (4):93-100. Von Neumann. 2001. W. Palo Alto. Wessel. The computer and the brain. “Interconnected musical networks: Toward a theoretical framework. Wanderley. Bourgine. Cambridge. 1994. “A course on controllers.. 172 .” Communications of the ACM 40 (5):80-91.” Copmuter Music Journal 29 (2):23-39.” Interactions 1 (1):7-8. ———.Varela. “The computer for the twenty-first century. Wegner. M. CT: Yale University Press. M. E.421-428. 1991. 1996.” In Proceedings of the 2001 New Interfaces for Musical Expression Conference. The coming age of calm technology.” In Proceedings of the 2001 New Interfaces for Musical Expression Conference. CA: Xerox PARC. Webb. P. ———.” Trends in Cognitive Sciences 5 (10):418-425. D.

and H. Or. U. 2004. Prague: ACM International Conference Proceeding Series. 1994. Flores. “Body and performance. Wild. “Making motion musical: Gesture mapping strategies for interactive computer music. 1986. Gombrecht and K.261-264. MA: MIT Press. System and structure: Essays in communication and exchange. Stanford. Winkler. Norwood. P. L. pp. Zumthor.. CA: International Computer Music Association. Pfeiffer. T. 173 . 1977. Winograd.Wiener. 1961. P.” In Proceedings of the 3rd International Workshop on Task Models and Diagrams for User Interface Design. control and communication in the animal and the machine. Understanding computers and cognition: A new foundation for design.. NJ: Ablex Publishing. ed. Johnson. London: Tavistock Publications. N.17-24. Johnson.218-226. pp. “Towards a composite modelling approach for multitasking. pp. and F.” In Proceedings of the 1994 International Computer Music Conference.” In H. San Francsico. A. P. Wilden. Cambridge. 1995. CA: Stanford University Press. T. Cybernetics.
