
Disentangling Notions of Embodiment

Tom Ziemke
Dept. of Computer Science
University of Skövde, Sweden
tom@ida.his.se

Abstract

Embodiment has become an important concept in many areas of cognitive science. There are, however, very different notions of exactly what embodiment is and what kind of body is required for what kind of embodied cognition. This paper identifies and contrasts four different, increasingly restrictive notions of embodiment which can roughly be characterized as (1) 'structural coupling' between agent and environment, (2) 'physical embodiment', (3) 'organismoid embodiment', i.e. organism-like bodily form (e.g., humanoid robots), and (4) 'organismic embodiment' of autopoietic, living systems.

Introduction

The concept of embodiment has since the mid-1980s been used extensively in the cognitive science and AI literature, in such terms as 'Embodied Mind' (e.g. Varela et al., 1991; Lakoff & Johnson, 1999), 'Embodied Intelligence' (e.g. Brooks, 1991), 'Embodied Action' (Varela et al., 1991), 'Embodied Cognition' (e.g. Clark, 1997), 'Embodied AI' (e.g. Franklin, 1997), and 'Embodied Cognitive Science' (Pfeifer & Scheier, 1999; Clark, 1999). Furthermore, there obviously are different types and notions of embodiment, as can be seen in the variety of terms such as 'situated embodiment' (Zlatev, 1997), 'mechanistic embodiment' (Sharkey & Ziemke, 2000, 2001), 'phenomenal embodiment' (Sharkey & Ziemke, 2000, 2001), 'naturalistic embodiment' (Zlatev, 2001), and 'social embodiment' (e.g. Duffy, 2001), plus, in this paper, 'physical embodiment', 'organismoid embodiment', and 'organismic embodiment'.

Embodiment is nowadays considered by many researchers a conditio sine qua non for any form of natural or artificial intelligence. Moreover, it is one of the key concepts that distinguishes recent work on situated/embodied/embedded/interactive cognition from the approach of classical cognitive science which, based on functionalism, focused on 'disembodied' computation (cf., e.g., Clark, 1997; Pfeifer & Scheier, 1999). However, while many researchers agree that cognition has to be embodied, it is so far less clear what exactly that means. Wilson (submitted) has recently pointed out that the diversity of claims in the field is not unproblematic:

    While this general approach [of embodied cognition or embodied cognitive science] is enjoying increasingly broad support, there is in fact a great deal of diversity in the claims involved and the degree of controversy they attract. If the term "embodied cognition" is to retain meaningful use, we need to disentangle and evaluate these diverse claims.

In particular, it is actually far from clear what kind of body is required for embodied cognition. Hence, while it might be agreed upon that humans are embodied cognizers, there is little agreement on what kind of body an artificial intelligence would have to be equipped with.

This paper aims to identify a number of different notions of embodiment in the cognitive science and AI literature. Due to space restrictions, none of these notions is here argued for or against in much detail, although admittedly the last and most restrictive notion, which we refer to as 'organismic embodiment', does receive somewhat more attention than the others, since it is derived from our own earlier work (Sharkey & Ziemke, 1998, 2000, in press; Ziemke, 1999, 2000, 2001; Ziemke & Sharkey, in press). The rest of this paper is structured as follows: The next section briefly discusses different views of embodied cognition, following the distinctions made by Wilson (submitted). The following section then identifies a number of different notions of what embodiment is and exactly what kind of body is required for embodied cognition. The final section presents a brief summary.

Different Views of Embodied Cognition

Wilson (submitted) recently distinguished between six different views of embodied cognition, of which, however, only one explicitly addresses the role of the body:

• "Cognition is situated": This claim is obviously widely held in the literature on embodied cognition [1]. Wilson herself distinguished between situated cognition, which takes place "in the context of task-relevant inputs and outputs", and "off-line cognition", which does not.

• "Cognition is time-pressured": That means, cognition is constrained by the requirements of real-time interaction with the environment, e.g. the 'representational bottleneck' (e.g. Brooks, 1991; Clark, 1997; Pfeifer & Scheier, 1999).

• "We off-load cognitive work onto the environment": Brooks (1991) formulated this claim by saying that "the world is its own best model". A well-known example is Kirsh & Maglio's (1994) study of 'epistemic actions' in the game of Tetris, i.e. decision-preparing movements carried out in the world rather than in the head.

• "The environment is part of the cognitive system": An example of this view could be Hutchins's (1995) work on distributed cognition, in which, for example, the instruments in a cockpit are considered parts of the cognitive system. However, as Wilson points out, "relatively few theorists appear to hold consistently to this position in its strong form".

• "Cognition is for action": This claim is made, for example, by Franklin (1995), who argued minds to be the control structures of autonomous agents.

• "Off-line cognition is body-based": According to Wilson, this claim has so far received the least attention in the cognitive science literature, although "it may in fact be the best documented and most powerful of the six claims". Perhaps the most prominent example is the work of Lakoff & Johnson (1980, 1999), who have argued that abstract concepts are based on metaphors grounded in bodily experience/activity. This claim is discussed in further detail in the following section.

[1] It might be worth pointing out, though, that the concept of situatedness itself is far from being well defined (cf., e.g., Ziemke, 2000, 2001).

Different Notions of Embodiment

As noted in the previous section, perhaps somewhat surprisingly, many discussions/notions of embodied cognition actually pay relatively little attention to the nature and the role of the body involved (if at all). Only Wilson's sixth view, of 'off-line cognition' as body-based, explicitly mentions the body as playing a central role. It does, however, leave open the question whether, for example, a humanoid robot, i.e. a robot with more or less roughly human-like form, could have the same type of cognition as its living counterpart.

We here would like to distinguish between the following four notions of what kind of body/embodiment is required for (embodied) cognition:

• 'structural coupling' between agent and environment,
• 'physical embodiment',
• 'organismoid embodiment', i.e. organism-like bodily form (e.g., humanoid robots), and
• 'organismic embodiment' of autopoietic, living systems.

Each of the above notions of embodiment is elaborated in a separate subsection in the following. It might be worth pointing out beforehand that at least some of them are actually groups of more or less closely related notions rather than single, well-defined positions.

Embodiment as 'Structural Coupling'

Probably the broadest notion of embodiment is that systems are embodied if they are 'structurally coupled' to their environment. Note that this does not necessarily require a body. Franklin (1997), for example, argues:

    Software systems with no body in the usual physical sense can be intelligent. But they must be embodied in the situated sense of being autonomous agents structurally coupled with their environment.

The concept of 'structural coupling' originates from Maturana and Varela's (1980, 1987) work on the biology of cognition, which will be discussed in further detail in the subsection on 'organismic embodiment'. Inspired by this concept, Quick & Dautenhahn (1999) [2] have attempted to provide a "precise definition" of embodiment:

    A system X is embodied in an environment E if perturbatory channels exist between the two. That means, X is embodied in E if for every time t at which both X and E exist, some subset of E's possible states with respect to X have the capacity to perturb X's state, and some subset of X's possible states with respect to E have the capacity to perturb E's state.

[2] See also Quick et al. (1999).

It could be argued that this definition, which Quick & Dautenhahn refer to as "minimal", is of limited use to cognitive science because it is not particularly restrictive. That means, it does not make a distinction between cognitive and non-cognitive systems, which can be illustrated with Quick & Dautenhahn's (1999) example of a granite outcrop (X) on the Antarctic tundra (E). The outcrop is persistently perturbed by the wind, and in turn perturbs the flow of the air currents. That means, it is an embodied system according to the above definition, although certainly not many cognitive scientists would actually consider this an example of embodied cognition.
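The quoted definition can be read as a simple logical condition. The following rendering is our own illustrative shorthand, not Quick & Dautenhahn's notation: Sigma_{E|X}(t) and Sigma_{X|E}(t) stand for E's and X's possible states with respect to each other at time t, and perturbs(S, Y, t) holds if the states in S have the capacity to perturb Y's state at t.

\[
\mathrm{Embodied}(X,E) \;\equiv\; \forall t\, \Big[ \big(\mathrm{exists}(X,t) \wedge \mathrm{exists}(E,t)\big) \Rightarrow \exists S_E \subseteq \Sigma_{E|X}(t)\;\; \exists S_X \subseteq \Sigma_{X|E}(t) :\; \mathrm{perturbs}(S_E,X,t) \wedge \mathrm{perturbs}(S_X,E,t) \Big]
\]

On this reading, the granite outcrop discussed above qualifies just as readily as any cognitive agent, which is precisely why the definition, taken on its own, does not separate cognitive from non-cognitive systems.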
Physical Embodiment

A more restrictive notion of embodiment, which does exclude the software agents Franklin (1997) considered as embodied (cf. previous subsection), is the view that embodied systems need to have a physical body. Again, this is not particularly restrictive, and it still includes Quick & Dautenhahn's (1999) granite outcrop discussed above.

A somewhat more restrictive version of the notion of physical embodiment [3] is the view that embodied systems should be connected to their environment not just through physical forces, but also through sensors and motors. From an AI perspective, Brooks (1990), for example, formulated the 'Physical Grounding Hypothesis':

    Nouvelle AI is based on the physical grounding hypothesis. This hypothesis states that to build a system that is intelligent it is necessary to have its representations grounded in the physical world. ... To build a system based on the physical grounding hypothesis it is necessary to connect it to the world via a set of sensors and actuators.

[3] This could be considered an independent notion, perhaps with the label 'sensorimotor embodiment'. However, since it seems rather difficult to define exactly what 'sensors' and 'motors' are and how they differ from other 'perturbatory channels', we abstain from doing so in this paper.

'Organismoid' Embodiment

Another, yet more restrictive notion of embodiment is that at least certain types of organism-like cognition might be limited to organism-like bodies, i.e. physical bodies which at least to some degree have the same or similar form and sensorimotor capacities as living bodies. It should be noted that the notion of 'organismoid' embodiment here is intended to cover both living organisms and their artificial counterparts.

One of the simplest examples of organism-like embodiment might be the Khepera robot used by Lund et al. (1998). It was equipped with an additional auditory circuit and two microphones which had the same distance from each other as the two 'ears' of the crickets whose phonotaxis it was supposed to model. In this case the placement of the sensors, in both cricket and robot, reduced the amount of internal processing required to respond to certain sound frequencies. Note that the bodies of the cricket and the wheeled robot are in fact very different, except for one crucial detail, the distance between the 'ears'.
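To make the role of sensor placement concrete, the following is a deliberately simplified sketch in Python, not Lund et al.'s actual auditory circuit: two 'ears' at a fixed separation sample the intensity of a sound source, and the raw intensity difference alone steers the robot toward it. All names, parameter values and the inverse-square intensity model are illustrative assumptions.

    # Hypothetical sketch: morphology (ear placement) does most of the work,
    # so the 'controller' reduces to a single comparison per time step.
    import math

    EAR_SEPARATION = 0.02   # metres, roughly cricket-like spacing (assumption)
    STEP = 0.05             # forward distance per update
    GAIN = 5.0              # steering gain

    def intensity(source, ear):
        """Inverse-square intensity of the sound source at an ear position."""
        d2 = (source[0] - ear[0]) ** 2 + (source[1] - ear[1]) ** 2
        return 1.0 / max(d2, 1e-6)

    def update(pose, source):
        """One control step: turn toward the louder ear, then move forward."""
        x, y, heading = pose
        # Ear positions, offset perpendicular to the current heading.
        ox, oy = -math.sin(heading), math.cos(heading)
        left = (x + 0.5 * EAR_SEPARATION * ox, y + 0.5 * EAR_SEPARATION * oy)
        right = (x - 0.5 * EAR_SEPARATION * ox, y - 0.5 * EAR_SEPARATION * oy)
        # Minimal 'processing': steer by the normalized intensity difference.
        il, ir = intensity(source, left), intensity(source, right)
        heading += GAIN * (il - ir) / (il + ir)
        return (x + STEP * math.cos(heading), y + STEP * math.sin(heading), heading)

    # Usage: the robot starts at the origin and gradually turns toward a
    # source at (1, 1), approaching it over repeated updates.
    pose, source = (0.0, 0.0, 0.0), (1.0, 1.0)
    for _ in range(100):
        pose = update(pose, source)

Even such a crude controller illustrates the general point made above: with the 'ears' placed appropriately, the internal processing required for phonotaxis can shrink to a single comparison.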
The most prominent, and perhaps the most complex, example of organismoid embodiment is humanoid robots such as the famous Cog (Brooks & Stein, 1993; Brooks et al., 1998), based on the argument that research in AI and cognitive robotics, in order to be able to address/investigate human-level cognition, has to deal with human-like artefacts [4].

[4] Hence, 'humanoid embodiment' could be considered a special case of 'organismoid embodiment', which might be of particular interest to cognitive science. It should be noted, however, that this leaves open the question what exactly the supposedly cognition-relevant bodily differences between humans and other primates are.

Dreyfus (1996), for example, pointed out that "there are many important ways in which neural nets differ from embodied brains". He argued that neural nets would need to be "put into [humanoid] robots" since the lack of body and environment

    ... puts disembodied neural-networks at a serious disadvantage when it comes to learning to cope in the human world. Nothing is more alien to our life-form than a network with no up/down, front/back orientation, no interior/exterior distinction, ... The odds against such a net being able to generalize as we do, ... are overwhelming.

This argument is closely related to Wilson's sixth view of embodied cognition (cf. previous section) and, for example, the aforementioned work of Lakoff & Johnson (1980, 1999) on the bodily/metaphorical basis of abstract concepts. If, for example, the concept of 'grasping an idea' is grounded in the bodily experience/activity of grasping physical objects, then a robot without any gripper arm/hand could hardly be expected to understand that concept. A similar argument has been presented by Keijzer (1998), who has questioned the suitability of wheeled robots for the study of the behavior/cognition of organisms with completely different means of locomotion.

Organismic Embodiment

The most restrictive notion of embodiment discussed in this paper holds that cognition is not only limited to bodies of organism-like form, but in fact to organisms, i.e. living bodies. This notion has its roots in the work of the theoretical biologist von Uexküll (1928, 1982) and its modern counterpart, the work of Maturana & Varela (1980, 1987) on the biology of cognition, which holds, roughly speaking, that cognition is what living systems do in interaction with their environment [5]. According to this view, there is a clear difference between living organisms, which are autonomous and autopoietic, and man-made machines, which are heteronomous and allopoietic (cf. Sharkey & Ziemke, 2000, in press; Ziemke & Sharkey, in press).

[5] See also Stewart (1996), who summarizes this view as "Cognition = Life".
Von Uexküll (1928), for example, argued that all action of organisms is a mapping between individual stimuli and effects, depending on a historically created basis of reaction (Reaktionsbasis), i.e. a context-dependent behavioral disposition. Machines, on the other hand, at least in von Uexküll's time (1864-1944), did not have such a historical basis of reaction, which, according to von Uexküll, can only be grown - and there is no growth in machines. Von Uexküll further elaborated that the rules machines follow are not capable of change, because machines are fixed structures. That means, the rules that guide their operation are not their 'own' but human rules, which have been built into the machine and can therefore also be changed only by humans, i.e. mechanisms are heteronomous. Machines, according to von Uexküll, therefore cannot repair or regenerate themselves when they get damaged. Living organisms, on the other hand, can, because they contain their functional rule (Funktionsregel) themselves, and they have the protoplasmic material which the functional rule can use to fix the damage autonomously. This can be summarized by saying that machines act according to plans (their human designers'), whereas living organisms are acting plans (von Uexküll, 1928).

This is also closely related to what von Uexküll (1982) called the "principal difference between the construction of a mechanism and a living organism":

    Every machine, a pocket watch for example, is always constructed centripetally. In other words, the individual parts of the watch, such as its hands, springs, wheels, and cogs, must always be produced first, so that they may be added to a common centerpiece.

    In contrast, the construction of an animal, for example, a triton, always starts centrifugally from a single cell, which first develops into a gastrula, and then into more and more new organ buds.

The distinction between centrifugal and centripetal construction is closely related to Maturana & Varela's (1980, 1987) distinction between autopoietic and allopoietic systems. A living system is an autopoietic machine whose function it is to create and maintain the unity that distinguishes it from the medium in which it exists. In the case of allopoietic machines, on the other hand, the components are produced by a concatenation of processes independent of the machine itself.

As discussed in detail elsewhere (Ziemke, 2000, 2001; Ziemke & Sharkey, in press), much progress has been made in the direction of self-organizing robots in recent AI and artificial life research. Unlike the machines of von Uexküll's time, today's adaptive robots can 'grow' in interaction with their environment through the use of artificial evolutionary and learning techniques. Furthermore, robot bodies can in some sense be evolved centrifugally over generations (e.g. Lipson & Pollack, 2000) [6]. The individual robot body, however, even in these cases certainly remains an allopoietic machine, which is constructed centripetally rather than growing centrifugally (cf. Sharkey & Ziemke, in press; Ziemke, 2000, 2001). Hence, using current technology, organismic embodiment is in fact limited to biological living systems.

[6] See also Cariani (1992) for a discussion of the epistemological implications of adaptive robot hardware.
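As an illustration of the kind of artificial evolutionary technique referred to above (and emphatically not Lipson & Pollack's actual method), the following minimal Python sketch evolves a population of controller parameter vectors against a stand-in fitness function; the population size, genome length, selection scheme and fitness function are all assumptions made for the example.

    # Hypothetical sketch of evolution over generations: parameters (and, in
    # principle, body plans) change across a lineage, yet each individual is
    # still assembled from its genome rather than grown.
    import random

    GENOME_LENGTH = 8       # e.g. weights of a simple reactive controller
    POPULATION_SIZE = 20
    GENERATIONS = 50
    MUTATION_STD = 0.1

    def fitness(genome):
        """Stand-in for evaluating a robot (or its body plan) in its environment."""
        return -sum((g - 0.5) ** 2 for g in genome)

    def mutate(genome):
        return [g + random.gauss(0.0, MUTATION_STD) for g in genome]

    population = [[random.random() for _ in range(GENOME_LENGTH)]
                  for _ in range(POPULATION_SIZE)]
    for _ in range(GENERATIONS):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:POPULATION_SIZE // 4]            # truncation selection
        population = parents + [mutate(random.choice(parents))
                                for _ in range(POPULATION_SIZE - len(parents))]

    best = max(population, key=fitness)

The point of the sketch is merely that controllers, and in some sense bodies, can change over generations; as argued above, each individual robot body is nevertheless constructed centripetally rather than grown centrifugally.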
Summary

This paper has discussed a number of diverse notions of embodiment. The motivation has been similar to that of Wilson (submitted), i.e. to disentangle the different claims and notions in the field. Unlike Wilson's paper, we have here focused on different notions of embodiment, i.e. the question of exactly what kind of body is considered to be capable of embodied cognition. The notions we have identified in the literature are the following:

• structural coupling between agent and environment,
• physical embodiment,
• 'organismoid' embodiment, i.e. organism-like bodily form, and
• organismic embodiment of autopoietic, living systems.

These notions can be considered increasingly restrictive, in the sense that organismic embodiment is a special case of organismoid embodiment, which in turn is a special case of physical embodiment, which in turn is a special case of structural coupling.

This short paper has some obvious limitations: None of these notions has here been argued for or against in much detail. Furthermore, the most 'restrictive' notion discussed here, that of organismic embodiment, does in fact apply to all living systems, which is not particularly restrictive at all. Humanoid and human embodiment could be considered more restrictive special cases of organismoid and organismic embodiment respectively. These might be considered to be of particular interest to cognitive science, but no arguments have been presented here as to why these more specific cases could or should allow for substantially different types of embodied cognition than other members of the more general categories. Nevertheless, we hope that the distinctions presented here will help to disentangle the large variety of claims, notions and theories that currently characterizes research on embodied cognition.

Acknowledgments

The author would like to thank Noel Sharkey, Henrik Svensson, and Stefan Berglund for a number of discussions that have contributed much to this paper. The author is supported by a grant (1507/97) from the Knowledge Foundation, Stockholm.
References

Brooks, R. A. (1990). Elephants don't play chess. Robotics and Autonomous Systems, 6(1-2), 1-16.

Brooks, R. A. (1991). Intelligence Without Reason. Proceedings of the Twelfth International Joint Conference on Artificial Intelligence (pp. 569-595). San Mateo, CA: Morgan Kaufmann.

Brooks, R. A. & Stein, L. A. (1993). Building Brains for Bodies (A.I. Memo No. 1439). Cambridge, MA: MIT, Artificial Intelligence Laboratory.

Brooks, R. A.; Breazeal, C.; Marjanovic, M.; Scasselati, B. & Williamson, M. (1998). The Cog Project: Building a Humanoid Robot. In C. Nehaniv (Ed.), Computation for metaphors, analogy, and agents, pages 52-87. New York: Springer.

Cariani, P. (1992). Some epistemological implications of devices which construct their own sensors and effectors. Toward a practice of autonomous systems - Proc. of the First European Conference on Artificial Life, pages 484-493. Cambridge, MA: MIT Press.

Clark, A. (1997). Being There - Putting Brain, Body and World Together Again. Cambridge, MA: MIT Press.

Clark, A. (1999). An embodied cognitive science? Trends in Cognitive Science, 9, 345-351.

Dreyfus, H. L. (1996). The Current Relevance of Merleau-Ponty's Phenomenology of Embodiment. The Electronic Journal of Analytic Philosophy, 4.

Duffy, B. R. (2001). The Social Robot. Doctoral dissertation, Dept. of Computer Science, University College Dublin, Ireland.

Franklin, S. A. (1995). Artificial Minds. Cambridge, MA: MIT Press.

Franklin, S. A. (1997). Autonomous agents as embodied AI. Cybernetics and Systems, 28, 499-520.

Hutchins, E. (1995). Cognition in the Wild. Cambridge, MA: MIT Press.

Keijzer, F. (1998). Armchair consideration about wheeled behavior. From Animals to Animats 5 - Proc. of the Fifth Intl. Conference on Simulation of Adaptive Behavior (pp. 13-21). Cambridge, MA: MIT Press.

Kirsh, D. & Maglio, P. (1994). On distinguishing epistemic from pragmatic action. Cognitive Science, 18, 513-549.

Lakoff, G. & Johnson, M. (1980). Metaphors we live by. Chicago: University of Chicago Press.

Lakoff, G. & Johnson, M. (1999). Philosophy in the flesh: The embodied mind and its challenge to western thought. New York: Basic Books.

Lund, H. H.; Webb, B. & Hallam, J. (1998). Physical and temporal scaling considerations in a robot model of cricket calling song preference. Artificial Life, 4, 95-107.

Maturana, H. R. & Varela, F. J. (1980). Autopoiesis and Cognition - The Realization of the Living. Dordrecht, The Netherlands: D. Reidel Publishing.

Maturana, H. R. & Varela, F. J. (1987). The Tree of Knowledge - The Biological Roots of Human Understanding. Boston, MA: Shambhala.

Pfeifer, R. & Scheier, C. (1999). Understanding Intelligence. Cambridge, MA: MIT Press.

Quick, T. & Dautenhahn, K. (1999). Making embodiment measurable. Proceedings of '4. Fachtagung der Gesellschaft für Kognitionswissenschaft'. Bielefeld, Germany.

Quick, T.; Dautenhahn, K.; Nehaniv, C. & Roberts, G. (1999). On Bots and Bacteria: Ontology Independent Embodiment. Proc. of the Fifth European Conf. on Artificial Life. Heidelberg: Springer.

Sharkey, N. E. & Ziemke, T. (1998). A consideration of the biological and psychological foundations of autonomous robotics. Connection Science, 10(3-4), 361-391.

Sharkey, N. E. & Ziemke, T. (2000). Life, Mind and Robots - The Ins and Outs of Embodied Cognition. In S. Wermter & R. Sun (Eds.), Hybrid Neural Systems. Heidelberg, Germany: Springer Verlag.

Sharkey, N. E. & Ziemke, T. (in press). Mechanistic vs. Phenomenal Embodiment [title might change]. Cognitive Systems Research, to appear in 2001.

Stewart, J. (1996). Cognition = Life: Implications for higher-level cognition. Behavioral Processes, 35, 311-326.

Varela, F. J.; Thompson, E. & Rosch, E. (1991). The Embodied Mind: Cognitive Science and Human Experience. Cambridge, MA: MIT Press.

von Uexküll, J. (1928). Theoretische Biologie. Berlin: Springer Verlag.

von Uexküll, J. (1982). The Theory of Meaning. Semiotica, 42(1), 25-82.

Wilson, M. (submitted). Six views of embodied cognition. Manuscript submitted for publication. University of California, Santa Cruz.

Ziemke, T. (1999). Rethinking Grounding. In A. Riegler, M. Peschl & A. von Stein (Eds.), Understanding Representation in the Cognitive Sciences, pages 177-190. New York: Plenum Press.

Ziemke, T. (2000). Situated Neuro-Robotics and Interactive Cognition. Doctoral dissertation, Dept. of Computer Science, University of Sheffield, UK.

Ziemke, T. (2001). The Construction of 'Reality' in the Robot: Constructivist Perspectives on Situated Artificial Intelligence and Adaptive Robotics. Foundations of Science, 6(1), 163-233.

Ziemke, T. & Sharkey, N. E. (in press). A stroll through the worlds of robots and animals: Applying Jakob von Uexküll's theory of meaning to adaptive robots and artificial life. Semiotica, to appear during 2001.

Zlatev, J. (1997). Situated Embodiment. Studies in the Emergence of Spatial Meaning. Doctoral dissertation, Dept. of Linguistics, Stockholm University, Sweden.

Zlatev, J. (2001). The epigenesis of meaning in human beings, and possibly in robots. Minds and Machines, 11, 155-195.
