EDEC-2001
Workshop at the International Conference on Cognitive Science ICCS2001
The objective of this workshop is to bring together researchers from cognitive science,
psychology, robotics, artificial intelligence, philosophy and related fields to discuss the
role of developmental and embodied views of cognition, and in particular, their mutual
relationship. The ultimate goal of this approach is to understand the emergence of high-
level cognition in organisms based on their interactions with the environment over
extended periods of time.
Program
13:30 – 13:50 Intro Rolf Pfeifer
13:50 – 14:25 Talk 1 Kazuo Hiraki
14:25 – 15:00 Talk 2 Max Lungarella
15:00 – 15:20 Break
15:20 – 15:55 Talk 4 Giorgio Metta
15:55 – 16:30 Talk 5 Stefan Schaal
16:30 – 17:30 Discussion and poster session with drinks to ensure a relaxed
atmosphere
Talks
Kazuo Hiraki (Department of Systems Science, University of Tokyo, Japan): Causality
and Prediction: Detection of Delayed Intermodal Contingency in Infancy
Andres Perez-Uribe and Michele Courant: A robotics framework for the study of
signal coevolution
Catholijn M. Jonker, Jacky L. Snoep, Jan Treur, Hans V. Westerhoff, and Wouter
C.A. Wijngaards: Embodied intentional dynamics of bacterial behavior
Nick Barnes: Three embodied vision-guided docking methods for mobile robots
Program Committee
Rolf Pfeifer (chair, AI Lab, University of Zurich, Switzerland)
Max Lungarella (AI Lab, University of Zurich, Switzerland)
Yasuo Kuniyoshi (Department of Mechano-Informatics, University of Tokyo, Japan)
Olaf Sporns (Department of Psychology, Indiana University, Bloomington, IN, USA)
Giorgio Metta (Humanoid Robotics Group, MIT AILab, Cambridge, MA, USA)
Giulio Sandini (LIRA-Lab, University of Genova, Italy)
Rafael Nunez (University of California, Berkeley, USA)
Causality and Prediction: Detection of Delayed Intermodal Contingency in Infancy
Abstract
The paper presents our recent project to study infants' detection of proprioceptive-visual contingency. As a key idea for the study, we introduced a temporal delay between actions and consequent stimulation. Six-month-old infants were presented with either an on-line (no-delay) view or a delayed view of their own legs on a large screen. The preliminary result using the preferential looking method suggests that six-month-old infants can discriminate between no-delay and delayed views of their leg motion even when the delay time is quite short (2 sec).

Development of contingency detection

How do infants distinguish themselves from the external world? Detection of intermodal contingency plays an important role in distinguishing the sensory consequences of self-produced actions from externally produced sensory stimuli. Contingency detection can be seen as a fundamental ability for the early development of self. It can also be a basis for much higher-level cognitive processes such as prediction or the perception of causality.

Recent evidence in developmental cognitive psychology has revealed that even young infants have a so-called ecological self. For example, Bahrick and Watson (1985) used the preferential looking method to study self-exploration of a part of the infant's body (i.e., the legs). They demonstrated that 5-month-olds are sensitive to proprioceptive-visual contingency. In their experiment, the infant was placed in front of two TV monitors, side by side. On one monitor, the infants saw a contingent view of their own legs. The other monitor showed either a noncontingent, prerecorded tape of the baby's own legs or a tape of another baby's leg movements. Bahrick and Watson (1985) showed that 5-month-olds look preferentially at the noncontingent view.

Rochat and Morgan (1995) conducted similar experiments but systematically manipulated spatial relationships; the temporal contingency between the two images and the infant's actual movements was kept constant. In their studies, 3- and 4-5-month-old infants looked longer at different perspectives of their legs (left-right inversion) and generated more leg activity while looking at those views.

The results of these studies can be interpreted as evidence of early detection of intermodal invariants. However, a question still remains as to infants' temporal sensitivity, which is crucial for defining what the contingency is. Because there exist a lot of delayed or lagged stimuli that are consequences of actions, we have to take the delayed contingency into account for a further understanding of the infant's cognition. We can recognize the response of a computer as a consequence of our action (i.e., clicking a mouse button) even if very slow and lagged responses are employed. Bahrick and Watson (1985) used a prerecorded image, but it had almost no relationship to the infant's action. In order to study the temporal sensitivity we have to use a much shorter delay and manipulate it systematically.

Delayed contingency: preliminary result

As a key idea to study infants' detection of proprioceptive-visual contingency, we introduced a temporal delay between actions and consequent visual stimulation to extend the previous studies. Six-month-old infants were presented with two views on a large screen. One was the contingent on-line (no-delay) view as used in the previous studies. The other view was also contingent but delayed by two seconds from the on-line view. A special device was used to generate the delay in the image from a CCD camera capturing the infant's leg motion.

Figure 1 summarizes the looking time data for every one-minute period during the three-minute session and for the total time. As the figure indicates, six-month-old infants looked longer at the delayed view for each period. The result suggests that they can discriminate between the no-delay and the delayed views of their leg motion, even if the delay time is quite short (2 sec).

[Figure 1. Looking time for the on-line and delayed views (conditions: on-line vs. delay; periods: 1 min., 3 min.; vertical axis: looking time, 0-45).]

It is worthwhile to note that the delayed view shown in our experiment was far from the noncontingent view shown in Bahrick and Watson (1985). However, it is difficult to conclude that the infants in our study distinguished the delayed contingency from noncontingency, because the infants in Bahrick and Watson (1985) also exhibited a preference for the noncontingent views. In order to investigate whether or not infants can discriminate the delayed view from the noncontingent view, we are currently conducting additional experiments using either a pair of delayed and noncontingent views or a pair of on-line and longer-delayed views. Through these studies, we may obtain more convincing implications.

However, questions remain as to when the discrepancy between actual and predicted sensory stimuli is compared inside the brain, and how the internal model develops. For these questions, we believe that the developmental study of infants' temporal sensitivity to contingency provides key ideas.

References

Bahrick, L. E. and Watson, J. S. (1985): Detection of intermodal proprioceptive-visual contingency as a potential basis of self-perception in infancy. Developmental Psychology, 21(6), 963-973.

Blakemore, S.-J., Frith, C. D. and Wolpert, D. M. (1999): Spatio-temporal prediction modulates the perception of self-produced stimuli. Journal of Cognitive Neuroscience, 11(5), 551-559.

Rochat, P. and Morgan, R. (1995): Spatial determinants in the perception of self-produced leg movements by 3- to 5-month-old infants. Developmental Psychology, 31(4), 626-636.

Watson, J. S. (1993): Detection of self: The perfect algorithm. In S. T. Parker, R. W. Mitchell, and M. L. Boccia (Eds.), Self-awareness in Animals and Humans: Developmental Perspectives, 131-148.
CONCEPTUAL INTEGRATION
Gilles Fauconnier
Department of Cognitive Science, UCSD
Cognitive science research in the last twenty-five years has provided considerable evidence that reason is embodied. The neural architectures that evolved to produce perception, sensation, and bodily movement are at the heart of what we experience as rational inference, conceptualization, and meaning construction. Metaphor theory and mental space theory played a significant role in showing the inadequacies of the abstract, algorithmic, and disembodied views that dominated structuralism, generative linguistics, and logic-based truth-conditional approaches to semantics. Conceptual integration theory, often called "blending" or CI, is a further development of this line of research. It confirms in novel ways that similar general properties of neural binding and simulation lie behind sensorimotor activities, concrete interaction with the world, human-scale everyday experience, abstract reasoning, and scientific or artistic invention.

CI is based on extensive empirical observation in multiple areas of meaning construction. Detailed proposals have been made regarding its constitutive and governing principles, and on the remarkable compressions and emergent dynamics they allow. A substantial body of research now exists on CI in mathematics [9, 13], social science [14, 16], literature [5, 15], linguistics [10, 11], and music [18]. There have been proposals for the mathematical and computational modeling of the operation [6, 17], and experimental research has been carried out on the corresponding neural and cognitive processes [1, 7]. Additional information and an extensive bibliography can be found on the website <blending.stanford.edu>.

The findings have raised fundamental research questions of interest to all of us, and in particular to the participants of the present workshop. What follows is a general overview of the issues, results, and research program. It deliberately avoids going into technical detail. Each of the mentioned phenomena has received extensive analysis elsewhere.

CI is a basic mental capacity that leads to new meaning, global insight, and conceptual compressions useful for memory and manipulation of otherwise diffuse ranges of meaning. It plays a fundamental role in the construction of meaning in everyday life, in the arts and sciences, in technological development, and in religious thinking. The essence of the operation is to construct a partial match between input mental spaces and to project selectively from those inputs into a novel 'blended' mental space, which then dynamically develops emergent structure. It has been suggested that the capacity for complex
conceptual blending ("double-scope" integration) is the crucial capacity needed for thought and language.

Most of our thinking, even in the simplest circumstances, is unbelievably complex but mercifully completely unconscious. Our conscious experience is one of direct simplicity, in the same way that our conscious perception of objects ("a blue cup") is one of direct simplicity that bypasses (in consciousness) what the neural circuits of the brain are painstakingly integrating behind the scenes. CI is no exception to the laws of backstage cognition: it too operates largely behind the scenes, choreographing vast networks of mental spaces beyond the reach of our conscious awareness, yielding cognitive products at the conscious level which appear straightforward and unproblematic.

Here are some well-known, highly visible examples of conceptual blends, repeated here for illustrative purposes:

Boat race

A famous example of blending is "the boat race" or "regatta". A modern catamaran is sailing from San Francisco to Boston in 1993, trying to go faster than a clipper that sailed the same course in 1853. A sailing magazine reports:

As we went to press, Rich Wilson and Bill Biewenga were barely maintaining a 4.5 day lead over the ghost of the clipper Northern Light, whose record run from San Francisco to Boston they're trying to beat. In 1853, the clipper made the passage in 76 days, 8 hours. —"Great America II," Latitude 38, volume 190, April 1993, page 100.

There are two distinct events in this story, the run by the clipper in 1853 and the run by the catamaran in 1993 on (approximately) the same course. In the magazine quote, the two runs are merged into a single event, a race between the catamaran and the clipper's "ghost". The two distinct events correspond to two input mental spaces, which reflect salient aspects of each event: the voyage, the departure and arrival points, the period and time of travel, the boat, its positions at various times. The two events share a more schematic frame of sailing from San Francisco to Boston; this is a "generic" space, which connects them. Blending consists in partially matching the two inputs and projecting selectively from these two input spaces into a fourth mental space, the blended space:
[Diagram: the two input spaces, linked by a cross-space mapping; a generic space connected to both; and the blended space, built by selective projection from the inputs.]
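For readers who think in data structures, the four-space network just diagrammed can be caricatured in a few lines. This is purely illustrative; the element names and the dictionary encoding are inventions for this sketch, not part of the theory:

```python
# Two input mental spaces for the boat-race blend (illustrative values).
input1 = {"boat": "clipper Northern Light", "year": 1853,
          "route": ("San Francisco", "Boston")}
input2 = {"boat": "catamaran Great America II", "year": 1993,
          "route": ("San Francisco", "Boston")}

# Generic space: the schematic structure shared by both inputs.
generic = {k for k in input1 if input1[k] == input2[k]}

# Cross-space mapping: counterpart elements paired across the inputs.
mapping = {k: (input1[k], input2[k]) for k in input1}

# Selective projection into the blend: both boats, one course, and the
# emergent "race" frame that exists in neither input.
blend = {"route": input1["route"],
         "boats": [input1["boat"], input2["boat"]],
         "frame": "race"}

print(generic)  # → {'route'}
```

The emergent structure shows up in the last line: the "race" frame belongs to the blend alone, exactly as the text describes.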
In the blended space, we have two boats on the same course that left the starting point, San Francisco, on the same day. Pattern completion allows us to construe this situation as a race (by importing the familiar background frame of racing and the emotions that go with it). This construal is emergent in the blend. The motion of the boats is structurally constrained by the mappings. Language signals the blend explicitly in this case by using the expression "ghost-ship." By "running the blend" imaginatively and dynamically — by unfolding the race through time — we have the relative positions of the boats and their dynamics.

Here is another attested example where embodied conceptual blending leads to emergent bodily movement. A ski instructor was trying to teach a novice to hold his arms correctly and look down the slope (rather than in the direction of the skis). The instructor told the novice to imagine that he was a waiter in Paris carrying a tray with champagne and croissants. By focusing his attention on the tray, and trying to avoid spilling the champagne, the novice was able to produce something approaching the right integrated motion on the slope. The Inputs in this case are the ski situation and the restaurant situation, with arm and body positions mapped onto each other. The Generic space has only human posture and motion, with no particular context. But in the Blend, the skier is also actually carrying the champagne tray. The blended space is in a way a fantasy, but it allows a real motion to emerge. Rather remarkably, this pedagogical feat requires no actual champagne and croissants. Just thinking of them is enough.

Conceptual blending with emergent structure shows up in all areas of human behavior; cultures develop successive blends and the ones that become
REFERENCES
[1] Coulson, Seana. 2001. Semantic Leaps: Frame-shifting and Conceptual Blending in Meaning
Construction. New York and Cambridge: Cambridge University Press.
[2] Fauconnier, Gilles. 1997. Mappings in Thought and Language. New York and Cambridge:
Cambridge University Press.
[3] Fauconnier, Gilles and Turner, Mark. 1998. "Conceptual Integration Networks." Cognitive Science.
Volume 22, number 2 (April-June 1998), pages 133-187.
[4] Fauconnier, Gilles and Turner, Mark. in press. The Way We Think. New York: Basic Books.
[5] Freeman, Margaret. 1997. "Grounded spaces: Deictic-self anaphors in the poetry of Emily Dickinson,"
Language and Literature, 6:1, 7-28.
[6] Goguen, Joseph. 1999. "An Introduction to Algebraic Semiotics, with Application to User Interface
Design." Computation for Metaphor, Analogy, and Agents. Edited by Chrystopher Nehaniv. Berlin:
Springer-Verlag, pages 242-291. A volume in the series Lecture Notes in Artificial Intelligence.
[7] Grush, Rick and Nili Mandelblit. 1997. "Blending in language, conceptual structure, and the cerebral
cortex." The Roman Jakobson Centennial Symposium: International Journal of Linguistics Acta Linguistica
Hafniensia Volume 29:221-237. Per Aage Brandt, Frans Gregersen, Frederik Stjernfelt, and Martin Skov,
editors. C.A. Reitzel: Copenhagen.
[8] Hutchins, E. in preparation. "Material Anchors for Conceptual Blends"
[9] Lakoff, G. and Nuñez, R. 2000. Where Mathematics Comes From. New York: Basic Books.
[10] Liddell, Scott K. 1998. "Grounded blends, gestures, and conceptual shifts." Cognitive Linguistics, 9.
[11] Mandelblit, Nili. 1997. Grammatical Blending: Creative and Schematic Aspects in Sentence
Processing and Translation. Ph.D. dissertation, UC San Diego.
[12] Oakley, Todd. 1998. "Conceptual blending, narrative discourse, and rhetoric." Cognitive Linguistics,
9: 321-360.
[13] Robert, Adrian. 1998. "Blending in the interpretation of mathematical proofs." Discourse and
Cognition. Edited by Jean-Pierre Koenig. Stanford: Center for the Study of Language and Information
(CSLI).
[14] Sweetser, Eve. 2000. "Blended Spaces and Performativity". Cognitive Linguistics. Vol. 11, 3/4.
[15] Turner, Mark. 1996. The Literary Mind. New York: Oxford University Press.
[16] Turner, Mark. 2001. Cognitive Dimensions of Social Science. New York: Oxford University Press.
[17] Veale, Tony. 1999. "Pragmatic Forces in Metaphor Use: The Mechanics of Blend Recruitment in
Visual Metaphors." Computation for Metaphor, Analogy, and Agents. Edited by Chrystopher Nehaniv.
Berlin: Springer-Verlag, pages 37-51. A volume in the series Lecture Notes in Artificial Intelligence.
[18] Zbikowski, Lawrence. in press. The Conceptual Structure of Music. New York: Oxford University
Press.
[19] Websites for Mental Spaces and Conceptual Blending:
http://www.mentalspace.net http://www.blending.stanford.edu
Development in artificial systems
Giorgio Metta (pasa@ai.mit.edu)
AI-Lab – Massachusetts Institute of Technology, Cambridge MA, USA
Figure 1: Three shots of the experimental setup. Left: complete view of the robot; middle: close-up view of the head (note the cameras and the microphones with earlobes); right: rear view of the robot, showing in particular the gyros.
[Diagram. Block labels: sound localization; simple vision; saccades (linear controller, visuo-acoustic); simple VOR; head VOR; ATNR; reaching; time axis.]

Figure 2: The developmental progression of the Babybot. A representation of how the various phases of the robot's development are interrelated. The light gray blocks are the initial configuration. See text for a description of the learning blocks.
Robots As Cognitive Tools
Information theoretic analysis of sensory-motor data
$J_0[n+1] = J_0[n] + f_0[n] \cdot \sum_{i=0}^{M} w_{0i}\,RH_i[n]$

$J_1[n+1] = J_1[n] + f_1[n] \cdot \sum_{j=0}^{N} w_{1j}\,RV_j[n]$

$J_2[n+1] = J_2[n] + f_2[n] \cdot (w_{0,2} + w_{1,2})$

$J_4[n+1] = J_4[n] + 90 - (J_1[n] + J_2[n])$

where $f_0[n] = c_0$ (which is a constant) if $J_0[n] < J_{0,min}$, $f_0[n] = -c_0$ if $J_0[n] > J_{0,max}$, and $f_0[n] = f_0[n-1]$ otherwise; identical equations hold for $f_1[n]$, $f_2[n]$, $J_1[n]$, and $J_2[n]$, respectively. $w_{0i}$ and $w_{1j}$ represent the weights that connect the output of the horizontal and vertical retinas to the motoneurons.

Figure 2: The camera-image is rectangular. The two one-dimensional retinas (horizontal and vertical) are shown.

Analysis methods

As mentioned above, we are interested in the information theoretic implications of embodiment. More specifically, we want to investigate the use of information theoretic measures (Shannon, 1948; Cover and Thomas, 1991; Papoulis, 1991) like the Shannon entropy $H$, the Shannon or mutual information $MI$, and its normalized companion $\overline{MI}$, applied to the sensory channels of a situated autonomous agent. Many of the results of information theory are in terms of these fundamental quantities. The Shannon entropy $H(X)$ accounts for the potential diversity or variability that is displayed by the random variable $X$ (see appendix for a definition), where $X$ represents the signal from the sensory channel. This is in principle equivalent to our intuitive notion of information – the more unpredictable a signal or an event, the more information its occurrence contributes. $MI(X;Y)$ is a measure that takes into account linear as well as nonlinear dependencies between two time series (observations of a stochastic process). This is in contrast to the better known correlation function, which measures just linear dependencies.

The application of information theory is not devoid of problems. One of the greatest problems is the huge data requirement. To avoid it, (strong) assumptions about the signals involved and/or the noise have to be made. These assumptions are often unfounded and difficult to test. Nevertheless, assuming true independence of two random processes, or normality of the signals, can significantly cut down the number of measurements required for the analysis. In our case no such assumptions about the sensory and motor data are necessary, because a sufficiently large number of data points has been collected. The number of samples per experiment is around 3000, sufficient for the computation of measures like entropy and mutual information.

A straightforward method of computing entropy and mutual information is to estimate the first and second order probability density functions, $p(x)$ and $p(x,y)$, which is done by normalization of the 1st and 2nd order histograms of the time-series. The entropy (or self-information) is given by

$H(X) = -\sum_{i=1}^{N} p(x_i)\,\log p(x_i)$

for a discrete variable $x_i$, which can be in $N$ possible states $\{x_1, x_2, \ldots, x_N\}$. The mutual information is defined as $MI(X;Y) = H(X) + H(Y) - H(X;Y)$, where

$H(X;Y) = -\sum_{i=1}^{N}\sum_{j=1}^{M} p(x_i, y_j)\,\log p(x_i, y_j)$

is the joint entropy – the discrete random variables $X$ and $Y$ have $N$ or $M$ different states, respectively. Furthermore, we define the redundancy as the "capacity-normalized" mutual information

$\overline{MI}(X;Y) = \frac{MI(X;Y)}{C}$, with $C = \max_{p(x)} MI(X;Y)$,

where the maximum is taken over all possible densities $p(x)$. In our case $C = \max_{p(x)} H(X) = 8\,\mathrm{bit}$ for all sensory channels, i.e., the sensory signals assume discrete values between 0 and $255 = 2^8 - 1$. Usually the channel capacity $C$ is measured in bits/sec, and the rate of transmitted information equals the entropy rate $H(X)$ bits/sec. We normalize everything by $f_s$, which is the rate at which the sensors are read out. These measures are applied to the four previously introduced sensory sub-modalities: red (R), green (G), blue (B), and intensity ($I = (R+G+B)/3$).

Results

We performed four different kinds of analyses. The Shannon entropy (displayed on the vertical axis) of the red, green, and blue sensory channels of the horizontal linear-resolution retina in the case of random movement can be seen on the left side of figure 4. The retina is composed of 18 color-tuned receptors (R-, G-, and B-channels). The graph on the right shows the case of sensory-motor coordinated behavior (foveation on red-colored objects). In the central part of the retina (from receptor 8 to receptor 12) the information inflow through the R-channels is increased, whereas that in the peripheral part is decreased, compared to the not sensory-motor coordinated case. The behavior of the G- and B-channels is reversed, i.e., the flow through the central channels is decreased. The same holds for the vertical linear-resolution retina, where the effect is less pronounced but still visible (not shown here). The graphs are averages over six experimental runs.

The intrasensory information overlap between two R-channels is shown in figure 5. Basically, we set $X = R_i$ and $Y = R_j$, where $R_i$ and $R_j$ are the $i$-th and $j$-th R-channel respectively, and compute $\overline{MI}(X;Y)$. For perfectly dependent channels $\overline{MI}(X;Y) = 1$, while for completely independent ones $\overline{MI}(X;Y) = 0$. For not sensory-motor coordinated interaction, in our case the result of a random actuation of the joints, the information overlap is minimal; there is no redundancy. Remember that $\overline{MI}(X;X) = H(X)/C$, i.e., the redundancy is equivalent to the normalized entropy or self-information, which explains the diagonal ridge. The case of sensory-motor coordinated interaction is shown on the right side of figure 5. The information overlap is evident from the bumps in the center of the 3D plot. In other words, the amount of information shared by the R-channels in the foveal part of the 1D-retina is larger than that shared by the R-channels in the peripheral part. This redundancy is a consequence of the interaction itself.

Figure 6 shows the information overlap between two sub-modalities of the same sensor modality: color (R) and intensity (I). The color channel and the intensity channel share the same spatial location, but are tuned to different stimulus dimensions, color and intensity, respectively. Again, as for the Shannon entropy, compared to the not sensory-motor coordinated case, the result of the sensory-motor coordination is an increase of the mutual information in the central part of the retina, and a decrease in the peripheral part.
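The histogram-based estimators used throughout these analyses can be sketched in a few lines of Python. This is a sketch, not the authors' code: synthetic 8-bit signals stand in for the retina channels, and the capacity is fixed at C = 8 bit as in the text:

```python
import numpy as np

C = 8.0  # channel capacity in bits: 8-bit sensors, values 0..255 = 2**8 - 1

def entropy(x, bins=256, vrange=(0, 256)):
    """Shannon entropy H(X) in bits, from a normalized 1st-order histogram."""
    p, _ = np.histogram(x, bins=bins, range=vrange)
    p = p / p.sum()
    p = p[p > 0]                       # 0 * log 0 is taken to be 0
    return -np.sum(p * np.log2(p))

def joint_entropy(x, y, bins=256, vrange=(0, 256)):
    """Joint entropy H(X;Y), from a normalized 2nd-order histogram."""
    p, _, _ = np.histogram2d(x, y, bins=bins, range=[vrange, vrange])
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """MI(X;Y) = H(X) + H(Y) - H(X;Y)."""
    return entropy(x) + entropy(y) - joint_entropy(x, y)

def redundancy(x, y):
    """Capacity-normalized mutual information MI(X;Y)/C, in [0, 1]."""
    return mutual_information(x, y) / C

# A synthetic "R-channel" readout with ~3000 samples, as in the experiments.
gen = np.random.default_rng(0)
x = gen.integers(0, 256, size=3000)
print(f"H(x) = {entropy(x):.2f} bit, redundancy(x, x) = {redundancy(x, x):.3f}")
```

Note that feeding a channel against itself reproduces the diagonal-ridge identity from the text, $\overline{MI}(X;X) = H(X)/C$. (With only ~3000 samples over a 256x256 joint histogram, the raw MI estimate between genuinely independent channels is biased upward; the paper's qualitative comparisons are unaffected, but the bias is worth keeping in mind when reusing this sketch.)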
The amount of variability (information) must not be confused with the cumulated amount of activation (total stimulation) of a particular sensor. Figure 7 illustrates this point. The not sensory-motor coordinated case can be seen on the left side, the sensory-motor coordinated case on the right side. Since the robot's task is to foveate on red objects, the peak in the cumulated stimulation of the R-receptor is obvious: the agent spends a lot of time foveating on red objects. Less obvious is the fact that the maximum of the entropy of the B-receptor for the same dataset is slightly larger than the maximum for the R-receptor (see figure 4). More striking is the difference for the not sensory-motor coordinated case, where the relationship is even reversed! The entropy of the G- and B-channels has an average of 5.7 bit, that of the R-channel 4.7 bit. The cumulated stimulation is higher for the R-channel, though.

Figure 6: Mutual information between R-, G-, B-channels and the I-channel (intensity) for different sensory channels. Random movement on the left. Foveation on red objects on the right.

Figure 7: Cumulated stimulation of the R-, G-, B-receptors. The sensory-motor coordinated case is on the right.
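The distinction between cumulated stimulation and entropy is easy to reproduce on synthetic signals (illustrative values only, not the robot's data): a channel that is mostly saturated, as when a red object is kept foveated, accumulates a large total activation while varying little.

```python
import numpy as np

def entropy_bits(x, bins=256):
    """Histogram-based Shannon entropy in bits (values assumed in 0..255)."""
    p, _ = np.histogram(x, bins=bins, range=(0, 256))
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

gen = np.random.default_rng(1)
# "R-channel": saturated 90% of the time (object kept foveated), so the
# cumulated stimulation is high but the variability is low.
r = np.where(gen.random(3000) < 0.9, 250, gen.integers(0, 256, size=3000))
# "B-channel": moderate activation, but spread over many states.
b = gen.integers(0, 256, size=3000)

print(f"cumulated: R = {r.sum()}, B = {b.sum()}")
print(f"entropy:   R = {entropy_bits(r):.2f} bit, B = {entropy_bits(b):.2f} bit")
```

The saturated channel wins on total stimulation and loses on entropy, which is exactly the reversal reported for the R- and B-receptors.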
Figure 4: The Shannon entropy for different sensory channels, measured in bits. No sensory-motor coordination (random) on the left, sensory-motor coordination (foveation on red objects) on the right.

Discussion

The following few points are of interest. First of all, the amount of information flowing through a particular sensory channel is increased/decreased through an appropriate sensory-motor coordinated interaction. An upper bound is given by the channel capacity, i.e., H = C. Statistical information is maximized when all possible sensory states of the discretized signal have equal probability of occurrence, which means that the probability density function (pdf) is uniform. A lower bound is given by H = 0, no variability at all (constant sensor stimulation), i.e., a pdf with a single peak. Exploration strategies that lead to a balance between predictability (low entropy) and variability (high entropy) of the sensory signal need further investigation. In communication theory the goal is not quite the same: given a certain amount of noise, the information transfer through the communication channel has to be maximized; the more information can be pushed through the channel, the better.
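The two bounds can be checked directly on the pdf itself (a sketch assuming the 8-bit channel used throughout, so C = 8 bit):

```python
import numpy as np

def entropy_of_pdf(p):
    """Shannon entropy in bits of a discrete pdf p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) + 0.0   # +0.0 normalizes -0.0 to 0.0

uniform = np.full(256, 1 / 256)        # all 256 states equally likely
peaked = np.zeros(256)                 # constant sensor stimulation:
peaked[0] = 1.0                        # a pdf with a single peak

print(entropy_of_pdf(uniform))  # → 8.0 (upper bound, H = C)
print(entropy_of_pdf(peaked))   # → 0.0 (lower bound, no variability)
```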
The second point of interest is that sensory-motor
Figure 5: Mutual information between receptors of the coordinated interaction leads to redundancy in the
same sensor modality. Random actuation on the left. sensory channels of the same and of different
Sensory-motor coordination on the right. modalities, i.e., to a higher mutual predictability
between them. If the two signals are totally
H ( X ) + H (Y ) − H ( X ; Y ) ecological balance mentioned earlier.
uncorrelated, MI ( X ; Y ) = =0 , and
C We may also make a step forward in describing the
the joint entropy equals the sum of the individual requirements for adaptive agents: The individual
entropies. The same measure reaches its maximum if components (dimensions in sensory and motor space)
the entropies of the individual sensory channels are must have a lot of variability, but they must also be able
high, and there is a high correlation among them (low to couple to others for specific tasks. This is precisely
joint entropy) – one sensory channel can be used as a what complex systems are about.
predictor for the other one. This redundancy is clearly The importance of an appropriate sensory-motor
not present in the case of not sensory -motor coordinated coordinated interaction cannot be overestimated, since,
interaction. The mutual predictability in this case is as the results show, it can lead to a regularization of the
much lower – see figure 5. raw sensory stimulation – see also (Pfeifer and Scheier,
Human infants exhibit a wide range of exploration 1999) and (Lungarella et al., 2001) - which in turns
strategies: mouthing, banging, fingering, scratching, should speed up and simplify learning. We hypothesize
squeezing, waving, and listening (Kellman and that the self-information (entropy) of the central and
Arterberry, 1998). When objects are placed in the peripheral receptors largely depends on the type of
mouth, infants are able to detect surface properties interaction, but is less dependent on the environment. In
(Meltzoff and Borton, 1979), and object characteristics a similar way eye-movements, and certain particular
such as rigidity (Rochat, 1987). In other words, there hand movements have task-dependent characters.
are different actions related to the exploration of Many additional experiments need to be performed to
different object properties, in the sense, that they confirm our hypotheses, though. Other sensory
provide the sensory input, which is “optimal and modalities (touch, audition) have to be taken into
sometimes necessary”, for extracting the desired account. They would shed some light on how infant
information. As shown in this simple case study, the discover intermodal relationships, and how the
information inflow through the various sensory existence of multiple sensory channels allows us to
channels very much depends on the action itself, i.e., an learn more about, and function within, the world.
appropriate choice of it seems to be of importance for the simplification of the subsequent neural processing. Since entropy is an information-theoretic measure that captures the variability of the (sensory and motor) signals, it tells us something about the complexity of the interaction itself. In an agent context, the more diverse the agent's behavior, the more variation in the sensory channels, and the higher the entropy in the sensory system. Since we are interested not only in complexity due to sensory stimulation, but also in complexity due to self-generated sensory stimulation, we need to take into account aspects of the motor system's variability. A good measure to start with is the ratio between the total entropy of the motor signals x=(x1,...,xS) and the total entropy of the sensory signals y=(y1,...,yT); we define it as B = Hmotor(x) / Hsensor(y). There should be a match between the variability of the sensory channels and that of the motor outputs. In other words, B measures how well-balanced the motor and sensor signals are.

Conclusion and future work

With the ideas exposed in this paper, it may eventually be possible to more formally describe some of the design principles of autonomous agents (Pfeifer, 1996; Pfeifer and Scheier, 1999), e.g., the principle of ecological balance. Different sensor morphologies and other task environments have to be tested, and their effect on the information inflow needs to be explored. The next step will then be to exploit the data for learning. Using similar kinds of analyses, we hope that we will be able to derive the conditions under which agents can rapidly learn new categorization behaviors while maintaining the stability of existing ones.

Acknowledgments

We would like to thank Lukas Lichtensteiger, who provided us with valuable suggestions and helpful last-minute comments. Furthermore, we are indebted to Fumiya Iida for the many discussions, which helped to shape the ideas exposed in this paper. This research has been supported in part by grant #11-57267.99 of the Swiss National Science Foundation.

Information Theoretic Appendix

Some useful definitions:
- Random variable, RV: a variable that assumes a numerical value for each random outcome of an event or experiment.
- Probability density function of a RV X, p(x): normalized 1st-order histogram of a RV.
- Joint probability density function of RVs X and Y, p(x,y): normalized 2nd-order histogram of two RVs.
- Shannon entropy H(X): measures the randomness of a RV. The more random a RV, the more entropy it has. Intuitively, it is a measure of (the logarithm of) the number of states the RV could be in.
- Joint entropy H(X;Y): measures the uncertainty about both X and Y.
- Mutual information MI(X;Y) = H(X) + H(Y) - H(X;Y): measures the portion of entropy shared by X and Y. It is high if both X and Y have high entropy (high variance) and share a large fraction of it (high covariance). It is zero if X and Y are statistically independent. In other words, MI is a measure of the deviation from statistical independence.
- Channel capacity C = max_{p(x)} H(X): in our case, the maximal amount of statistical information that can be transferred through a sensory channel in a certain instant. It is computed over all possible sensory signal distributions p(x).
- Redundancy: MI(X;Y)/C, the capacity-normalized mutual information.

Since the logarithm in base 2 is used, H and MI are measured in bits. Both entropy and mutual information are used in a statistical connotation; they can be thought of as multivariate generalizations of variance and covariance (univariate statistics) that are sensitive to both linear and nonlinear interactions.

References

Asada, M., MacDorman, K.F., Ishiguro, H., and Kuniyoshi, Y. (2000). Cognitive developmental robotics as a new paradigm for the design of humanoid robots. Proc. of the 1st IEEE-RAS Int. Conf. on Humanoid Robotics, MIT, Cambridge, MA.
Braitenberg, V. (1984). Vehicles: Experiments in Synthetic Psychology. Cambridge, MA: MIT Press.
Brooks, R.A., Breazeal, C., Marjanovic, M., Scassellati, B., and Williamson, M. (1999). Alternate essences of intelligence. AAAI-98.
Bushnell, E.M., and Boudreau, J.P. (1993). Motor development and the mind: The potential role of motor abilities as a determinant of aspects of perceptual development. Child Development, 64, 1005-1021.
Cover, T.M., and Thomas, J.A. (1991). Elements of Information Theory. New York: Wiley.
Edelman, G.M. (1987). Neural Darwinism: The Theory of Neuronal Group Selection. New York: Basic Books.
Itti, L., Koch, C., and Niebur, E. (1998). A model of saliency-based visual attention for rapid scene analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(11), 1254-1259.
Kellman, P.J., and Arterberry, M.E. (1998). The Cradle of Knowledge. Cambridge, MA: MIT Press. A Bradford Book.
Kruschke, J.K. (1992). ALCOVE: An exemplar-based connectionist model of category learning. Psychological Review, 99, 22-44.
Kuniyoshi, Y., and Berthouze, L. (1998). Neural learning of embodied interaction dynamics. Neural Networks, 11, 1259-1276.
Lederman, S.J., and Klatzky, R.L. (1990). Haptic exploration and object representation. In M. Goodale (Ed.), Vision and Action: The Control of Grasping, pp. 98-109. New Jersey: Ablex.
Lee, T.S., and Yu, S.X. (1999). An information-theoretic framework for understanding saccadic behaviors. In Solla, S.A., Leen, T.K., and Müller, K.-R. (Eds.), Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press.
Lungarella, M., te Boekhorst, R., and Pfeifer, R. (2001). Dimensionality reduction through sensory-motor coordinated interaction. In preparation.
Metta, G. (1999). Babyrobot: A Study on Sensori-motor Development. PhD Thesis, Dipartimento di Informatica, Sistemistica e Telematica, University of Genoa, Genova, Italy.
Papoulis, A. (1991). Probability, Random Variables, and Stochastic Processes. New York: McGraw-Hill.
Pfeifer, R. (1996). Building "Fungus Eaters": Design principles of autonomous agents. In Maes, P., Mataric, M., Meyer, J.-A., Pollack, J., and Wilson, S.W. (Eds.), From Animals to Animats 4: Proc. of the 4th Int. Conf. on Simulation of Adaptive Behavior. Cambridge, MA: MIT Press. A Bradford Book.
Pfeifer, R., and Scheier, C. (1997). Sensory-motor coordination: the metaphor and beyond. Robotics and Autonomous Systems, 20, 157-178.
Pfeifer, R., and Scheier, C. (1998). Representation in natural and artificial agents: a perspective from embodied cognitive science. Zeitschr. f. Naturforschung, 53c, 480-503 (invited paper).
Pfeifer, R., and Scheier, C. (1999). Understanding Intelligence. Cambridge, MA: MIT Press.
Rochat, P. (1987). Mouthing and grasping in neonates: Evidence for the early detection of what hard and soft substances afford for action. Infant Behavior and Development, 25, 871-884.
Scheier, C., and Pfeifer, R. (1997). Information theoretic implications of embodiment for neural network learning. Proc. ICANN 97, 691-696.
Scheier, C., Pfeifer, R., and Kuniyoshi, Y. (1998). Embedded neural networks: exploiting constraints. Neural Networks, 11, 1551-1569 (invited paper).
Shannon, C.E. (1948). A mathematical theory of communication. Bell Syst. Tech. J., 27, 379-423, 623-656.
Thelen, E., and Smith, L. (1994). A Dynamic Systems Approach to the Development of Cognition and Action. Cambridge, MA: MIT Press. A Bradford Book.
Yarbus, A.L. (1967). Eye Movements and Vision. New York: Plenum Press.
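The quantities used in the paper above, entropy H, mutual information MI, and the balance ratio B = Hmotor(x)/Hsensor(y) defined in its appendix, can be sketched for discretized signals as follows. This is a minimal illustration with our own function names, not the authors' code:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of a discretized signal, computed from
    its normalized 1st-order histogram (the paper's p(x))."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def joint_entropy(xs, ys):
    """Joint entropy H(X;Y), from the normalized 2nd-order histogram p(x,y)."""
    return entropy(list(zip(xs, ys)))

def mutual_information(xs, ys):
    """MI(X;Y) = H(X) + H(Y) - H(X;Y): the entropy shared by X and Y;
    zero when the two discretized signals are statistically independent."""
    return entropy(xs) + entropy(ys) - joint_entropy(xs, ys)

def balance(motor, sensor):
    """B = Hmotor(x) / Hsensor(y): how well-balanced the variability of
    the motor and sensory channels is (B = 1 means a perfect match)."""
    return entropy(motor) / entropy(sensor)

# Toy discretized signals: the sensor channel simply copies the motor channel.
motor = [0, 1, 0, 1, 0, 1, 0, 1]
sensor = motor[:]
print(entropy(motor))                     # 1.0 bit
print(mutual_information(motor, sensor))  # 1.0 bit (fully redundant channels)
print(balance(motor, sensor))             # 1.0 (perfectly balanced)
```

In practice the continuous sensor and motor readings would first have to be binned; the entropy estimates then depend on the chosen discretization.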
The Constitution of Spatiality in Relation to the Lived Body :
a Study based on Prosthetic Perception
Charles Lenay (charles.lenay@utc.fr),
COSTECH, Université de Technologie de Compiègne,
Centre P.Guillaumat, BP 60319 – 60206 Compiègne Cédex France
Figure 2. The curve expressing the angle β as a function of the angle α for values of L ranging from 1.02 to 1.42 (the length of the arm, b, is taken as unity).

In another series of experiments, we constrained the range of possible actions even further by requesting the subject to block the wrist movements, thus keeping the finger in a straight line with the arm. When the only movements allowed were rotation of the arm from the shoulder joint (i.e. variation in α but not β), the subject was still able to indicate the direction of the target, but depth perception was lost. Similarly, if movements at the wrist were allowed but the arm was fixed (i.e. variation in β but not α), the subject was again able to indicate direction, but depth perception was again absent. In order for depth perception to arise, it was necessary that the wrist, with a possibility of action (variation in β), should itself be able to move in relation to the target. This requires, specifically, a concrete dimension of the arm which sweeps over the space. This is well expressed by equations (1) and (2), which include the parameter b, the length of the arm. This is not to say that the subject performs a trigonometrical computation based on an explicit value for b. In fact, one might just as well say that the length of the arm is constituted as a function of the distance of the target. This could be expressed by rewriting equation (1) in a mathematically equivalent form:

b = (sin α - cos α tan(α + β)) / L      Equation (3)

The subject, as an organism in movement, belongs to the space in which he is situated with respect to the target.

Conclusion

These experiments show clearly that there is no spatial perception without embodied action, no depth without a spatial dimension of the lived body which actually [...] the perceiving subject. For example, the white cane of the blind person is an instrument of coupling which renders accessible a tactile perception at the end of the cane (i.e. where there are no nerve fibres at all). From the moment that the tool is integrated in the perceptive activity, it becomes "transparent" (i.e. it disappears from consciousness). To take another everyday example, an experienced driver "becomes one" with his car; he perceives the road-surface under "his" tires; it is "he" (and not the car) which grazes the kerb or runs over some rough gravel. The driver is not conscious of the vibrations in the car seat (as such), nor even of the relations between his proximal actions (turning the steering-wheel, stepping on the brake) and his sensations (visual or tactile); his attention is focussed on the outside, and on the events which occur in the "world" in which he is acting. We may conclude by saying that technical artefacts, by transforming our modes of acting and sensing, transform our "lived bodies" and are thus constitutive of our way of being in the world.

Acknowledgments

We would like to thank Clotide Vanhoutte and Catherine Marque for their valuable help in designing the equipment and carrying out the experiments reported in this paper.
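Equations (1) and (2) are not reproduced in this excerpt, but since equation (3) is described as a mathematically equivalent rewriting of equation (1), the same relation can equally be solved for the distance L, which is presumably the reading of equation (1). The inversion is simple algebra, assuming b and L are nonzero:

```
% Equation (3) and its inversion: b = E/L  <=>  L = E/b,
% with E = \sin\alpha - \cos\alpha\,\tan(\alpha+\beta).
b = \frac{\sin\alpha - \cos\alpha\,\tan(\alpha+\beta)}{L}
\qquad\Longleftrightarrow\qquad
L = \frac{\sin\alpha - \cos\alpha\,\tan(\alpha+\beta)}{b}
```

Read one way, the relation yields the distance L of the target from the arm geometry; read the other way, as the text puts it, the length of the arm b is constituted as a function of the distance of the target.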
A robotics framework for studying
the coevolution of signaling
Andres Perez-Uribe (Andres.PerezUribe@unifr.ch)1
Department of Informatics, University of Fribourg,
Chemin du Musee 3, 1700-Fribourg, Switzerland
Michele Courant
Department of Informatics, University of Fribourg,
Chemin du Musee 3, 1700-Fribourg, Switzerland
Abstract
In this paper, we propose a robotics framework for
studying the coevolution of signaling. Our motiva-
tion is twofold. First, we propose a situated and em-
bodied framework for signaler-receiver interaction,
and second, we provide a promising approach for
the study of mechanisms that would enable adap-
tive systems to access new information channels
and to exploit implicit information in their envi-
ronments. We present experimental results on a
successful coevolution of signals that enable a very
simple communication between two robots. Finally,
we delineate some aspects of forthcoming research.
Introduction
Communication appears near ubiquitous throughout nature. Animals (and plants) use signals or displays to convey information to other organisms (Maynard-Smith and Harper, 1995). Traditionally, biologists have investigated the function of animal signals, e.g. by experimenting with artificial stimuli. More recently, the use of game theory models has been considered an important change in the fields of animal behavior and behavioral ecology (Dugatkin and Reeve, 1998). Evolutionary games have been inspired from economic decision theory describing the potential interactions of two or more individuals whose interests do not entirely coincide (Maynard-Smith, 1982; Dugatkin and Reeve, 1998). Biologists have used them to model signaler-receiver interactions to study theoretical aspects of a wide range of animal interactions, including prey-predator interaction, communication between rivals, potential mates, offspring and parents, etc. More recently, the use of artificial neural networks and artificial evolutionary techniques has provided a useful framework for studying further theoretical aspects of the interaction between signalers and receivers (Arak and Enquist, 1995; Bullock, 1997; Ghirlanda and Enquist, 1998).

In this paper, we propose the use of robots to study the coevolution of signaling systems. Our motivation is twofold. First, by using robots we provide an experimental setup where signalers and receivers are situated and embodied. This means robot models do not deal with abstract descriptions, but with a noisy and changing environment; moreover, robots have bodies and experience the environment directly but imperfectly, and their actions with the environment are part of a dynamic system (Brooks, 1991) (see Pfeifer and Scheier (1998) for a complete reference to embodied cognitive science). Second, by gaining new insights into how biological systems evolve strategies to access new information channels and exploit implicit information in their environments (e.g., how do spiders Agelenopsis aperta get to know their opponents' size in a web fight, and decide whether to fight or to retreat, by sensing the vibrations of the web (Richiert, 1978)), we might develop new concepts for the design of sensory processing mechanisms for adaptive systems. We start by describing the signaler-receiver experimental setup and then briefly discuss the problems of the noisy perception of the robot. The following Section delineates experimental results of the coevolutionary process, and finally, the last Section presents some concluding remarks and future work.

1. To whom correspondence should be addressed.

Experimental setup

Figure 1: Signaler-Receiver experimental setup.

In our experiments, two autonomous robots are placed in a box with white walls (Figure 1). We have used one Khepera robot with a linear vision turret, which contains a 64x1-cell linear image sensor, giving a linear image of 64 pixels with 256 gray-levels
each. It has automatic light adaptation (auto iris),
and the frame rate we used was 5 Hz. The second
Khepera robot is provided with a MODEROK module, a display module with 16x4 LEDs developed
in collaboration with the Ecole d'ingenieurs de Fri-
bourg (EIF).
The two robots are separated by a distance of
about 8 cm (measured between their central axes),
such that the view angle of the robot with the lin-
ear vision system coincides with the width of the
signaler's display (the robots in Figure 1 are shown
separated by a distance longer than that used in the
experiments). In our experiments, for each signaler-
receiver interaction, the signaler has the possibil-
ity to display one of two possible bar-code-like pat-
terns (denoted here by L and R) determined by
its genotype. The receiver perceives the pattern
and uses an artificial neural network with synaptic weights determined by its genotype, to produce its response (encoded by two binary outputs). A successful signaler-receiver interaction is defined as the
interaction where the receiver outputs a '00' pattern in response to a signaler's L signal and '11' in response to a signaler's R signal. The additional receiver outputs '01' and '10' might be interpreted as a receiver response to an unknown signal or to a signal it is not concerned with.

Figure 2: Example of noisy signal perception. Each plot on the right shows the activation of the 64 cells of the receiver's linear vision system when the signaler displays the column-based pattern on the left.
The genotype of the signaler is a 32-bit string run needed over 5 hours time (each generation was
which encodes 8 real values in the range [0,1) with a executed in about 8 minutes).
4-bit precision. Each real value determines the prob- The experimental setup was conceived with the
ability with which a 4 4-LED column is turned-on idea of establishing signaler-receiver interactions at
in the MODEROK display. The first 4 real values thus encode the probabilistic L pattern, while the following 4 real values encode the probabilistic R pattern. Probabilistic patterns were used in order to simulate some noise in the generation of the displays.

The genotype of the receiver is 5 × 16 × 2 bits long. It encodes 32 real values equally distributed in the interval (-1,+1) with a 5-bit precision (one bit is used to encode the sign). The real values correspond to the synaptic weights of a 16-input, 2-sigmoidal-output perceptron without biases. The inputs to the artificial neural network are computed as the normalized mean activation of groups of four neighboring sensor cells of the receiver robot's linear vision sensor. The binary outputs of the receiver are computed by comparing the perceptron outputs with a threshold of 0.5.

Embodiment and noisy perception

It should be noted that even if the robots do not move in these experiments and are placed in a box with white walls, there is a lot of variation in the sensor readings, due to the inherent noise of the sensors and to changes in the environment caused by external lights. For instance, daylight changed quite a lot during the experiment, given that a complete

different angles and distances from the very beginning; however, due to technical problems with the positioning of the robots, we have so far explored the coevolution of signaling between two immobile robots. The setup used so far is thus not really "strongly" embodied but, just as robot motion can hardly be simulated, neither can noisy perception. As an example, Figure 2 shows the linear vision images obtained by the receiver given some signaler displays. It appears that a turned-off column activates the linear vision cells by up to 80% (first example of Figure 2), and such activation varies not only with the activation of other 4 × 4-LED columns (second and third examples of Figure 2), but even with the activation of LEDs in rows that the linear vision "does not", in principle, perceive (compare the second and fourth examples of Figure 2): in our experimental setup, it is the lower row of LEDs which is maximally perceived by the receiver (e.g., a single LED in the lower row may activate a linear vision cell up to 100%). Moreover, a pattern with all 16 × 4 LEDs turned on can be perceived as "noisy", as shown in the last example of Figure 2.

Artificial coevolution of signals

We have used the experimental setup described above in a coevolutionary robotics experiment
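As a concrete illustration, the two encodings described above can be sketched in Python. This is a minimal reconstruction, not the authors' code: the bit ordering, the /16 value grid, and the use of a 64-cell linear vision array (16 groups of four neighboring cells) are assumptions; only the 5-bit signed weights, the biasless 16-input/2-sigmoidal-output perceptron, the 0.5 threshold, and the probabilistic display generation come from the text.

```python
import math
import random

def decode_receiver_weights(bits):
    """Decode the 5 x 16 x 2 = 160-bit receiver genotype into 32 synaptic
    weights in (-1, +1): per weight, 1 sign bit + 4 magnitude bits.
    (Bit ordering and the /16 grid are assumptions.)"""
    assert len(bits) == 160
    weights = []
    for i in range(32):
        sign_bit, *mag_bits = bits[5 * i:5 * (i + 1)]
        magnitude = int("".join(map(str, mag_bits)), 2) / 16.0
        weights.append(-magnitude if sign_bit else magnitude)
    return weights

def receiver_outputs(cells, weights):
    """Biasless 16-input, 2-sigmoidal-output perceptron; binary outputs
    are the sigmoid activations thresholded at 0.5."""
    # 16 inputs: normalized mean activation of groups of 4 neighboring cells.
    inputs = [sum(cells[4 * g:4 * (g + 1)]) / 4.0 for g in range(16)]
    outputs = []
    for o in range(2):
        net = sum(w * x for w, x in zip(weights[16 * o:16 * (o + 1)], inputs))
        outputs.append(1 if 1.0 / (1.0 + math.exp(-net)) > 0.5 else 0)
    return outputs

def sample_display(column_probs):
    """Turn each display column on with its encoded probability,
    simulating noise in the generation of the displays."""
    return [1 if random.random() < p else 0 for p in column_probs]
```

For instance, `receiver_outputs([0.5] * 64, decode_receiver_weights([0] * 160))` returns `[0, 0]`: all-zero weights give a net input of 0 and a sigmoid activation of exactly 0.5, which does not exceed the threshold.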
Figure 3: Coevolution of signaling: signaler's and receiver's best and average (over the whole population) fitness per generation. Each point of the plots corresponds to the average of 4 runs.
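The fitness curves in Figure 3 come from a two-population coevolutionary genetic algorithm. The skeleton below is only an illustrative sketch: the paper's actual fitness measure, selection scheme, population size, and mutation rate are not given in this excerpt, so `evaluate_pair` (a hypothetical stand-in for one robotic trial: display a pattern, read the receiver's binary answer) and all numeric parameters are assumptions.

```python
import random

def coevolve(evaluate_pair, sig_len=32, rec_len=160,
             pop_size=20, generations=40, mutation_rate=0.02):
    """Two-population coevolutionary GA skeleton.

    evaluate_pair(signaler_bits, receiver_bits) must return
    (signaler_fitness, receiver_fitness).  Returns the per-generation
    (best_signaler, best_receiver) fitness, as plotted in Figure 3.
    """
    def random_pop(n_bits):
        return [[random.randint(0, 1) for _ in range(n_bits)]
                for _ in range(pop_size)]

    def mutate(bits):
        return [b ^ 1 if random.random() < mutation_rate else b for b in bits]

    def select(pop, fit):
        # Binary tournament selection (an assumption), then bit-flip mutation.
        new = []
        for _ in range(pop_size):
            a, b = random.randrange(pop_size), random.randrange(pop_size)
            new.append(mutate(pop[a] if fit[a] >= fit[b] else pop[b]))
        return new

    signalers, receivers = random_pop(sig_len), random_pop(rec_len)
    history = []
    for _ in range(generations):
        # Pair the i-th signaler with the i-th receiver (the actual
        # pairing scheme is also an assumption).
        scores = [evaluate_pair(s, r) for s, r in zip(signalers, receivers)]
        sig_fit = [fs for fs, _ in scores]
        rec_fit = [fr for _, fr in scores]
        history.append((max(sig_fit), max(rec_fit)))
        signalers = select(signalers, sig_fit)
        receivers = select(receivers, rec_fit)
    return history
```

Each entry of the returned `history` corresponds to one point on the "Best" curves; averaging the fitness lists instead would give the "Average" curves.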
Figure 6: Example of the influence of the upper row of LEDs of the signaler's display on the receiver's perception of the pattern '0100' in the lower row of LEDs. The upper-row LEDs appear to change the light intensity detected by the linear vision and, consequently, the level of activation of the cells (due to the auto-iris), in particular of those corresponding to the turned-off columns of the signaler's patterns.

discussion on the noisy perception and Figure 2). Finally, it should be noted that once coherent parts of the evolved displays enable successful signaler-receiver interactions, the rest of the displays (i.e., columns that turn on and off with similar frequencies) can be a source of bias for the development of future signals.

To further explore the multicomponency issue, we performed a second series of experiments, in which we used the same setup as above except for a larger signaler genotype: a 64-bit string encoding 16 real values in the range [0,1) with a 4-bit precision determined the probability with which a 4 × 2-LED cell was turned on in the MODEROK display. When we devised the first set of experiments, we thought the 4 × 4 column-based patterns would ease the evolution of signaling, given that the receiver used a linear vision system. Nevertheless, we found that even if the new experiment involves a larger search space, the possibility to display patterns other than 4 × 4 columns permits the coevolution of a signaling system in the same number of generations, and even with a better performance (Figure 7). This is due in part to the influence of the different rows of LEDs of the display on the activation of the cells of the linear vision (Figure 6). Figure 5 shows an example of the new coevolved signals: the new two-row coevolved patterns have the property of having probability values of turning on a given 4 × 2-LED cell closer to 0% and 100% in the lower row. This was expected because the receiver's vision system is linear (i.e., the coevolved patterns

has allowed for postulating shared mechanisms for low-level control of embodied action and higher-level cognition. Some biologists concerned with the study of signaling systems argue that it is the influence of the sense organs and nervous systems of an organism, and of the environment, which drives the selection of signal precursors (Bradbury and Vehrencamp, 2000). Therefore, a signaling system arises as the result of a coevolutionary process between senders and receivers in which both benefit from its use (Zahavi and Zahavi, 1997). In this paper, we have proposed a robotics framework for studying the coevolution of signaling systems. In particular, we show the successful coevolution of signals that enable a very simple communication between two robots: a signaler robot evolves a couple of column-based patterns, and a receiver robot evolves an artificial neural network that "interprets" those patterns.

Our experimental framework appears to be a promising approach for the study of how to provide autonomous artificial agents with the ability to access new information channels (e.g., the sense organs and nervous systems of an organism serve many functions besides the detection of and reaction to any particular signal (Johnstone, 1998)) in order to develop and continuously adapt their communication systems. For example, this will enable autonomous artificial agents to coordinate their activities in open-ended environments (Steels, 1999). Many aspects briefly mentioned along this paper can be further studied with this experimental setup; researchers (Arak and Enquist, 1995; Bradbury and Vehrencamp, 2000; Ryan, 1998), for instance, have recently promoted the notion of sensory drive as a source for new signals, so we may ask: what are those receiver biases that enabled the evolution of particular signals? If we let the robots move, what will be the evolved patterns that enable robust communication? Would those signals tend to be symmetric? How does multicomponency enhance signal discrimination? How might it evolve? What does the receiver's neural network learn to be able to
Figure 7: Coevolution of signaling (second set of experiments): signaler's and receiver's best and average (over the whole population) fitness per generation. Each point of the plots corresponds to the average of 4 runs.
differentiate the signaler's signals? What is the effect of perception noise and "interpretation" noise on the complexity of the receiver's artificial neural network (Lerena, 2000)? Future work will deal with these and more challenging aspects of coevolving signaling systems.

Acknowledgments

The author wishes to thank Patricio Lerena for valuable suggestions and very passionate discussions, Dario Floreano for his help with the coevolutionary algorithm, and the anonymous reviewers for their careful reading of a draft version of this paper. This work has been supported by the Swiss National Science Foundation.

References

A. Arak and M. Enquist. Conflict, receiver bias and the evolution of signal form. Philosophical Transactions of the Royal Society, London B, 349:337-344, 1995.
J. Bradbury and S. Vehrencamp. Economic models of animal communication. Animal Behavior, 59:259-268, 2000.
R. Brooks. New approaches to robotics. Science, 253:1227-1232, 1991.
S. Bullock. Evolutionary Simulation Models: On their character, and application to problems concerning the evolution of natural signaling systems. PhD thesis, University of Sussex, 1997.
L. Dugatkin and H. Reeve, editors. Game Theory and Animal Behavior. Oxford University Press, 1998.
M. Enquist and A. Arak. Symmetry, beauty and evolution. Nature, 372:169-172, 1994.
S. Ghirlanda and M. Enquist. Artificial neural networks as models of stimulus control. Animal Behavior, 56:1383-1389, 1998.
R. Johnstone. Game Theory and Communication. In L. Dugatkin and H. Reeve, editors, Game Theory and Animal Behavior. Oxford University Press, 1998.
G. Lakoff and R.E. Núñez. Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being. Basic Books, 2000.
P. Lerena. Sexual preferences: Dimension and complexity. In From Animals to Animats 6, Proceedings of the Sixth International Conference on Simulation of Adaptive Behavior, pages 395-404. The MIT Press, 2000.
M. Mataric. Studying the role of embodiment in cognition. Cybernetics and Systems, 28(6):457-470, 1997.
J. Maynard-Smith. Evolution and the Theory of Games. Cambridge University Press, 1982.
J. Maynard-Smith and D.G.C. Harper. Animal Signals: Models and Terminology. Journal of Theoretical Biology, 177:305-311, 1995.
S. Nolfi and D. Floreano. Evolutionary Robotics: The Biology, Intelligence, and Technology of Self-Organizing Machines. The MIT Press, Cambridge, Massachusetts, 1999.
R. Pfeifer and C. Scheier. Understanding Intelligence. The MIT Press, Cambridge, Massachusetts, 1998.
S.E. Riechert. Games spiders play: Behavioral variability in territorial disputes. Behav. Ecol. Sociobiol., 3:135-162, 1978.
C. Rowe. Receiver psychology and the evolution of multicomponent signals. Animal Behavior, 58(5):921-931, 1999.
M.J. Ryan. Sexual Selection, Receiver Bias, and the Evolution of Sex Differences. Science, 281:1999-2003, 1998.
L. Steels. The Talking Heads Experiment: Words and Meanings. Best of Publishing, Brussels, 1999. (Pre-edition of the book in progress.)
A. Zahavi and A. Zahavi. The Handicap Principle: A Missing Piece of Darwin's Puzzle. Oxford University Press, 1997.
Embodied Intentional Dynamics of Bacterial Behaviour
Catholijn M. Jonker (jonker@cs.vu.nl)
Vrije Universiteit Amsterdam, Department of Artificial Intelligence,
De Boelelaan 1081a, 1081 HV Amsterdam, The Netherlands. URL: http://www.cs.vu.nl/~jonker
Figure 1: The correspondence between the bacterial regulation and the causation in the BDI model. (Figure labels: DNA: desire; activation protein/repressor: reason; flux: action.)