
Proceedings of 1993 International Joint Conference on Neural Networks

Learning In Neural Models With Complex Dynamics*


Michael Stiber
Department of Computer Science
The Hong Kong University of Science and Technology
Clear Water Bay, Kowloon, Hong Kong
email: stiber@cs.ust.hk

Jose P. Segundo
Department of Anatomy and Cell Biology and Brain Research Institute
University of California
Los Angeles, California 90024
email: jpsgmvs@oac.ucla.edu

*This work was supported by the Hong Kong Research Grants Council (DAG92/93.EG18), Trent H. Wells, Jr. Inc., and by a grant of computer time made possible under a joint study with the IBM Corporation on the IBM 3090/600J Supercomputer at the UCLA Office of Academic Computing.
Abstract— Interest in the ANN field has recently focused on dynamical neural networks for performing temporal operations, as more realistic models of biological information processing, and to extend ANN learning techniques. While this represents a step towards realism, it is important to note that individual neurons are complex dynamical systems, interacting through nonlinear, nonmonotonic connections. The result is that the ANN concept of learning, even when applied to a single synaptic connection, is a nontrivial subject. Based on recent results from living and simulated neurons, a first pass is made at clarifying this problem. We summarize how synaptic changes in a 2-neuron, single synapse neural network can change system behavior and how this constrains the type of modification scheme that one might want to use for realistic neuron-like processors.

I. INTRODUCTION

Biological nervous systems are networks of synaptically-coupled neurons, each behaving as a dynamical system producing short-lived voltage spikes. The input/output behavior of each neuron can be thought of as the transformation of trains of input spikes, via synapses, into trains of output spikes. This synaptic coding can be thought of as the operational unit of nervous systems [1]. Therefore, understanding synapses is a necessary, though not sufficient, step towards understanding neural computation.

When we look at the different types of synaptic coding seen in biological systems, we face at once the need to speak in terms of the behaviors of individual neurons, whether in isolation or in response to some input. These behaviors had been noted in a general sense quite some time ago [2], and have also been the subject of more recent work which, using techniques from the field of nonlinear dynamics, has shed a great deal of light on their detailed characteristics [3, 4, 5]. One clear thing is that the neuron is a complex entity whose synaptic coding is anything but the simple, monotonic relationship embodied by the typical weighted-summation models used in Artificial Neural Networks.

Given this information, we might naturally question how one of the foci of ANN work, namely learning, can be related to the actual biological system. Here, learning is considered to involve the systematic, purposeful modification of neural behavior. If ANN learning is analogous to changing the response characteristics of a synapse (for instance, changing its strength by changing its maximum ionic permeability), then how does changing the synapse change the neuron's behavior? In other words, as we modify synaptic strength, how does the synaptic coding change? How might this behavioral change be useful in terms of changing the computation of the overall network? And, based on this information, what constraints might we place on any scheme for synapse-level "learning"?

This paper seeks to frame these questions more precisely using recent results from physiological neural model simulations, data from living preparations, and nonlinear dynamical analysis.

II. BEHAVIORS IN LIVING AND SIMULATED NEURONS

The living preparation used to serve as our exemplar is the embodiment of a prototypical inhibitory synapse, the crayfish slowly adapting stretch receptor organ (SAO). The experimental setup, schematized in Fig. 1(A), involved a single presynaptic inhibitory fiber and a single postsynaptic neuron [3]. Because of its prototypical nature, we would expect to see the same responses in any other synapse, at least as a working hypothesis. The model used is based on the physiology of the living preparation and on the experimental setup; its behaviors have been found to be in close agreement with that of the living preparation [6].

Fig. 1. Experimental setup sketch and analysis term definitions. Experiments were performed on a 2-neuron network (A), with firing of the presynaptic neuron (IF, driver) axon as the control variable, and changes in firing times of the postsynaptic neuron (SAO, driven) recorded. Analysis was performed on intervals between times of occurrence of driver and driven spikes (B).

Each neuron produces a series, or train, of spikes. In each train, each spike was identified and attributed an order and a time of occurrence (Fig. 1(B)): k and s_k, respectively, presynaptically, and i and t_i, post-synaptically (k, i = 0, 1, 2, ...).
Each spike was assigned certain intervals: in IF trains, to the kth spike, the interval I_k to the last IF firing; in SAO trains, to the ith spike, intervals to the last SAO and IF spikes, T_i and φ_i (the latter its phase), respectively. The data was thus reduced to the point processes formed by the ordered sets of intervals I_k (all identical for pacemaker driving, I_k = I), T_i (with mean value T), and φ_i [3, 7].
The postsynaptic, driven neuron is a pacemaker. Its undisturbed, natural discharge was a sequence T_i that differed little from its average T = N [3]. The behaviors described here were in response to pacemaker driving, in which the presynaptic spikes were all separated by an essentially invariant interval I [3]. Though analysis focused on resulting stationary postsynaptic discharges, the forms seen in stationary or similar driving are also seen with more complex input regimes [5]. Therefore, we consider such forms as the elementary building blocks of synaptic behavior.

The behavior associated with each inhibitory train was identified as locked, intermittent (including phase walk-throughs), messy (erratic or stammering), or hopping, after those defined in nonlinear dynamics [8]. Criteria were based on spike timings of the presynaptic and postsynaptic trains individually and jointly, as determined using the data analysis techniques described in [3, 9, 10].

A locked response is defined as a fixed, repeating sequence of φ_i and T_i, and a behavior was called "locked p:q" if these sequences repeated every q SAO spikes (φ_i = φ_{i+q} and T_i = T_{i+q}) and p IF spikes (so pI = qT). Locking is a periodic behavior, where the "internal state" of the neuron returns to the same value after a period of q SAO and p IF spikes.
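As a numerical counterpart to this definition (again a sketch under assumed conventions, not the authors' analysis code), p:q locking can be tested by checking that the phase and interval sequences repeat with period q and that p presynaptic intervals span the same time as q postsynaptic ones:

    import numpy as np

    def is_locked_pq(phi, T, I, p, q, tol=1e-3):
        """True if a stationary discharge is locked p:q.

        phi, T : stationary postsynaptic phase and interval sequences
        I      : the (constant) presynaptic interval under pacemaker driving
        Checks phi_i ~ phi_{i+q}, T_i ~ T_{i+q}, and p*I ~ q*mean(T).
        """
        phi, T = np.asarray(phi, float), np.asarray(T, float)
        repeats = (np.allclose(phi[q:], phi[:-q], atol=tol) and
                   np.allclose(T[q:], T[:-q], atol=tol))
        return repeats and np.isclose(p * I, q * T.mean(), rtol=tol)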
Intermittent is a descriptive term used for behaviors which are almost locked, but are not quite periodic [8]. This includes quasiperiodic behaviors, such as phase slidings and walk-throughs, in which T/I ≈ p/q. However, in these cases, T/I is an irrational number, and the neuron's state never returns precisely to any previous value.

The third stationary behavior noted was described as messy, and included both erratic (at low pacemaker driving rates, I > N) and stammering (at high rates, I ≤ N); the former hypothesized to be the work of deterministic chaos [6, 11], and the latter the action of noise. Erratic discharges have no readily apparent patterning. Stammering, on the other hand, is an example of windowed behavior, where the SAO is able to fire only within a narrow interval of time relative to IF spike arrival, so all SAO intervals were essentially multiples of I, i.e., T_i ≈ kI, for k = 1, 2, ....
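A rough way to recognize the stammering pattern numerically (our illustration, not the paper's method) is to measure how far each postsynaptic interval lies from the nearest integer multiple of I:

    import numpy as np

    def distance_from_multiples(T, I):
        """Normalized distance of each interval T_i from the nearest k*I.

        Values near zero for most intervals indicate the windowed,
        'stammering' pattern in which T_i ~= k*I for integer k.
        """
        T = np.asarray(T, dtype=float)
        k = np.round(T / I)
        return np.abs(T - k * I) / I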

Finally, hopping was a situation in which the SAO shifted occasionally from one type of stationary discharge to another. This has been assigned to noise "bumping" a system among several dynamical attractors.

A. Behaviors Viewed in Terms of Average Rates

Let us consider the effects of changing the input rate in this preparation and, to simplify matters, we will look at average rates, 1/I and 1/T, rather than details. This reduces the observed behaviors to the level of detail seen in ANNs, with unit output a scaled version of 1/I or 1/T.

Fig. 2. Responses of a dynamical neural model compared to a weighted-sum ANN unit, in response to inhibitory input. For the dynamical model, thick lines indicate locking regions, while dots indicate non-locked outputs. ANN unit output is shown by the dashed curve. Figure adapted from [2].

Fig. 2 schematizes the outputs of an ANN unit with weighted sum and squashing function (dashed curve) and 1/T for either the SAO or the physiological model (solid lines and dots), as a function of input frequency for inhibitory input. The key features to note are that the "realistic" responses consist of paradoxical regions of locking (thick lines), where increasing inhibitory input increases output rate, alternating with non-locked behaviors (dots) [2].
While the overall trend is decreasing, it is only locally monotonic, and those monotonic areas' slopes are opposite the overall trend. In contrast, the output of a typical ANN model is monotonic and smoothly decreasing.
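For comparison with the dashed curve of Fig. 2, the sketch below (our illustration; the logistic squashing function, weight, and bias values are assumptions) computes the output of a conventional weighted-sum unit with a single inhibitory input. Its response falls smoothly and monotonically as the input rate grows, with none of the locking structure described above:

    import numpy as np

    def ann_unit_output(input_rate, w_inhib=-1.0, bias=2.0):
        """Weighted-sum unit with logistic squashing and one inhibitory input."""
        net = w_inhib * input_rate + bias
        return 1.0 / (1.0 + np.exp(-net))   # monotonic in the net input

    print(ann_unit_output(np.linspace(0.0, 5.0, 6)))  # strictly decreasing values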
III. SYNAPTIC MODIFICATION: EFFECTS ON BEHAVIOR

One advantage of simulation over experiment is that the power of the synaptic coupling can be altered. This change in connectivity can be considered analogous to changing the weight w in a weighted-sum ANN. The resulting behaviors are conveniently summarized by an Arnol'd map [12] or two-dimensional bifurcation diagram, shown in Fig. 3, which illustrates the type of output behavior as a function of input frequency (normalized as N/I) and input amplitude (F_syn, the maximum synaptic permeability to Cl-, the inhibitory carrier ion here).

Fig. 3. Arnol'd map from simulations with inhibitory input. White areas indicate parameter ranges which produced lockings (ratios noted). Cross-hatched areas are non-locked behaviors. (Axes: F_syn, roughly 1x10^-8 to 1x10^-6, versus N/I, normalized input frequency, roughly 0.2 to 1.8.)
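The same kind of two-dimensional summary can be sketched with a toy model. The code below is not the physiological SAO model used in the paper; it substitutes a textbook leaky integrate-and-fire pacemaker that receives an instantaneous inhibitory "kick" of size g every I time units, then sweeps input frequency and kick strength and labels each parameter pair as locked p:q or non-locked, in the spirit of Fig. 3. All parameter names and values are assumptions for illustration, and the locking detector is deliberately crude.

    import numpy as np

    def lif_pacemaker_with_inhibition(I, g, mu=1.5, tau=1.0, n_pulses=400):
        """Event-driven LIF pacemaker (dv/dt = (mu - v)/tau, threshold 1, reset 0)
        receiving an inhibitory pulse v -> v - g every I time units.
        Returns the postsynaptic spike times."""
        v, t, spikes = 0.0, 0.0, []
        for k in range(1, n_pulses + 1):
            t_pulse = k * I
            while True:
                # exact time for v to relax from v up to threshold 1 (mu > 1 > v)
                t_cross = t + tau * np.log((mu - v) / (mu - 1.0))
                if t_cross >= t_pulse:
                    break
                spikes.append(t_cross)
                v, t = 0.0, t_cross
            v = mu + (v - mu) * np.exp(-(t_pulse - t) / tau)  # relax to pulse time
            v, t = v - g, t_pulse                             # apply the inhibition
        return np.array(spikes)

    def classify(spikes, I, q_max=6, tol=1e-3):
        """Label the stationary tail as 'p:q locked' or 'non-locked' by testing
        whether the phase sequence repeats with some short period q."""
        t = spikes[len(spikes) // 2:]                 # discard the transient
        if len(t) < 2 * (q_max + 1):
            return "sparse"
        phi, T = t % I, np.diff(t)                    # phases re: last IF pulse
        for q in range(1, q_max + 1):
            if np.allclose(phi[q:], phi[:-q], atol=tol):
                p = int(round(q * T[-q:].mean() / I))
                return f"{p}:{q} locked"
        return "non-locked"

    mu, tau = 1.5, 1.0
    N = tau * np.log(mu / (mu - 1.0))                 # natural pacemaker period
    for g in (0.05, 0.2, 0.5):                        # stand-in for F_syn
        for f_norm in np.linspace(0.5, 1.6, 12):      # normalized frequency N/I
            spikes = lif_pacemaker_with_inhibition(N / f_norm, g, mu, tau)
            print(f"g={g:.2f}  N/I={f_norm:.2f}  {classify(spikes, N / f_norm)}")

Plotting the locked cells over a finer (f, g) grid traces out tongue-shaped regions qualitatively similar to the white areas of Fig. 3, though the quantitative layout depends entirely on the toy model chosen here.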
We see in the figure that a locking at any particular ratio (the labeled white areas in the graph) occupies a contiguous, vertically elongated region of the (F_syn, I) plane; hence their usual appellation, tongues. The widths of these tongues vary systematically, according to a Farey series, with the widest tongue between p:q and p':q' being p+p':q+q' (not readily apparent in the figure because of the coarse sampling of the parameters). Between these tongues lie parameter regions which produced nonlocked behaviors (cross-hatched areas), including all of those seen in the living preparation. The "horizontal slice" at any particular value of F_syn corresponds to a one-dimensional bifurcation diagram; if we were to plot average postsynaptic rate instead of behavioral category, we would produce a graph similar to Fig. 2.
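The Farey ordering can be illustrated with a one-line calculation (a generic number-theoretic fact, not specific to this preparation): the widest tongue lying between a p:q tongue and a p':q' tongue is their mediant.

    def mediant(p, q, p2, q2):
        """Farey mediant of the locking ratios p:q and p':q'."""
        return (p + p2, q + q2)

    print(mediant(1, 1, 2, 1))   # between 1:1 and 2:1 lies 3:2
    print(mediant(1, 1, 3, 2))   # between 1:1 and 3:2 lies 4:3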

How does changing the synaptic strength change the behavior pattern? The locking tongues extend all the way down to values of F_syn that no longer have significant practical effects on behavior. However, they narrow as F_syn decreases. This corresponds to reducing the widths of the paradoxically-sloped line segments in Fig. 2. Noting the preponderance of walkthroughs among the nonlocked behaviors at low F_syn [6], we see that the overall effect will be to make the synaptic mapping more monotonic, as far as average rates are concerned.

Fig. 4. Illustration of effects of changing synaptic strength, w, on neural behavior. The shaded area represents part of a locking tongue in an Arnol'd map. A small change in synaptic weight from w1 to w2 can change a non-locked behavior at f3 to a locked one, or have no effect at all, such as at f1 or f2.

For higher values of F_syn, we see pronounced locking tongues, amounting to about one-third of the input rate scale in the SAO [4]. Varying F_syn within this range can have either no effect or a dramatic effect on behavior, as illustrated by the diagram of a locking tongue presented in Fig. 4, where the ANN-like terms w and f have been substituted for F_syn and N/I, respectively. For presynaptic frequencies f1 and f2, changing the synaptic strength from w1 to w2 has no effect on their corresponding postsynaptic rates: this is true for each frequency f_j within the tongue at both w1 and w2. For this range, Δ(behavior)/Δw|_{f_j} = 0. Conversely, for f3, a change from w1 to w2 induces a change from nonlocked to locked behavior.
This can be true even if w2 - w1 is infinitesimally small, if f3 is close enough to the bifurcation point represented by the tongue boundary. Besides the lack of change in behavior within tongues and the abrupt changes in behavior near tongue boundaries, there are behavior changes among the nonlocked behaviors between the tongues, which include walkthroughs and messy behaviors in about equal proportions in the SAO [4].

IV. CONCLUSIONS

Though we cannot state a definitive synaptic modification rule currently, the above observations can serve to constrain some of its qualities. First of all, it is important to note the complex responses of neurons to inputs with different interspike interval patterns, changes in input frequency, and changes in synapse power. In recurrent ANNs, the effects of dynamics (such as bifurcation behavior) on learning rules must either be eliminated or carefully controlled to achieve the desired results [13]. It is our feeling that a synaptic modification procedure in networks of more realistic elements should take advantage of the individuals' dynamics; greater complexity at that level should lead to greater network computational power. To be fair, however, it could certainly be argued that, in large networks with high interconnectivity, the individual elements' detailed dynamics are washed out.

Based on the previous discussion of coding at small and large synaptic strengths, we can make some general conclusions. With weak coupling, the coding through the synapse is more monotonic, since the locking regions are narrow. The tradeoff for this is that, as the synapse is made weaker, the effect of input at that synapse is reduced. Weak synapses would have to gain a noticeable effect through correlation at multiple sites. However, strict temporal correlation at multiple sites would tend to produce the same types of behaviors as a single strong synapse. Therefore, for monotonic coding through multiple weak connections, steps would need to be taken to reduce synchronization among the inputs, so that the postsynaptic unit would respond to overall input intensity, rather than input pattern.

With higher values of F_syn (or multiple correlated inputs), a complex code can arise, whose functional significance is currently unknown. It is important to note, though, that the arrangement of different behaviors is not random; there is a systematic progression from one to another [4]. Potentially, then, this complexity could have useful computational applications. A learning scheme at this level of coupling would essentially be adjusting the proportions of the different behaviors, rather than their locations along the input rate scale (the latter being determined by the neural dynamics itself).

We can conclude that any synaptic modification process implemented for dynamical, neuron-like elements must take into account: the tradeoff between synaptic monotonicity and efficacy, synchronization and desynchronization of inputs, and control of behavior proportionality. Exactly how these issues should be dealt with depends on the computational significance of the different behavior types, a topic of ongoing investigation.

REFERENCES

[1] J. Segundo, J.-F. Vibert, M. Stiber, and S. Hanneton, "Synaptic coding of periodically modulated spike trains," in IEEE International Conference on Neural Networks, 1993.

[2] J. Segundo and D. Perkel, "The nerve cell as an analyzer of spike trains," in The Interneuron: UCLA Forum in Medical Sciences (M. Brazier, ed.), (Los Angeles), pp. 349-89, University of California Press, 1969.

[3] J. P. Segundo, E. Altshuler, M. Stiber, and A. Garfinkel, "Periodic inhibition of living pacemaker neurons: I. Locked, intermittent, messy, and hopping behaviors," Int. J. Bifurcation and Chaos, vol. 1, pp. 549-81, September 1991.

[4] J. P. Segundo, E. Altshuler, M. Stiber, and A. Garfinkel, "Periodic inhibition of living pacemaker neurons: II. Influences of driver rates and transients and of non-driven post-synaptic rates," Int. J. Bifurcation and Chaos, vol. 1, pp. 873-90, December 1991.

[5] J. Segundo, M. Stiber, E. Altshuler, and J.-F. Vibert, "Transients in the inhibitory driving of neurons and their post-synaptic consequences," Neuroscience, submitted, 1993.

[6] M. Stiber, Dynamics of Synaptic Integration. PhD thesis, University of California, Los Angeles, 1992.

[7] D. Cox and P. Lewis, The Statistical Analysis of Series of Events. London: Methuen, 1966.

[8] P. Bergé, Y. Pomeau, and C. Vidal, Order Within Chaos: A Deterministic Approach to Turbulence. New York: Wiley, 1986.

[9] M. Stiber and W. J. Karplus, "Arnol'd map synthesis for periodically forced oscillators," Tech. Rep. HKUST-CS93-5, HKUST Computer Science Department, 1993.

[10] M. Stiber and J. P. Segundo, "Nonlinear dynamics and complex synaptic transfer functions," in IEEE International Conference on Neural Networks, 1993.

[11] G. Sugihara, D. Grace, J. Segundo, and M. Stiber, in preparation, 1993.

[12] V. Arnol'd, Geometrical Methods in the Theory of Ordinary Differential Equations. New York: Springer-Verlag, 1983.

[13] K. Doya, "Bifurcations of recurrent neural networks in gradient descent learning," IEEE Transactions on Neural Networks, submitted, 1993.
different behavior types, a topic of ongoing investigation.
