Mikhail Kiselev
Chuvash State University
Abstract—In this paper, we consider a novel approach to information representation in spiking neural networks. In a certain sense, it is a combination of two well-known coding schemes – rate coding and temporal coding. Namely, it is based on asynchronous activity of ensembles of polychronous neuronal groups – groups of neurons firing in a determined order with strictly fixed relative temporal delays. We demonstrate how a rate-coded input signal may be converted into this representation form by network structures which are formed as a result of a network self-organization process based on STDP-style synaptic plasticity.

Keywords—spiking neural network; polychronization; STDP; rate coding; temporal coding; chaotic network; network self-organization

I. INTRODUCTION

Neurons in spiking neural networks (SNN), which are the most plausible models of biological neuron ensembles, exchange short pulses called spikes. It is postulated in the majority of SNN models that all spikes have the same form and amplitude (although alternative opinions exist [1]). Therefore, a single spike can hardly be considered as an information unit – information is encoded as some properties of the sets of spikes. Many encoding methods have been explored, but they can be divided into two large groups (see an interesting discussion of their interrelation in [2]). The first group uses the mean firing frequency, measured within a certain time window for one or several neurons, as an informational parameter. The exact firing time of an individual neuron is not significant in this approach – in this sense it can be called asynchronous. But the exact neuron firing times play an important role in the second group – for that reason these coding methods are generally called temporal coding. One of the well-studied variants of temporal coding is based on the concept of polychronous neuronal groups (PNG) [3]. In this approach, the presence of some discrete property, the appearance of a certain stimulus and so on is represented as the activity of a neuronal group specific for this stimulus, so that the relative firing times of all neurons in this group are strictly fixed. There is experimental evidence that temporal coding is really used for information representation and processing in the brain cortex [4, 5] (while asynchronous coding is utilized mainly in the peripheral nervous system). The PNG-based coding scheme has a number of advantages, for example, a huge potential informational capacity. As was noted in [3], even the same set of neurons can encode a great number of different stimuli, using the neuron firing order and the lengths of inter-spike intervals unique for each stimulus. This coding method is noise-resistant. It can serve as a natural basis for the working memory mechanism [6]. But it does not provide a natural way to code continuous values. In the asynchronous approach, the evident parameter representing stimulus strength is the mean firing frequency (rate coding). Another problem is the lack of understanding of how a temporally encoded signal appears from asynchronous sources. For example, the well-known work [6] demonstrates how PNGs storing information about recent stimuli arise in a chaotic SNN. However, these stimuli were already represented as sequences of spikes strictly fixed in time (this is also true for more recent research on SNN training [7, 8]). For this reason, it would be important to understand what kind of SNN structures could implement re-coding of the asynchronous signal – at least because most primary sensory signals seem to be represented in the rate-coded form. Nevertheless, works considering this kind of conversion remain surprisingly few. For example, a possible role of ensembles of hippocampal pyramidal neurons in this process was considered in [9]. A general approach to the solution of this problem, based on PNGs which exist or spontaneously emerge in a chaotic neural network, was presented by Eugene Izhikevich in [3]. However, neither this work nor the subsequent works devoted to polychronization, to the best of my knowledge, either considered the concrete conditions under which this conversion could be realized, or reported experimental evidence of its realization. An attempt to find a solution to this problem was made in my recent article [10]. It was demonstrated in that work that conversion between these two coding schemes could be performed, under certain conditions, by a homogeneous chaotic neural network. However, the definitions of PNG and temporal coding used there differed strongly from their original definitions. The concept of PNG proposed by Izhikevich suggests that a PNG activation triggered by the appearance of a certain stimulus involves all or almost all the neurons in the group, which fire with the pre-defined delays one after another. The possible absence of a few spikes in this sequence is admitted as a noise tolerance measure. However, in [10] the PNG response to a stimulus was expressed in the activity of, at most, a third of its neurons – so that reactions to the same pattern might involve the activities of completely different PNG subsets. And further research did not show any way to improve this result. Although, from a formal point of view, the PNG activity in [10]
978-1-5090-0620-5/16/$31.00 ©2016 IEEE
really contains information about the stimuli presented, it is hardly possible to use this information for further processing.

The solution proposed in the present paper is based on a novel form of information coding in SNN, which is a combination, in a certain sense, of asynchronous and polychronous coding (so we will call it A/P-coding) and inherits the advantages of both. Namely, a structure realizing this coding scheme (to be referred to as an A/P-recognizer) is not one big PNG but an ensemble of smaller PNGs. The indicator of a certain stimulus or property is the activation of some threshold number of PNGs from this ensemble. It is required that this threshold number should not be much less than the ensemble size. Here, we say that a PNG is active only if all its neurons fire at the prescribed moments of time. But activations of PNGs in the ensemble can be completely asynchronous with respect to each other – in any order and with any time shifts. In contrast with temporal coding based on a single PNG, this approach permits encoding stimulus strength or the value of an attribute as the number of active PNGs in the ensemble. Comparing it with [10], it could be said that we let different parts of the big PNG be asynchronous and thus achieve a greater degree of activation of the whole structure, making the stimulus representation more stable. At the same time, the advantages of PNG-based coding are retained – as before, the same group of neurons can correspond to many different PNGs and participate in coding many features.

Further material is organized in the following way. Since our main method is computer simulation, we describe the models of neurons and network used, and the simulation of the rate-coded input signal. Further, we present an algorithm to find A/P-recognizers in an SNN and a procedure to create a network producing good A/P-recognizers. Result evaluation, discussion, and some plans for the development of this research are included in the final part of the paper.

II. THE NEURONS AND THE NETWORK

We use the simplest functional model of a neuron – a leaky integrate-and-fire (LIF) neuron [11]. The membrane potential u is rescaled so that the rest potential equals 0, while the potential threshold necessary for a neuron to fire equals 1. The excitatory synapse model is current-based, inhibitory synapses are conductance-based. The simplest “instantaneous” synapse model (delta-synapse) is utilized, in which the presynaptic spike effect on the membrane potential, proportional to the synapse weight, is instantaneous and has zero duration. Thus, a presynaptic spike coming to an excitatory synapse with the weight wi increments u by wi, while a spike coming to an inhibitory synapse with the weight wI decreases u by the value (u − EI)(1 − exp(−wI)), where EI is the inhibitory reversal potential. Besides that, u permanently decays with the time constant τ. If the membrane potential reaches the value 1 as a result of an incoming excitatory spike, the neuron fires, the potential is reset to 0, and the neuron becomes unable to fire again during the time interval τR.

Spikes propagate from neuron to neuron with non-zero delays. The distribution of the values of these delays plays a crucial role. It was noted in [10] that the maximum number of potential PNGs exists in networks where neurons are not placed chaotically, but form a certain spatial order. In our model, synaptic delays between excitatory neurons are selected as if the neurons were randomly scattered on a sphere and the delays were proportional to the distance between neurons (the maximum delay equals 30 ms). All connections of inhibitory neurons are very fast – 1 ms. In all other senses, the network is absolutely chaotic – for example, the probability that two given neurons are connected depends only on their types (excitatory or inhibitory).

Excitatory synapses are plastic. The plasticity model is based on STDP [11]. However, it is known that in recurrent SNNs STDP easily leads to instability – active closed circuits of excitatory neurons grow and become more and more active due to the unlimited growth of the weights of their connections caused by STDP. As a result, the constant activity of these circuits may completely suppress all other network activity. In order to prevent this negative scenario of network development, we introduce a homeostatic amendment to the standard STDP model. Namely, the value of long-term synaptic potentiation (LTP) depends on neuron activity. Very active neurons lose the ability to strengthen their synapses. In effect, we introduce a stabilizing negative feedback in the STDP law. It is realized by introducing a new variable η as a part of the neuron state. It equals 1 after a long period of quiescence. When a neuron fires, η is decreased by Δη−. After that, it grows with the constant speed Δη+ until it reaches 1 again. Synaptic weights are limited by the values wmin and wmax because the plasticity law is applied to the value of the so-called synaptic resource W [12] instead of the weight itself. The relationship between these two variables is expressed by the equation

w = wmin + (wmax − wmin)·max(W, 0) / (wmax − wmin + max(W, 0)),   (1)

so when W runs from −∞ to +∞, w runs from wmin to wmax. In the starting network state, the weights of all connections of the same type were initialized with the same values, but due to synaptic plasticity the excitatory weights scattered over the range [wmin, wmax), thus providing the network with some intrinsic structure.

III. THE RATE-CODED INPUT SIGNAL

Finally, let us consider the input signal which we would like to convert. Our model contains special nodes which serve as sources of the external signal. The excitatory and inhibitory neurons are connected to these input nodes via excitatory synapses. The input nodes are a source of a mix of two spike streams – noise and patterns. The noise component is simulated by random generation of spikes with a certain constant mean frequency. Each pattern corresponds to some subset of input nodes. These subsets are of the same size and may intersect. Regularly, with a 200 ms period, the spike frequency on one of these subsets becomes very high (300 Hz). These high-frequency episodes are short (30 ms). The order of the patterns is selected randomly. It is a typical example of a rate-coded signal. The stimulus is expressed by the elevated mean spike frequency, while the exact positions of individual spikes are insignificant.
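As a minimal illustration of the model described above (a sketch, not the actual simulator used in the paper; all numeric parameter values below are arbitrary assumptions), the LIF update with delta-synapses, the refractory period, the homeostatic variable η, and the resource-to-weight mapping of equation (1) could be written as:

```python
import math

# Hypothetical parameter values -- the paper does not fix these numbers.
TAU = 20.0          # membrane decay time constant, ms
TAU_R = 5.0         # refractory period, ms
E_I = -2.0          # inhibitory reversal potential (rescaled units)
W_MIN, W_MAX = 0.0, 1.0
D_ETA_MINUS = 0.1   # drop of eta on each spike
D_ETA_PLUS = 0.001  # recovery speed of eta, per ms

def weight_from_resource(W: float) -> float:
    """Equation (1): map synaptic resource W to a weight in [W_MIN, W_MAX)."""
    m = max(W, 0.0)
    return W_MIN + (W_MAX - W_MIN) * m / ((W_MAX - W_MIN) + m)

class LIFNeuron:
    def __init__(self) -> None:
        self.u = 0.0           # membrane potential (rest = 0, threshold = 1)
        self.refractory = 0.0  # remaining refractory time, ms
        self.eta = 1.0         # homeostatic LTP factor (1 after quiescence)

    def step(self, dt: float, exc_input: float, inh_weights) -> bool:
        """Advance by dt ms. exc_input is the summed weight of arriving
        excitatory spikes; inh_weights are the weights of arriving
        inhibitory spikes. Returns True if the neuron fires."""
        if self.refractory > 0.0:
            self.refractory -= dt
            return False
        self.u *= math.exp(-dt / TAU)            # permanent decay
        self.u += exc_input                      # current-based delta synapses
        for w in inh_weights:                    # conductance-based inhibition
            self.u -= (self.u - E_I) * (1.0 - math.exp(-w))
        if self.u >= 1.0:                        # threshold crossing -> spike
            self.u = 0.0
            self.refractory = TAU_R
            self.eta = max(0.0, self.eta - D_ETA_MINUS)
            return True
        self.eta = min(1.0, self.eta + D_ETA_PLUS * dt)
        return False
```

In this sketch, STDP would be applied to the resource W of each excitatory synapse (scaled by the postsynaptic neuron's eta for the LTP part), while `weight_from_resource` keeps the effective weight bounded, as in the text.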
TABLE II. THE CHARACTERISTICS OF CONVERSION QUALITY DEMONSTRATED BY THE BEST A/P-CONVERTER

Parameter | Minimum selectivity | Mean selectivity | Total number of neurons in all PNGs | Mean PNG size | Maximum PNG size
Minimum   | 2.83                | 161              | 17                                  | 3.5           | 5
Median    | 4.18                | 182              | 31                                  | 4.8           | 8
Maximum   | 8.5                 | 224              | 40                                  | 5.7           | 12
Fig. 1. Network reaction to various patterns. The horizontal axis corresponds to the neuron number, the vertical axis is time. The narrow horizontal strips correspond to the 30 ms episodes of pattern presentation. The period of pattern presentation is 200 ms. The fat colored dots are spikes emitted by the PNGs entering the A/P-recognizer for the uppermost pattern. Different colors correspond to different PNGs.
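To make the detection rule concrete, here is a sketch of testing whether a PNG is active (all its neurons fire at the prescribed relative moments) and whether an A/P-recognizer fires (a threshold number of its PNGs are active, each at its own time shift). The data structures and the jitter tolerance are illustrative assumptions, not the detection algorithm actually used in the experiments.

```python
# A PNG template: list of (neuron_id, delay_ms) pairs; delays are relative
# to the firing time of the group's first neuron.
PNG = list[tuple[int, float]]

def png_active(png: PNG, spikes: dict[int, list[float]],
               t0: float, jitter: float = 1.0) -> bool:
    """True if every neuron of the PNG fired at its prescribed moment
    (t0 + delay), up to a small jitter tolerance."""
    return all(
        any(abs(t - (t0 + delay)) <= jitter for t in spikes.get(nid, []))
        for nid, delay in png
    )

def recognizer_fires(ensemble: list[PNG], spikes: dict[int, list[float]],
                     threshold: int) -> bool:
    """A/P-recognizer: at least `threshold` PNGs of the ensemble are active.
    Each PNG may be anchored at its own time shift, asynchronously."""
    active = 0
    for png in ensemble:
        # try anchoring the PNG on any spike of its first neuron
        first_id, first_delay = png[0]
        anchors = [t - first_delay for t in spikes.get(first_id, [])]
        if any(png_active(png, spikes, a) for a in anchors):
            active += 1
    return active >= threshold
```

Note that `recognizer_fires` deliberately ignores the relative timing between different PNGs of the ensemble: that is exactly the "asynchronous" half of A/P-coding.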
optimal from the point of view of A/P-recognizer accuracy was found to be theoretically intractable. It was only possible to estimate (very roughly) the network parameter boundaries confining the search space [13]. For this reason, numerical optimization on the basis of a genetic algorithm was used. The optimization criterion was the sum of the lengths of the lists S for all the patterns (as a quickly evaluated parameter). The optimized parameters included the number of inhibitory neurons, the numbers of synapses of various kinds, various temporal characteristics (like τ and τR), the STDP rule constants, etc. – 36 parameters in total.

The main result obtained was the successful creation of exact A/P-recognizers for all the patterns. The final population of networks in the genetic algorithm included several networks which contained, in their equilibrium state, exact A/P-recognizers. Interestingly, these best networks had quite different values of the parameters – and therefore, we can suppose that the explored effect does not require careful parameter fine-tuning.

As an illustration, several of the most important parameters of the “winner” of the genetic algorithm are summarized in Table 1.
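The optimization loop described above can be sketched as follows. This is a generic genetic algorithm, not the authors' implementation: the parameter bounds, the population scheme, and especially the fitness function (which in the real study required simulating the network and summing the lengths of the lists S) are placeholder assumptions.

```python
import random

# Placeholder search space: name -> (low, high). The real study optimized
# 36 parameters (neuron counts, synapse numbers, tau, tau_R, STDP
# constants, etc.) within roughly estimated bounds [13].
BOUNDS = {"n_inhibitory": (50, 500), "tau": (5.0, 50.0), "tau_R": (1.0, 10.0)}

def fitness(params: dict) -> float:
    """Stand-in for the real criterion (sum of lengths of the lists S over
    all patterns, obtained by network simulation). Toy objective here."""
    return -abs(params["tau"] - 20.0)

def random_individual() -> dict:
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def mutate(ind: dict, rate: float = 0.2) -> dict:
    """Gaussian mutation of each gene with probability `rate`, clamped."""
    child = dict(ind)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            child[k] = min(hi, max(lo, child[k] + random.gauss(0, (hi - lo) * 0.1)))
    return child

def evolve(generations: int = 30, pop_size: int = 20) -> dict:
    """Truncation selection: keep the better half, refill with mutants."""
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(pop, key=fitness)
```

Replacing `fitness` with a full network simulation is the expensive part; the cheap surrogate criterion mentioned in the text (list lengths S) plays exactly that role.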