
Introduction

Skin plays a vital role in shaping our interactions with the world. We can effortlessly distinguish between a smooth and a hard surface. The ability to restore these capabilities to people with skin damage or amputations could substantially improve their quality of life. Robotic prosthetics can already mimic many of the mechanical properties of biological hands, and adding skin-like sensory capabilities could improve their acceptance and utility among amputees in several ways. First, sensory feedback from a prosthetic limb creates the perception that the prosthesis is part of the user's body, promoting a sense of ownership. Second, tactile feedback could allow more natural and facile operation by restoring information about body positioning (proprioception) and grip forces. A lack of sensory feedback is currently a limiting factor for prosthetic devices.

To restore a natural sense of touch, it is important to understand and mimic the key factors affecting the
sensory properties of biological skin. The sensory receptors in human skin can be classified into seven general
types: pain receptors, cold receptors, warm receptors and four mechanoreceptors that measure mechanical
stimuli. Each of the four types of mechanoreceptors (Fig. 1) measures forces on different time-scales. Slow
adapting receptors (SA-I and SA-II) respond to static pressures; that is, they produce a sustained signal in
response to a sustained stimulus. Fast adapting receptors (FA-I and FA-II) respond to dynamic forces
(derivative of force with respect to time) and vibrations. SA-I receptors are located near the surface of the
skin and respond to skin indentations with high sensitivity. SA-II receptors are located deeper within the skin
and are primarily responsible for measuring skin stretch; therefore, they are important for proprioception.

Peripheral nerve fibers transmit mechanoreceptor responses to second-order neurons in the cuneate nucleus
of the brainstem according to a somatotopic organization. Cuneate neurons decode and re-encode primary
signals prior to their transmission to thalamic and subsequently to cortical areas that mediate downstream
experience-dependent haptic percepts. Ultimately, adaptive control closes the perception–action loop
subserving active haptic sensing, e.g. for object manipulation and recognition.

Although both skin mechanoreceptors and artificial pressure sensors respond to compression, their outputs are quite different. Mechanoreceptive afferents in the skin produce trains of discrete action potentials, or spikes, whose rates represent salient stimulus features such as magnitude or velocity. In contrast, artificial force sensors output a continuous analog signal, typically as changes in resistance or capacitance that represent intensity.

Fig. 1. Skin receptors and the transduction process. (a) Different cells in the epidermis. (b) A schematic of the location of mechanoreceptors in the skin. (c) Types of mechanoreceptors, their function, temporal response and density in the hand. RF, receptive field size; SA-I and SA-II, slow adapting receptors; FA-I and FA-II, fast adapting receptors. (d) A schematic of the steps required for the transduction of sensory stimuli from natural receptors to the brain. The signals from receptors are transported through nerve fibers, with each nerve fiber carrying information from many receptors. The ensemble output of these receptors is interpreted by the brain to give complex information about object size, shape, texture and hardness.

Neuromorphic biorobotics offers the possibility of emulating a biological sense of touch with variable characteristics and design features, so that their effects can be selectively evaluated via artificial spiking implementations. This approach in turn fosters the design of new architectures for artificial tactile sensory systems that enable applications such as human augmentation and rehabilitation. The most recent studies based on MEMS tactile sensors investigate the neuromorphic conversion of raw sensor outputs into spike codes conveying tactile information. Such neuromorphic artificial tactile systems are grounded in the natural coding observed in human microneurographic recordings and have been implemented by reproducing mechanosensor dynamics with spiking neuron models.

The design and implementation of neuromimetic architectures based on neural coding principles for fine active perception may not only help to endow neuroprosthetic devices with biologically plausible sensory feedback; the implementation of neuromorphic sensing technologies based on spike codes may also foster the development of haptic robotics for real-world applications.

The spatiotemporal nature of tactile stimuli, as well as the noisy sensors and environments in which they operate, makes the perception of touch a complex problem. To address these issues, we use a biologically inspired spiking neural network to process the collected tactile information. The present report introduces a tactile neuromorphic system with a sensorized skin. This architecture is not only useful for neuro-robotic applications but also makes it possible to explore different neural mechanisms of tactile sensing in real-world environments.

Materials and Methods:

Promising results have recently been achieved in restoring information about the touch of objects (Tan et al., 2014) and about the level of applied grasping force (Raspopovic, 2014). The restoration of the ability to judge textural features represents the next significant step towards re-establishing the close-to-natural sensory skills of a natural hand. Here, we try to achieve this goal via an integrated approach that mimics natural coding using a neuromorphic, real-time process through a sensorized artificial skin. In this framework, the temporal coding of tactile information was based on a biologically plausible neural model, which has shown promise as a versatile and computationally efficient framework for reproducing a wide range of phenomenological neural responses to stimuli. The investigation of neuromorphic encoding might help to evaluate different candidate mechanisms for tactile information transduction and processing.

The force detected at the sensor, f(t) (shown in Fig. 2), was transformed linearly into a current. Eq. (1) converts the sensor-detected force into a current I(t), in mA:

I(t) = k_1 + k_2 f(t)    (1)

k_1 and k_2 are the intercept constant and static gain, respectively. This linear transformation responds to the magnitude of the force and contributes mainly during the sustained hold. Next, the transmembrane current is transformed into spike times using a leaky integrate-and-fire (LIF) neuron model with a one-to-one connection scheme. This network of LIF neurons acts as a population of cutaneous fingertip mechanoreceptors, converting the analogue skin deformations that follow mechanical indentations into spiking spatiotemporal patterns.
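
As an illustration of this mechano-neural transduction stage, the following C++ sketch converts a force sample into a current via Eq. (1) and feeds it to a single LIF unit. All numerical constants (k_1, k_2, the reset value, refractory period and time step) are illustrative assumptions rather than the tuned parameters of our implementation.

#include <cstdio>
#include <vector>

// Minimal sketch of the mechanoreceptor stage: Eq. (1) plus a leaky
// integrate-and-fire unit. The constants below are illustrative assumptions.
struct LIFNeuron {
    double tau_m  = 20.0;  // membrane time constant (ms), assumed
    double v_thr  = 18.0;  // threshold above rest (mV), assumed
    double v_res  = 0.0;   // reset potential = resting potential
    double tau_rp = 2.0;   // refractory period (ms), assumed
    double v      = 0.0;   // membrane potential
    double refr   = 0.0;   // remaining refractory time

    // One forward-Euler step of tau_m * dv/dt = -v + I; returns true on a spike.
    bool step(double I, double dt) {
        if (refr > 0.0) { refr -= dt; return false; }
        v += dt / tau_m * (-v + I);
        if (v >= v_thr) { v = v_res; refr = tau_rp; return true; }
        return false;
    }
};

// Eq. (1): linear force-to-current conversion, I(t) = k1 + k2 * f(t).
inline double forceToCurrent(double f, double k1, double k2) {
    return k1 + k2 * f;
}

int main() {
    LIFNeuron mechanoreceptor;
    const double dt = 0.1;              // integration step (ms)
    const double k1 = 2.0, k2 = 40.0;   // assumed intercept and static gain
    std::vector<double> spikeTimes;

    for (double t = 0.0; t < 700.0; t += dt) {
        // Rectangular force pulse, roughly like the stimuli used later.
        double f = (t >= 100.0 && t < 400.0) ? 0.5 : 0.0;
        if (mechanoreceptor.step(forceToCurrent(f, k1, k2), dt))
            spikeTimes.push_back(t);
    }
    std::printf("emitted %zu spikes\n", spikeTimes.size());
}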

The development of a realistic spiking mechanoreceptor and of the overall architecture still needs to be consolidated; thus, we do not proceed directly to an artificial neuromorphic implementation in hardware. Rather, we take a soft neuromorphic approach through the LIF model, so that the neuron firing parameters can be tuned. This is a preparatory step toward a future hardwired implementation of the mechano-neural transduction process.

Fig. 2. The sensor-detected force.

Primary afferent signals are processed by second-order neurons in the cuneate nucleus (CN) of the brainstem, the brain's first level of tactile processing, which constitutes the main synaptic relay along the somatosensory pathway from the fingertip to the central nervous system. The functional link between first- and second-order neurons (i.e., mechanoreceptors and cuneate cells) has not been thoroughly investigated, and experimental and computational findings on how information is processed along this pathway are still lacking.

We propose a neurorobotic framework to study neural coding at the different levels of tactile sensing, as shown in Fig. 3. This is a novel neuroengineering framework for robotic applications based on the multistage processing of tactile information. To the best of our knowledge, a unified framework for the multistage neural coding underlying peripheral-to-central transmission of tactile perception, and its use for active exploration and neuroprosthetics, has not been well addressed.

[Fig. 3 diagram: Skin → LIF layer (each gray box is a single LIF neuron that receives the external force and produces an appropriate spike train) → spiking neural network (receives the spike trains and processes them to produce an output signal) → output neurons (e.g., counting the number of spikes to determine the winner neuron).]

Fig. 3. Overview of the neurorobotic framework and the encoding/decoding pathway. We employ tactile stimuli to indent an artificial touch sensor in the sensorized skin. The analogue responses provided by the touch sensor drive a network of LIF neurons that converts the analogue signals into spiking activity, mimicking fingertip mechanoreceptors. A spiking neural network implements a population of brainstem cuneate nucleus (CN) cells, receiving the combined inputs of the biomimetic artificial mechanosensors. Following recent hypotheses on the way tactile information is processed pre-cortically, we structured the connectivity between the layers such that the post-synaptic neurons are able to encode the contact with the object.

The sensorized skin translates the applied stimulus into the injection of current pulses into the nerve, mimicking the neuronal activity recorded during human microneurographic experiments. The stimulus influences the first-spike latencies of the responsive afferents. The substantial divergence and convergence of primary afferents onto second-order neurons, and the dispersion of conduction velocities among afferents, provide parallel processing of the temporally structured information, which aids rapid feed-forward classification of information by temporal-to-spatial conversion.

The spiking neural network consists of LIF neurons. The network model was composed of two neuronal populations, 4 inhibitory interneurons and 16 pyramidal neurons, and was simulated in C++. The network connectivity was random, with a probability of 0.2 for a directed connection between any pair of neurons, so that more than 80 excitatory and inhibitory synapses were present in the network. Each neuron k is described by its membrane potential v_k(t), which evolves as follows:

\tau_m \frac{dv_k}{dt} = -v_k(t) + I_{Ak}(t) - I_{Gk}(t)    (2)

where \tau_m is the membrane time constant (20 ms for excitatory neurons, 10 ms for inhibitory neurons), I_{Ak} is the (AMPA-type) excitatory synaptic current received by neuron k, and I_{Gk} is the (GABA-type) inhibitory current received by neuron k. In (2) the resting potential is zero. When the membrane potential crosses the threshold v_{thr} (18 mV above the resting potential) the neuron fires; its potential then resets to a value v_{res} and the neuron cannot fire again for a refractory time \tau_{rp}. Synaptic currents can be obtained using the auxiliary variables x_{Ak} and x_{Gk}. The AMPA- and GABA-type currents of neuron k are described as follows:

\tau_{dA} \frac{dI_{Ak}}{dt} = -I_{Ak} + x_{Ak}    (3)

\tau_{rA} \frac{dx_{Ak}}{dt} = -x_{Ak} + \tau_m \Big( J_{k-pyr} \sum_{pyr} \delta(t - t_{k-pyr} - \tau_L) + J_{k-ext} \sum_{ext} \delta(t - t_{k-ext} - \tau_L) \Big)    (4)

\tau_{dG} \frac{dI_{Gk}}{dt} = -I_{Gk} + x_{Gk}    (5)

\tau_{rG} \frac{dx_{Gk}}{dt} = -x_{Gk} + \tau_m J_{k-int} \sum_{int} \delta(t - t_{k-int} - \tau_L)    (6)

where t_{k-pyr}, t_{k-int} and t_{k-ext} are the times of the spikes received by neuron k from the connected pyramidal neurons, interneurons or the external input. \tau_{dA} (\tau_{dG}) and \tau_{rA} (\tau_{rG}) are, respectively, the decay and rise times of the AMPA-type (GABA-type) synaptic current. \tau_L = 1 ms is the latency of the post-synaptic currents. J_{k-pyr}, J_{k-int} and J_{k-ext} are the efficacies of the connections from pyramidal neurons, interneurons and external inputs, respectively, onto the population of neurons to which k belongs. The external input, and the largest part of the noise, is assumed to come from the thalamus. The values of these parameters for all types of synapses are taken from Mazzoni et al., 2008. The membrane potential is computed by solving the ordinary differential equations numerically with the fourth-order Runge-Kutta method.
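
To make the integration scheme concrete, the C++ sketch below advances a single neuron of the model, i.e. Eqs. (2)-(6) for one k, with a classical fourth-order Runge-Kutta step; each incoming spike is treated as an instantaneous increment of the auxiliary variable by \tau_m J / \tau_r (obtained by integrating the delta term), which is one common way to handle Eqs. (4) and (6). The synaptic time constants, efficacies and refractory period in the code are placeholders standing in for the values of Mazzoni et al., 2008, and the fragment is a simplified illustration rather than the full simulation used here.

#include <array>
#include <cstddef>
#include <cstdio>
#include <vector>

// Simplified single-neuron integration of Eqs. (2)-(6). Several parameter
// values below are placeholders, not those of the actual network.
struct Params {
    double tau_m  = 20.0;                 // membrane time constant (ms), excitatory
    double tau_dA = 2.0,  tau_rA = 0.4;   // AMPA decay / rise (ms), assumed
    double tau_dG = 5.0,  tau_rG = 0.25;  // GABA decay / rise (ms), assumed
    double v_thr  = 18.0, v_res  = 0.0;   // threshold above rest / reset (mV)
    double tau_rp = 2.0;                  // refractory period (ms), assumed
    double tau_L  = 1.0;                  // synaptic latency (ms)
    double J_exc  = 0.5;                  // excitatory efficacy, assumed
};

// State y = {v, I_A, x_A, I_G, x_G}; Eqs. (2)-(6) without the delta terms.
using State = std::array<double, 5>;
State deriv(const State& y, const Params& p) {
    return { (-y[0] + y[1] - y[3]) / p.tau_m,   // dv/dt,   Eq. (2)
             (-y[1] + y[2]) / p.tau_dA,         // dI_A/dt, Eq. (3)
             (-y[2]) / p.tau_rA,                // dx_A/dt, Eq. (4) between spikes
             (-y[3] + y[4]) / p.tau_dG,         // dI_G/dt, Eq. (5)
             (-y[4]) / p.tau_rG };              // dx_G/dt, Eq. (6) between spikes
}

// One classical fourth-order Runge-Kutta step of size dt.
void rk4Step(State& y, double dt, const Params& p) {
    State k1 = deriv(y, p), y2, y3, y4;
    for (int i = 0; i < 5; ++i) y2[i] = y[i] + 0.5 * dt * k1[i];
    State k2 = deriv(y2, p);
    for (int i = 0; i < 5; ++i) y3[i] = y[i] + 0.5 * dt * k2[i];
    State k3 = deriv(y3, p);
    for (int i = 0; i < 5; ++i) y4[i] = y[i] + dt * k3[i];
    State k4 = deriv(y4, p);
    for (int i = 0; i < 5; ++i)
        y[i] += dt / 6.0 * (k1[i] + 2.0 * k2[i] + 2.0 * k3[i] + k4[i]);
}

int main() {
    Params p;
    State y{};                 // v = I_A = x_A = I_G = x_G = 0 (resting state)
    double refr = 0.0;
    const double dt = 0.05;    // ms
    // Example presynaptic excitatory spike times (ms) from the input layer.
    std::vector<double> excSpikes = {10, 12, 15, 20, 22, 30, 31, 40};
    std::size_t next = 0;

    for (double t = 0.0; t < 100.0; t += dt) {
        // Delta terms of Eq. (4): a spike arriving at t_spike + tau_L
        // increments x_A by tau_m * J / tau_rA.
        while (next < excSpikes.size() && excSpikes[next] + p.tau_L <= t) {
            y[2] += p.tau_m * p.J_exc / p.tau_rA;
            ++next;
        }
        rk4Step(y, dt, p);
        if (refr > 0.0) { refr -= dt; y[0] = p.v_res; continue; }
        if (y[0] >= p.v_thr) {                     // threshold crossing
            std::printf("CN neuron spike at %.2f ms\n", t);
            y[0] = p.v_res;
            refr = p.tau_rp;
        }
    }
}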

Simulation results
Research on the central processing of various sensory modalities indicates that the precise timing of neural
discharges can carry more information than firing rates alone. The significance of first spikes, in particular,
has been emphasized for the auditory, visual and somatosensory systems. However, the existence of effective
codes based on spike timing does not exclude the possibility that average firing rates also carry information
in neural networks. The lack of a consistent relationship between the latencies for response onsets and firing
rates in tactile afferents suggests that these two codes in fact provide independent information about tactile
events. It is possible that different codes are used by different processes and by the different pathways that
use tactile afferent information. For example, relative spike timing may primarily support fast stimulus
classification in the control of action, which operates on rapidly varying signals. Firing rates, by contrast,
might preferentially support perceptual mechanisms that operate under less time pressure and, often, on
steadier signals. Furthermore, the fact that the two codes seem to convey similar information but in apparently
independent ways suggests that they represent complementary monitoring systems. This might be useful for
learning, verifying and upholding the function of various control processes.

The tactile stimuli are decoded from the features of the spike trains emitted by a single channel. In the first set of simulations, we focus on the firing-rate coding paradigm. We argue that the design of the somatosensory pathways could enable rapid classification of tactile stimuli by temporal-to-spatial conversion at the level of second-order neurons (in the cuneate nucleus and spinal cord), which may function as coincidence detectors.
Our first quantitative comparison of the responses of the cuneate neurons is based on their response amplitudes, i.e., the total number of evoked spikes, for the different stimulation patterns. In this respect, the neurons form complementary distributions. Hence, it is possible to segregate the neurons to some degree solely from the crude information given by the total number of spikes in the responses. For each stimulus type, the higher speeds tended to generate responses with more spikes.
Results for stimulation of the sensor located in the second row and second column of the 3×3 sensorized skin are shown in Fig. 4. In this simulation we stimulated one of the sensors with rectangular pulses of different pulse widths and fixed amplitude. As can be seen, the firing rate of the corresponding neurons in the output layer increases, and the winner neuron can be determined simply by counting the number of spikes over the duration of the stimulation. Indeed, in this case we apply a fixed force with different widths, and the firing activities of the output spiking neurons increase accordingly. The color plot is obtained by summing the number of spikes generated by the neurons for the corresponding row and column.
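
As a concrete illustration of this spike-count readout, the following C++ sketch sums the output spikes along the rows and columns of a 3×3 array and reports the row/column pair with the largest totals as the winner; the layout follows the sensorized skin described above, while the spike counts themselves are invented example values.

#include <array>
#include <cstdio>

// Spike-count ("winner-take-all") readout for a 3x3 sensor array: the winning
// row and column are those whose output neurons emitted the most spikes over
// the stimulation window. The counts below are invented example data.
int main() {
    const std::array<std::array<int, 3>, 3> spikeCount = {{
        { 40,  55,  38 },
        { 60, 230,  72 },   // sensor at row 2, column 2 was stimulated
        { 35,  66,  41 }
    }};

    std::array<int, 3> rowSum{}, colSum{};
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c) {
            rowSum[r] += spikeCount[r][c];
            colSum[c] += spikeCount[r][c];
        }

    int winRow = 0, winCol = 0;
    for (int i = 1; i < 3; ++i) {
        if (rowSum[i] > rowSum[winRow]) winRow = i;
        if (colSum[i] > colSum[winCol]) winCol = i;
    }
    std::printf("winner: row R%d, column C%d\n", winRow + 1, winCol + 1);
}
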
The firing rates of primary somatosensory cortical neurons in response to gratings and Braille-like dot patterns have been shown to be related to the perceived roughness of these stimuli (Chapman et al., 2002). This result is compatible with the finding that vibratory amplitude is encoded in cortical firing rates (Harvey et al., 2013) and is strongly related to perceived roughness.

[Fig. 4 panels: input force versus time (ms); spike number of each output neuron; summed spike counts per row (R1-R3) and column (C1-C3).]

Fig. 4. Stimulating the sensorized skin. Left panels: the amplitude is fixed and the pulse width is increased. Middle panels: the firing rate of the output spiking neurons. Right panels: the sum of the spikes generated by the output neurons for the corresponding row and column.

[Fig. 5 panels: input force versus time (ms); spike number of each output neuron; summed spike counts per row (R1-R3) and column (C1-C3).]

Fig. 5. Stimulating the sensorized skin. Left panels: the pulse width is fixed and the amplitude is increased. Middle panels: the firing rate of the output spiking neurons. Right panels: the sum of the spikes generated by the output neurons for the corresponding row and column.

We found that, on average, different stimuli elicited responses that differed in both firing rate and firing pattern. This suggests that neuromorphic spike trains might contain enough information to convey adequate sensory feedback to upper-limb neuroprostheses and to be useful in robotic applications. Finally, this work provides a basis for the design and implementation of modular neuromimetic systems for touch discrimination in robotics and neuroprosthetics.

Future/related works:

We explained that spike patterns are not only rich enough to allow correct off-line decoding of naturalistic stimuli, but that it is also possible to extend this approach to process stimuli in real time via a neuro-bio-inspired architecture. This biomimetic approach makes it possible to decode stimuli while the tactile data stream is being gathered, rather than at the end of the process. Here we discuss future work:

1- The artificial tactile sensing system developed so far does not support learning. Only a few studies have addressed learning issues, so extending current neuromorphic models to include the learning abilities of natural touch sensing (especially unsupervised learning) is of great importance. Considering Fig. 6, we can propose Fig. 7, which shows a unified framework for the multistage neural coding underlying peripheral-to-central transmission of tactile perception.

In this framework, excitatory connections are subject to spike-timing-dependent plasticity (STDP). Such a structured SNN can learn hand-movement patterns across the sensorized skin and is also able to discriminate different movements. This has various applications in neuro-robotic platforms and neuroprostheses.
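
The C++ fragment below sketches a standard pair-based STDP rule of the kind referred to here; the time constants, learning rates and weight bound are illustrative assumptions, not values from this work.

#include <algorithm>
#include <cmath>

// Pair-based STDP for one excitatory synapse: potentiate when the presynaptic
// spike precedes the postsynaptic one, depress otherwise. All constants are
// illustrative assumptions.
struct STDPSynapse {
    double w        = 0.5;    // synaptic weight
    double wMax     = 1.0;    // upper bound on the weight
    double aPlus    = 0.01;   // potentiation learning rate
    double aMinus   = 0.012;  // depression learning rate
    double tauPlus  = 20.0;   // potentiation time constant (ms)
    double tauMinus = 20.0;   // depression time constant (ms)
    double lastPre  = -1e9;   // time of the last presynaptic spike (ms)
    double lastPost = -1e9;   // time of the last postsynaptic spike (ms)

    // Call when the presynaptic neuron fires at time t: pre after post -> depression.
    void onPreSpike(double t) {
        lastPre = t;
        w = std::max(0.0, w - aMinus * std::exp(-(t - lastPost) / tauMinus));
    }
    // Call when the postsynaptic neuron fires at time t: pre before post -> potentiation.
    void onPostSpike(double t) {
        lastPost = t;
        w = std::min(wMax, w + aPlus * std::exp(-(t - lastPre) / tauPlus));
    }
};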

Fig. 6. Schematic illustration of the position of the cuneate nucleus neurons in relation to the primary sensory afferents from the skin and the structures of the brain's somatosensory system. Even though multiple afferents of the same submodality innervate the same skin area, each primary afferent may still carry a specific "flavor" of the tactile event, e.g., depending on the shape of a touched object, as suggested by the color code in the figure. When a particular skin region is stimulated, a number of primary afferents innervating this region are activated and generate excitatory postsynaptic potentials (EPSPs) in all the cuneate cells that receive input from that same skin region.

[Fig. 7 diagram: Sensorized Skin → Cuneate Neurons → Thalamic Neurons (256 excitatory) → Somatosensory Cortex (544 excitatory, 136 inhibitory).]

Fig. 7. The SNN simulation was composed of three populations of neurons. One population represents the cuneate neurons, as discussed above. A population of 544 excitatory neurons and 136 inhibitory neurons forms a spiking neural network that loosely corresponds to primary somatosensory cortex, and a population of 256 excitatory input neurons simulates thalamic neurons relaying touch information to the simulated cortex. Each neuron in the somatosensory cortex received more than 100 synapses, randomly chosen from both the input neurons and other somatosensory cortex neurons. These connections had delays ranging from 1 ms to 10 ms to mimic the variation in axonal conduction observed in cortex.
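
As a sketch of how such connectivity could be generated, the C++ fragment below draws, for each cortical neuron, a set of presynaptic partners at random from the thalamic and cortical populations and assigns each connection an integer delay between 1 and 10 ms; the population sizes follow the caption above, while the in-degree of 120 and the uniform sampling are assumptions made for illustration.

#include <cstdio>
#include <random>
#include <vector>

// Sketch of the random connectivity described for Fig. 7: 256 thalamic input
// neurons and a cortical population of 544 excitatory + 136 inhibitory
// neurons; each cortical neuron receives >100 synapses drawn at random from
// the input and cortical populations, with delays of 1-10 ms.
struct Synapse {
    int source;      // presynaptic index (0-255 thalamic, 256-935 cortical)
    int target;      // postsynaptic cortical index (256-935)
    double delayMs;  // conduction delay in milliseconds
};

int main() {
    const int nThalamic = 256, nCortExc = 544, nCortInh = 136;
    const int nTotal = nThalamic + nCortExc + nCortInh;   // 936 neurons
    const int inDegree = 120;                              // assumed (>100)

    std::mt19937 rng(42);
    std::uniform_int_distribution<int> pickSource(0, nTotal - 1);
    std::uniform_int_distribution<int> pickDelay(1, 10);   // 1-10 ms

    std::vector<Synapse> synapses;
    for (int post = nThalamic; post < nTotal; ++post) {    // cortical targets only
        for (int s = 0; s < inDegree; ++s) {
            int pre = pickSource(rng);
            if (pre == post) { --s; continue; }            // no self-connections
            synapses.push_back({pre, post, static_cast<double>(pickDelay(rng))});
        }
    }
    std::printf("built %zu synapses onto %d cortical neurons\n",
                synapses.size(), nCortExc + nCortInh);
}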

2- Neural prostheses, such as cochlear and retinal implants, sense the environment using artificial sensors, convert the data from these sensors into neural signals, and apply a pattern of electrical stimulation to a neural epithelium. For the sense of touch, the hand is the principal sensory organ, and feedback from the hand is critical for the manipulation of objects. In a sensorized prosthetic limb, force signals from sensors on the fingertips of the prosthesis are converted by the model described in this report (which can also be fabricated in hardware) into spike trains. The spike trains are then delivered to the residual nerve fibers through electrical stimulation. The detailed structure of the proposed system is shown in Fig. 8.

[Fig. 8 diagram blocks: computational neuroscience; spiking neural network on SpiNNaker/FPGA; spike trains; sensor-detected force; sensorized skin; electrical stimulation; stimulating the amputated limb; tactile perception; EEG analysis; neuro-robotic applications.]

Fig. 8. The different pathways of research in artificial neuromorphic tactile sensing. The SNN can be implemented in hardware on an FPGA or SpiNNaker board to investigate different bio-inspired algorithms of tactile sensing derived from computational neuroscience. The resulting spike trains can be considered separately to address robotic applications and artificial tactile perception, or can be used as electrical stimulation pulses delivered to an amputated and/or intact limb to investigate the performance of a touch-sensing prosthesis.

