Disha Sharma
dishasharma210@gmail.com
+91-7837448470

Abstract - Although the nervous system and nervous cells have been studied since ancient times, almost all the important advances in the knowledge base took place over the past century, as neuroscience evolved and developed more complex and detailed neuron models. However, engineering applications were – and still are – practically limited to the basic neuron models and their variations. In traditional artificial neural networks, neuron behavior is described only in terms of firing rate, while most real neurons, commonly known as spiking neurons, transmit information by pulses, also called action potentials or spikes. From these considerations a major question arises.

In the last ten years, all the main questions formerly addressed by neuroscientists only through cellular biology began to be effectively explored at the molecular level. Research in molecular biology contributed to the knowledge about ion channels and receptors, two important elements in neural signaling, making it possible to describe the first molecular structure of an ionic channel [KSJ00]. This enhanced capacity of neuronal modeling came to shed some light on questions like: How do nervous cells communicate among themselves? How is this communication modified by experience? How do different interconnection patterns originate different perceptions and motor actions?
II. INTRODUCTION

Decades have passed since the introduction of Artificial Neural Networks (ANN), and the fairly old technique is still on the path of improvement, coming up with new generations of neuron models. McCulloch-Pitts gave the first generation of ANN, based on a simple threshold neuron model: if the sum of its weighted incoming signals rises above a threshold value, the neuron sends a binary 'high' signal. Although these neurons can only give digital output, they have successfully been implemented in ANNs like multi-layer perceptrons and Hopfield nets.

The next generation used continuous activation functions like the sigmoid and hyperbolic tangent, making them suitable for analog input and output, e.g. feed-forward and recurrent neural networks. Neuron models of both generations do not employ individual pulses; their output typically lies between 0 and 1. These signals can be seen as normalized firing rates of the neuron within a certain period of time, a scheme called rate coding, where a high firing rate corresponds to a high output signal.

Fig.2. shows a spiking neuron with multiple synapse inputs [2]. The inputs comprise a sequence of unit-amplitude digital spikes of very short duration. A sequence of inputs entering a synapse is called a spike train. The inter-spike times and their sequence order form the input, and the response function is based on the Leaky Integrate-and-Fire Neuron (LIFN) model.

One of the most exciting characteristics of spiking neural networks, with the potential to create a step-change in our knowledge of neural computation, is that they are embedded in time (Maass 2001). Spike latencies, axonal conduction delays, refractory periods, neuron resonance and network oscillations all give rise to an intrinsic ability to process time-varying data in a more natural and computationally powerful way than is available to second-generation models.

A spiking neuron can be differentiated from a traditional artificial neuron as follows [3]:
1) Both combinational as well as inter-spike information can be processed by a spiking neuron.
2) A feedback connection is not required by a spiking neuron for sequential input learning.
3) The only requirement is a local timing reference.
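The leaky integrate-and-fire behavior described above can be sketched in a few lines of Python. This is a minimal illustration only: the weights, leak factor and threshold are arbitrary choices, not values from the paper.

```python
# Leaky Integrate-and-Fire (LIF) sketch: unit-amplitude input spikes are
# weighted and summed into the membrane potential u, which leaks between
# time steps; the neuron emits a spike and resets when u crosses the
# threshold. All constants are illustrative.

def lif_response(spike_trains, weights, threshold=1.0, leak=0.9):
    """spike_trains: list of 0/1 sequences of equal length, one per synapse."""
    u = 0.0
    output = []
    for t in range(len(spike_trains[0])):
        u = leak * u + sum(w * train[t] for w, train in zip(weights, spike_trains))
        if u >= threshold:
            output.append(1)
            u = 0.0                      # reset after firing
        else:
            output.append(0)
    return output

# two input spike trains entering two synapses
out = lif_response([[1, 0, 1, 0], [0, 1, 1, 0]], weights=[0.6, 0.6])
# out == [0, 1, 1, 0]: the neuron fires only when recent inputs coincide
```

Note how the output spikes depend on the timing of the inputs, not merely on their total count, which is exactly the inter-spike information mentioned in point 1) above.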
Networks of the earlier generations have proven effective at modeling some cognitive processes and have been successful in many engineering applications. However, the fidelity of these models with regard to neurophysiological data is minimal, and this has several drawbacks.
Neuron models can be grouped into four main classes as follows:

A. Threshold Fire Models
These models are based on the temporal summation of the contributions to the membrane potential u(t) received from all presynaptic neurons. If this contribution exceeds a threshold θ, the postsynaptic neuron will fire. Two main models can be considered in this category: the Integrate and Fire (I&F) model and the Spike Response Model (SRM).

B. Conductance-Based Models
This is a quite complex class of neuron models, based on the simulation of the intricate behavior of ionic channels. As these channels open and close, their conductance changes accordingly, yielding a set of differential equations describing the process. The variations among the models in this class are mostly due to the choice of channels used and the parameters of the resulting differential equations.
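The threshold-fire scheme — summing contributions to the membrane potential u(t) from all presynaptic spikes and firing when the sum exceeds θ — can be illustrated with a minimal Spike Response Model sketch. The alpha-shaped kernel and all constants here are assumptions for illustration, not values from any cited model.

```python
import math

# Spike Response Model (SRM) sketch: the membrane potential u(t) is a
# weighted sum of kernel responses to presynaptic spikes; the neuron
# fires when u(t) exceeds the threshold theta. Kernel shape and all
# constants are illustrative assumptions.

def epsilon(s, tau=2.0):
    """Postsynaptic potential kernel: zero before the spike, rising then decaying."""
    return (s / tau) * math.exp(1.0 - s / tau) if s > 0 else 0.0

def srm_potential(t, spike_times, weights):
    """spike_times: one list of presynaptic firing times per input neuron."""
    return sum(w * sum(epsilon(t - tf) for tf in times)
               for w, times in zip(weights, spike_times))

theta = 0.8
u = srm_potential(3.0, [[1.0], [2.0]], weights=[0.5, 0.5])
fires = u >= theta                        # True: the summed PSPs cross theta
```

The I&F model can be obtained from the same structure by integrating the input current directly instead of summing precomputed kernels.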
C. Compartmental Models
Compartmental models better capture the complexity of a neuron, taking into account the spatial structure of the dendritic tree, and also model synaptic transmission at a greater level of detail. With this approach, it is possible to consider other ion currents beyond the sodium and potassium currents incorporated in the Hodgkin-Huxley model.

Fig.6. Configuration of a Perceptron
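A toy two-compartment model conveys the compartmental idea: one dendritic and one somatic compartment, each with a passive leak, coupled by a conductance. All constants are invented for illustration.

```python
# Two-compartment sketch of a compartmental neuron model: a dendritic
# compartment receives input current and is coupled to the somatic
# compartment through a coupling conductance g_c; each compartment
# also leaks toward rest. All constants are illustrative assumptions.

def step(u_soma, u_dend, i_dend, dt=0.1, tau=10.0, g_c=0.5):
    """One forward-Euler step of the coupled membrane equations."""
    du_soma = -u_soma / tau + g_c * (u_dend - u_soma)
    du_dend = -u_dend / tau + g_c * (u_soma - u_dend) + i_dend
    return u_soma + dt * du_soma, u_dend + dt * du_dend

u_s, u_d = 0.0, 0.0
for _ in range(100):                     # inject current into the dendrite
    u_s, u_d = step(u_s, u_d, i_dend=1.0)
# the dendritic input depolarizes the soma through the coupling
```

Extending this to many compartments along the dendritic tree, each with its own set of ionic currents, yields the detailed cable-like models this class refers to.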
Spike coding        Description                                        Information capacity (bits)
Count coding        counts the total number of spikes                  log2(n+1)
Binary coding       the firing pattern forms a binary number           n
Rank order coding   the order in which the neurons fire                log2(n!)
Delay coding        the order and the delays both encode information   n.log2(T/p)

Each of these coding strategies has a very different information capacity, summarized in the table. In the table, n is the number of neurons under consideration, T is the time window over which each neuron can fire either 0 or 1 spikes, and p is the precision, i.e. the minimum inter-spike interval which can be discerned by postsynaptic neurons.

Approaches based on temporal coding were proposed in [8, 9, 10, 11, 12, 13, 14], and their efficiency was found comparable with popular sigmoidal neural networks.

An engineering application [15] employs the coincidence detection property of spiking neurons and a Hebbian-based delay shifting rule. The application considered is control chart pattern recognition. Control chart patterns indicate the state of a process being monitored and can be utilized to detect abnormal behavior of the process. Spiking neurons can act as coincidence detectors for incoming pulses and can detect coincidence of the input signals with ease, unlike classical neural networks where this is computationally expensive to realize [16].

The network architecture shown in [15] is a simple two-layered, fully connected feed-forward network. The input layer has as many neurons as there are input parameters. The output layer is constructed with coincidence-detecting spiking neurons. Each feed-forward connection is assigned a random weight and a delay value. Connection delays are adapted through a Hebbian-based rule which enables the inputs from a class of inputs to coincide at some group of neurons.
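As a quick check of the capacity formulas in the coding-strategy table, they can be evaluated for sample values of n, T and p (the sample values are arbitrary):

```python
import math

# Information capacity (bits) of the spike coding strategies, for
# n neurons, time window T and precision p. Sample values are arbitrary.

def count_coding(n):
    return math.log2(n + 1)

def binary_coding(n):
    return n

def rank_order_coding(n):
    return math.log2(math.factorial(n))

def delay_coding(n, T, p):
    return n * math.log2(T / p)

n, T, p = 8, 16.0, 1.0
caps = {
    "count": count_coding(n),            # log2(9)  ~ 3.17 bits
    "binary": binary_coding(n),          # n = 8 bits
    "rank order": rank_order_coding(n),  # log2(8!) ~ 15.3 bits
    "delay": delay_coding(n, T, p),      # 8 * log2(16) = 32 bits
}
```

Even for this small example, delay coding carries the most information per neuron, which is one motivation for the temporal-coding approaches cited above.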
V. SPIKING NEURAL NETWORKS APPLIED TO INTRUSION DETECTION

Any attempt to compromise the integrity, confidentiality or availability of a resource is called an intrusion. Researchers have developed intrusion detection systems for various environments, depending upon the security concerns of different networks. The function of an Intrusion Detection System [4] is to gather and analyze information from various areas within a computer or a network to determine all possible security breaches.

Intrusion detection systems can be of two types: signature based and anomaly based. Signature detection systems are based on pattern matching, i.e. they try to match observed scenarios against already recorded signatures from a database, while anomaly detection techniques create a baseline profile of the normal system and compare the behavior of incoming data against it; any deviation from the normal data is considered to be an anomaly. Both approaches have their own pros and cons. In signature detection, known attacks can be detected reliably with low false positive rates, but the major drawback is that such systems require a timely and continuously updated database of signatures for all possible attacks against a network. On the other hand, anomaly detection has two major advantages over signature detection.

Fig.7. Pattern Detection by a Spiking Neural Network
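The anomaly-based approach — building a baseline profile of normal behavior and flagging deviations from it — can be sketched with a simple mean/standard-deviation profile. The traffic feature, sample values and threshold k are invented for illustration.

```python
import statistics

# Anomaly-based detection sketch: build a baseline profile (mean and
# standard deviation of a feature of normal traffic), then flag any
# observation deviating by more than k standard deviations. Feature,
# values and threshold are illustrative assumptions.

def build_profile(normal_samples):
    return statistics.mean(normal_samples), statistics.stdev(normal_samples)

def is_anomaly(value, profile, k=3.0):
    mean, std = profile
    return abs(value - mean) > k * std

normal_traffic = [100, 98, 103, 101, 99, 102, 97, 100]   # e.g. packets/sec
profile = build_profile(normal_traffic)                   # mean 100, stdev 2

alerts = [v for v in [101, 150, 99] if is_anomaly(v, profile)]
# alerts == [150]: only the large deviation is flagged
```

Unlike a signature database, this profile needs no catalogue of known attacks, which is why anomaly detection can flag previously unseen behavior.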
VII. DISCUSSION

The figures show how the outlined encoding allows for increased capacity: for example, by encoding variables with more neurons, many different clusters can be separated. In Fig.8 the input has two separately encoded variables, and a network with 24 input neurons was easily capable of correctly classifying 17 evenly distributed clusters, demonstrating a significant increase in clustering capacity [17]. After presenting 750 randomly chosen data points, all 1275 cluster points were correctly classified; Fig.9 shows correct clustering of less regular input.

REFERENCES

[1] J. Vreeken, "Spiking Neural Networks, an Introduction", Adaptive Intelligence Laboratory, Institute of Information and Computing Sciences, Utrecht University.
[3] T. Ichishita and R. H. Fujii, "Performance Evaluation of a Temporal Sequence Learning Spiking Neural Network", 7th International Conference on Computer and Information Technology, IEEE, pp. 616-620, 2007.
[7] W. Maass, "Noisy spiking neurons with temporal coding have more computational power than sigmoidal neurons", Advances in Neural Information Processing Systems, vol. 9, MIT Press, Cambridge, pp. 211-217, 1997.
[8] B. Ruf and M. Schmitt, "Self-organization of spiking neurons using action potential timing", IEEE Transactions on Neural Networks, 9(3), pp. 319-332, 1998.
[13] X. Tao and H. E. Michel, "Data clustering via spiking neural networks through spike timing dependent plasticity", IC-AI, pp. 168-173, 2004.