
(IJAEST) International Journal of Advanced Engineering Sciences and Technologies, Vol. 2, Issue 1, pp. 036 - 042

A Novel Approach to Network Intrusion Detection using Spiking Neural Networks
Disha Sharma*
Student, M.E. (I.T.) 2nd yr.
University Institute of Engineering and Technology
Panjab University
Chandigarh, India
dishasharma210@gmail.com

Veenu Mangat
Assistant Professor (I.T.)
University Institute of Engineering and Technology
Panjab University
Chandigarh, India
veenumangat@yahoo.com

* Corresponding author: Disha Sharma, dishasharma210@gmail.com, +91-7837448470

Abstract - Although the nervous system and nervous cells have been studied since ancient times, almost all the important advances in this knowledge base took place over the past century, when neuroscience evolved and developed more complex and detailed neuron models. However, engineering applications were – and still are – practically limited to the basic neuron models and their variations. In traditional artificial neural networks, the neuron behavior is described only in terms of firing rate, while most real neurons, commonly known as spiking neurons, transmit information by pulses, also called action potentials or spikes. From these considerations a major question arises immediately: if we were able to build powerful applications in all fields of engineering using these simple models, what would it be possible to do with the more complex models? This question is the fundamental motivation of the present work. Given the importance of more realistic neuron models, our main objective is to present a general and comprehensive overview of spiking neurons, ranging from biological neuron features to examples of practical applications. The aim of the present work is therefore to highlight what we believe will be one of the main components of future computing machines: the spiking neuron. Besides the review itself, we also present a novel approach to the spiking neural network architecture used in intrusion detection.

Keywords - Spiking Neural Networks, Intrusion detection, data clustering, time-varying inputs, Spike coding.

I. BACKGROUND

Although the nervous system and nervous cells have been studied since ancient times, almost all the important advances in this knowledge base were founded on five experimental disciplines: anatomy, embryology, physiology, pharmacology, and psychology [KSJ00]. Then, beginning in the middle of the past century, engineering was added as a sixth discipline and, reciprocally, neuroscience was also adopted by engineering, which led to the development of computational intelligence and made this subject a rather interdisciplinary one.

In the last ten years, all the main questions formerly addressed by neuroscientists only at the level of cellular biology began to be effectively explored at the molecular level. Research in molecular biology contributed to the knowledge about ion channels and receptors, two important elements in neural signaling, making it possible to describe the first molecular structure of an ionic channel [KSJ00]. This enhanced capacity for neuronal modeling came to shed some light on questions like: How do nervous cells communicate among themselves? How is this communication modified by experience? How do different interconnection patterns originate different perceptions and motor actions?

From an engineering point of view, it is clear that answers to all these questions will only be possible with a deeper comprehension of the biological neuron and of how it can perform fast and reliable computation [MB98]. Although much progress has been achieved in the last two decades, there are still a few fundamental questions, like how real neurons transmit information or how to use spike timing efficiently to process information [Nat96, GK02b].

The present knowledge of the nervous system has reached an enormous level of detail, making it impossible even to summarize. Therefore, we limit ourselves to mentioning only the main components with a brief description of their basic properties. We concisely present some of the most used theoretical neuron models, from the simple ones, like the integrate-and-fire, to the more complex ones, like the compartmental models. In real nervous systems there is a wide range of neuron types, each one assigned a specific function, making it virtually impossible to create a single model that meets all the requirements.

In this work we intend to review spiking neurons, presenting a comprehensive scenario of the foundations and their application to intrusion detection using clustering methods.

II. INTRODUCTION

Decades have passed since the introduction of artificial neural networks (ANNs), and this fairly old technique is still on a path of improvement, coming up with new generations of neuron models. McCulloch and Pitts gave the first generation of ANNs, based on a simple threshold neuron model: if the sum of its weighted incoming signals rises above a threshold value, the neuron sends a binary 'high' signal. Although these neurons can only give digital output, they have successfully been implemented in ANNs such as multi-layer perceptrons and Hopfield nets.

The next generation used continuous activation functions such as the sigmoid and the hyperbolic tangent, making them suitable for analog input and output, e.g. in feed-forward and recurrent neural networks. Neuron models of both generations do not employ individual pulses; their output typically lies between 0 and 1. These signals can be seen as normalized firing rates of the neuron within a certain period of time, a scheme called rate coding, in which a high firing rate corresponds to a high output signal.

Fig.1. Artificial Neuron model

The third generation [1] raises the level of biological realism by using individual spikes, incorporating spatio-temporal information in communication and computation like real neurons. Instead of rate coding, these neurons use a pulse coding mechanism, in which neurons receive and send out individual pulses, allowing multiplexing of information such as the frequency and amplitude of sound.

Fig.2. Spiking Neuron with spike train inputs

Figure 2 shows a spiking neuron with multiple synaptic inputs [2]. The inputs comprise sequences of unit-amplitude digital spikes of very short duration. A sequence of spikes entering a synapse is called a spike train. The inter-spike times and their sequence order form the input, and the response function is based on the Leaky Integrate-and-Fire Neuron (LIFN) model.
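To make the contrast concrete, the following minimal sketch (our own illustration, with made-up numbers) shows what a spike train on one synapse looks like compared with the single scalar a rate-coded neuron would receive: the inter-spike intervals and their order carry information that the rate value discards.

```python
import numpy as np

# A spike train on one synapse: a sorted list of spike times (ms). Values are made up.
spike_train = np.array([2.0, 7.5, 9.0, 15.5, 21.0])
window = 25.0                                   # observation window (ms)

rate_code = len(spike_train) / window           # all that a 2nd-generation model keeps
inter_spike_intervals = np.diff(spike_train)    # timing structure a spiking neuron can exploit

print("rate code (spikes/ms):", rate_code)
print("inter-spike intervals (ms):", inter_spike_intervals)
```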

One of the most exciting characteristics of spiking neural networks, with the potential to create a step-change in our knowledge of neural computation, is that they are embedded in time (Maass 2001). Spike latencies, axonal conduction delays, refractory periods, neuron resonance and network oscillations all give rise to an intrinsic ability to process time-varying data in a more natural and computationally powerful way than is available to 2nd generation models.

A spiking neuron can be differentiated from a traditional artificial neuron as follows [3]:

1) Both combinational as well as inter-spike information can be processed by a spiking neuron.
2) A feedback connection is not required by a spiking neuron for sequential input learning.
3) The only requirement is a local timing reference.

Networks of the earlier generations have proven effective at modeling some cognitive processes and have been successful in many engineering applications. However, the fidelity of these models with regard to neurophysiological data is minimal, and this has several drawbacks.

• Neurophysiological knowledge cannot be integrated easily into the models and as such cannot be tested for applicability to, or effect upon, neural computation.

• Real neurons exhibit a very broad range of behaviors (tonic (continuous) and phasic (once-off) spiking, bursting, spike latency, spike frequency adaptation, resonance, threshold variability, input accommodation and bistability (Izhikevich 2004)). It is unlikely that these behaviors have no computational significance.

• There are specific interesting processes occurring at the spike level (such as Spike Timing Dependent Plasticity (Bi and Poo 1998)) that cannot be modeled without spikes.

• The dynamics of spiking networks are much richer, allowing for example:

o oscillations in network activity which could implement multiple concurrent processing streams (Izhikevich 1999), figure/ground segmentation and binding (Csibra, Davis et al. 2000; Engel, Fries et al. 2001), short-term memory (Jensen, Idiart et al. 1996; Jensen, Gelfand et al. 2002), etc.


o much increased (perhaps by orders of magnitude) memory capacity (Izhikevich 2005). Transmission delays are very significant for computation, particularly because they are random (roughly Gaussian-distributed) for real neurons; this causes the formation of polychronous (as opposed to synchronous) spiking neuron groups, which could possibly store many more population-encoded memories than there are synaptic weights. This idea is still to be fully researched and analyzed.

III. SPIKING NEURON MODELS

A spiking neuron model accounts for the impact of impinging action potentials – spikes – on the targeted neuron in terms of the internal state of the neuron, as well as how this state relates to the spikes the neuron fires. We divide spiking neuron models into four main classes as follows.

A. Threshold Fire Models

These models are based on the temporal summation of all contributions to the membrane potential u(t) received from all presynaptic neurons. If this contribution exceeds a threshold θ, the postsynaptic neuron fires. Two main models can be considered in this category: the Integrate-and-Fire (I&F) model and the Spike Response Model (SRM).

1) Integrate-and-Fire Model (I&F)

The simplest I&F model was originally developed when the dominant thinking was that neuron function could be approximated well enough by simply integrating input and then firing at a given threshold. Properties like spike frequency adaptation, bursting, resonance, latency and variable thresholds were incorporated into models as needs arose and thinking changed; unfortunately, no single I&F model displays all these characteristics. The basic model is also called Leaky Integrate-and-Fire because the membrane is assumed to be leaky due to ion channels, such that after a PostSynaptic Potential (PSP) the membrane potential again approaches a reset potential urest.

Fig. 3. Integrate-and-Fire model (circuit view: an input Vin charges the membrane capacitance Cm; when the voltage reaches the threshold, an output spike is emitted and the potential is reset).
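The leaky integrate-and-fire dynamics described above are commonly written as τ du/dt = -(u - urest) + R·I(t), with u reset to urest whenever it crosses the threshold θ. The sketch below simulates such a neuron driven by a presynaptic spike train; it is illustrative only, and all parameter values (τ, threshold, synaptic weight, etc.) are our own assumptions rather than values taken from the paper.

```python
import numpy as np

def lif_response(input_spikes, dt=0.1, t_max=100.0, tau=10.0, tau_s=2.0,
                 u_rest=0.0, threshold=1.0, w=0.4):
    """Leaky integrate-and-fire neuron driven by a list of presynaptic spike times (ms).

    Each input spike adds w to an exponentially decaying synaptic current; the membrane
    potential leaks back to u_rest and is reset there after each output spike.
    """
    spike_steps = set(int(round(t / dt)) for t in input_spikes)
    u, i_syn = u_rest, 0.0
    out_spikes = []
    for k in range(int(t_max / dt)):
        if k in spike_steps:
            i_syn += w                                  # incoming spike kicks the synaptic current
        i_syn -= dt / tau_s * i_syn                     # synaptic current decays
        u += dt / tau * (-(u - u_rest) + i_syn * tau)   # leaky integration toward the drive
        if u >= threshold:                              # threshold crossing -> output spike + reset
            out_spikes.append(k * dt)
            u = u_rest
    return out_spikes

# A burst of near-coincident input spikes drives the neuron over threshold.
print(lif_response([5.0, 6.0, 7.0, 30.0, 31.0, 32.0, 33.0]))
```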
2) Spike Response Model (SRM)

The SRM describes the state of a neuron by a single variable, the membrane potential. It expresses the membrane potential Vi of neuron i at time t as an integral over the past, including a model of refractoriness. The SRM is a phenomenological model of the neuron, based on the occurrence of spike emissions. Before any input spike has arrived at the postsynaptic neuron i, the variable Vi(t) has the value 0. The firing of a presynaptic neuron j at time tj(f) evokes a postsynaptic potential in neuron i, modeled by a response function. Each incoming spike perturbs the value of Vi, and if, after summation of the inputs, the membrane potential Vi reaches the threshold, an output spike is generated. After the neuron has fired, the membrane potential returns to a low value, which is described by the refractory period function.
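Schematically, and following the notation of [2], the description above can be written as below; the exact kernels vary between SRM variants, so this is only the generic form.

```latex
u_i(t) = \eta\bigl(t - \hat{t}_i\bigr)
       + \sum_j w_{ij} \sum_{f} \varepsilon\bigl(t - t_j^{(f)}\bigr),
\qquad \text{neuron } i \text{ fires when } u_i(t) \ge \theta
```

Here \hat{t}_i is the last firing time of neuron i, \eta describes the reset and refractoriness that follow it, \varepsilon is the postsynaptic response evoked by each presynaptic spike t_j^{(f)}, and w_{ij} is the synaptic efficacy.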
B. Conductance based models

This is a quite complex class of neuron models, based on the simulation of the intricate behavior of ionic channels. As these channels open and close, their conductance changes accordingly, yielding a set of differential equations describing the process. The variations among the models in this class are mostly due to the choice of channels used and the parameters of the resulting differential equations.

1) Hodgkin-Huxley model

Hodgkin and Huxley modeled the electro-chemical information transmission of natural neurons with electrical circuits consisting of capacitors and resistors. Based on their experiments, three types of ionic currents were identified, namely a sodium (Na) current, a potassium (K) current, and a leak current. The first two are controlled by specific voltage-dependent ion channels; the third takes care of the other channels. The cell membrane is a good insulator and acts as a capacitor. Besides the capacitor, there are three resistances, one for each ion channel considered in the model.

The conservation of electric charge on a piece of membrane implies that the applied current I(t) may be split into a capacitive current IC, which charges the capacitor C, and further components Ik, which pass through the ion channels. Thus

I(t) = IC(t) + Σk Ik(t),

where the sum runs over all ion channels. From the definition of capacitance, C = Q/u, where Q is the charge and u the voltage across the capacitor, we find the charging current IC = C du/dt. Hence

C du/dt = - Σk Ik(t) + I(t).
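For reference, in the standard Hodgkin-Huxley formulation (as given, for example, in [2]) the three channel currents are expressed with voltage-dependent gating variables m, h and n, so the membrane equation above expands to:

```latex
C \frac{du}{dt} = -\, g_{\mathrm{Na}}\, m^{3} h \,(u - E_{\mathrm{Na}})
                  \; - \; g_{\mathrm{K}}\, n^{4} \,(u - E_{\mathrm{K}})
                  \; - \; g_{L}\,(u - E_{L}) \; + \; I(t)
```

where the g's are maximal conductances, the E's are reversal potentials, and each gating variable x in {m, h, n} follows first-order kinetics dx/dt = α_x(u)(1 - x) - β_x(u)x.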


Fig.4. Schematic diagram of the Hodgkin-Huxley model

C. Compartmental models

Compartmental models better capture the complexity of a neuron, taking into account the spatial structure of the dendritic tree, and they also model synaptic transmission at a greater level of detail. With this approach it is possible to consider other ion currents beyond the sodium and potassium currents incorporated in the Hodgkin-Huxley model.

Fig.5. A generic equivalent circuit of a neural compartment

The basic idea is to divide the neuron into smaller uniform components, or compartments. Each compartment is then modeled with equations describing the equivalent electrical circuit. The use of appropriate differential equations for each compartment enables the simulation of its behavior as well as its interactions with other compartments. The notion of an equivalent electrical circuit for a small piece of cellular membrane is the basis for all compartmental models.
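As an illustrative sketch of the compartmental idea (not taken from the paper), the simplest case is a two-compartment model in which a somatic and a dendritic compartment, each obeying its own membrane equation, are coupled through a conductance g_c:

```latex
C_s \frac{du_s}{dt} = -\sum_k I_k^{(s)}(t) + g_c\,(u_d - u_s) + I_s(t), \qquad
C_d \frac{du_d}{dt} = -\sum_k I_k^{(d)}(t) + g_c\,(u_s - u_d) + I_d(t)
```

Larger compartmental models simply chain more equations of this form, one per compartment, so spatial structure enters through the coupling terms.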
D. Rate models

The rate models, also known as sigmoidal units, are the most traditional and widely used models for the analysis of learning and memory in ANNs. The first choice to be made before the construction of a neuronal model is the level of abstraction and complexity of the model. The class of rate models represents the highest abstraction level, as they neglect the pulse structure of neuronal systems and the neural activity is described only in terms of spike rate. A good example of a rate model is the perceptron.

Fig.6. Configuration of a Perceptron

IV. SPIKE CODING

Spiking neurons can encode information through their average spike rate over some time window, which is called a rate code. Sensory cells such as those in the cochlea and the retina use a rate code (Izhikevich 2005); however, the response time in the visual cortex is known to be too fast for processing to continue under this coding regime (Thorpe, Fize et al. 1996; Thorpe, Delorme et al. 2001) – each neuron in the visual processing hierarchy only has time to fire one or occasionally two spikes prior to recognition, so clearly the visual cortex cannot be using a rate code, but is instead somehow utilizing the presence and/or the timing of spikes, which is called a temporal code.

There are a number of different coding strategies possible using spike times, listed below in increasing order of information encoding capacity (Thorpe, Delorme et al. 2001).

• Count coding: counts the total number of spikes of a neuron population in a given time – similar to rate coding except that it entails one spike from each of many neurons instead of many spikes from one neuron; the information capacity is the same as that of rate coding (very small).

• Binary coding: encodes a binary number, with each digit represented by the presence (1) or absence (0) of a spike.

• Rank order coding: the order in which the neurons fire encodes the information.

• Delay coding: a type of temporal coding in which both the firing order and the delays of the spikes, within a time slice between the minimum and maximum activation levels, encode the information.

Coding scheme       Description                                        Information capacity (bits)
Count coding        counts the total number of spikes                  log2(n+1)
Binary coding       a binary number (spike = 1, absence = 0)           n
Rank order coding   the order in which the neurons fire                log2(n!)
Delay coding        the order and the delays both encode information   n·log2(T/p)

Each of these coding strategies has a very different information capacity, as summarized in the table. In the table, n is the number of neurons under consideration, T is the time window over which each neuron can fire either 0 or 1 spikes, and p is the precision, i.e. the minimum inter-spike interval that can be discerned by postsynaptic neurons.
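As a worked illustration (our own numbers, not from the paper): with n = 8 neurons and T/p = 100 discernible time slots, the four schemes give roughly log2(9) ≈ 3.2 bits, 8 bits, log2(8!) ≈ 15.3 bits and 8·log2(100) ≈ 53.2 bits respectively. The sketch below computes these capacities and builds a simple delay (latency) code, from which the rank order code follows as the firing order; the linear value-to-latency mapping is an assumption made for illustration.

```python
import math
import numpy as np

def capacities(n, T, p):
    """Information capacities (bits) of the four coding schemes in the table."""
    return {
        "count":  math.log2(n + 1),
        "binary": n,
        "rank":   math.log2(math.factorial(n)),
        "delay":  n * math.log2(T / p),
    }

def delay_code(x, t_max=10.0):
    """Latency (delay) coding: larger input values fire earlier within [0, t_max] ms."""
    x = np.asarray(x, dtype=float)
    x = (x - x.min()) / (x.max() - x.min() + 1e-12)     # normalise to [0, 1]
    return (1.0 - x) * t_max                            # strong inputs -> short delays

print(capacities(n=8, T=100, p=1))

values = [0.9, 0.1, 0.5, 0.7]                           # made-up inputs, one per neuron
times = delay_code(values)
print("firing times (ms):", times)
print("rank order (neuron indices, earliest first):", np.argsort(times))
```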
V. SPIKING NEURAL NETWORKS APPLIED TO INTRUSION DETECTION

Any attempt to compromise the integrity, confidentiality or availability of a resource is called an intrusion. Researchers have developed intrusion detection systems for various environments depending upon the security concerns of different networks. The function of an Intrusion Detection System [4] is to gather and analyze information from various areas within a computer or a network to determine all possible security breaches.

Intrusion detection systems can be of two types: signature based and anomaly based. Signature detection systems are based on pattern matching, i.e. they try to match observed scenarios against signatures already recorded in a database, while anomaly detection techniques compare the behavior of data against a baseline profile of the normal system; any deviation from the normal data is considered to be an anomaly. Both approaches have their own pros and cons. In signature detection, known attacks can be detected reliably with low false positive rates, but the major drawback is that such systems require a timely and continuously updated database of signatures for all possible attacks against a network. On the other hand, anomaly detection has two major advantages over signature detection. First, the ability to detect unknown attacks as well as “zero day” attacks. Second, every network has its own customized profile of normal activity, which makes it difficult for an attacker to know with confidence which activities can be carried out without being detected.

Hopfield [5] introduced the idea of using the timing of action potentials to represent values for computation within a network. Maass [6] showed that a network of spiking neurons can simulate arbitrary feed-forward sigmoidal networks and can approximate any continuous function. Spiking neural networks [7], which convey information by individual spike times, are more computationally expressive than networks with sigmoidal activation. Many learning algorithms for SNNs with temporal coding were proposed in [8, 9, 10, 11, 12, 13, 14], and their efficiency was found to be comparable with that of popular sigmoidal neural networks.

An engineering application [15] employs the coincidence detection property of spiking neurons and a Hebbian-based delay shifting rule. The application considered is control chart pattern recognition. Control chart patterns indicate the state of a process being monitored and can be utilized to detect abnormal behavior of the process. Spiking neurons can act as coincidence detectors for incoming pulses and can detect coincidence of the input signals with ease, unlike classical neural networks, where this is computationally expensive to realize [16].

The network architecture used in [15] is a simple two-layered, fully connected feed-forward network. The input layer has as many neurons as there are input parameters. The output layer is constructed with coincidence-detecting spiking neurons. Each feed-forward connection is assigned a random weight and a delay value. Connection delays are adapted through a Hebbian-based rule, which enables the inputs from one class of inputs to coincide at some group of neurons.

Fig.7. Pattern Detection by a Spiking Neural Network
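The coincidence-detection and delay-adaptation idea of [15] can be sketched as follows; this is our own simplified illustration, not the implementation of [15]: every output neuron adds its connection delays to the input spike times, the "winner" is the output whose delayed inputs arrive with the smallest spread, and a Hebbian-style rule shifts the winner's delays so that those arrivals coincide even better on the next presentation. The learning rate and the spread-based winner criterion are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 4, 3
delays = rng.uniform(0.0, 10.0, size=(n_out, n_in))   # random initial delays (ms), as in [15]
eta = 0.2                                              # delay-adaptation rate (assumed)

def winner(input_times):
    """Output neuron whose delayed inputs coincide best (smallest spread of arrival times)."""
    arrivals = input_times[None, :] + delays           # arrival times at each output neuron
    return int(np.argmin(arrivals.std(axis=1))), arrivals

def adapt(input_times):
    """Hebbian-style shift: move the winner's arrival times toward their mean."""
    w, arrivals = winner(input_times)
    delays[w] += eta * (arrivals[w].mean() - arrivals[w])
    return w

pattern = np.array([1.0, 4.0, 2.0, 6.0])               # hypothetical input spike times (ms)
for _ in range(50):                                    # repeated presentations of one class
    w = adapt(pattern)
print("winning output:", w, "arrival-time spread:", (pattern + delays[w]).std())
```

After training, inputs from the learned class arrive almost simultaneously at one group of output neurons, which corresponds to the pattern-detection behavior illustrated in Fig.7.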
VI. CLUSTERING SPIKING NEURAL NETWORKS

Clustering relies on a single output neuron firing earlier than the other output neurons for data points from a single cluster. The optimal activation of such an output neuron is achieved when the input spikes arrive at the output neuron simultaneously.


Fig.8. Seventeen clusters in 2-D space, represented by two 1-D input variables, each variable encoded by 12 neurons (five broadly tuned, seven sharply tuned).

The figures show how the outlined encoding allows for increased capacity: by encoding variables with more neurons, many different clusters can be separated. In Fig.8 the input consists of two separately encoded variables, and a network with 24 input neurons was easily capable of correctly classifying 17 evenly distributed clusters, demonstrating a significant increase in clustering capacity [17]. After presenting 750 randomly chosen data points, all 1275 cluster points were correctly classified; Fig.9 shows correct clustering of less regular input.

Fig.9. Classification of ten irregularly spaced clusters. For reference, the different classes as visually extractable were all correctly clustered, as indicated by the symbol/graylevel coding.

Likewise, clustering can be used for detecting abnormal behavior of data points, indicating an anomaly. Data having similar characteristics fall into the same cluster, following the principle of intra-cluster similarity and inter-cluster dissimilarity. Data points that do not fall into any cluster are considered anomalous candidates.
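A minimal sketch of how this anomaly rule could sit on top of the clustering network: each data point is assigned to the output neuron that fires first, and if even that earliest spike arrives later than some cutoff (meaning no cluster centre matches the point well), the point is flagged as an anomalous candidate. The distance-to-latency mapping and the cutoff below are illustrative assumptions, not part of the paper.

```python
import numpy as np

def first_spike_times(point, centres, t_max=10.0):
    """Illustrative firing-time model: an output neuron fires earlier the closer the data
    point is to the cluster centre it represents (distance mapped linearly to latency)."""
    d = np.linalg.norm(centres - point, axis=1)
    return np.minimum(t_max, t_max * d)

def classify(point, centres, anomaly_latency=6.0):
    """Assign the point to the earliest-firing output neuron, or flag it as an anomalous
    candidate when even the earliest spike arrives after anomaly_latency (assumed cutoff)."""
    t = first_spike_times(point, centres)
    w = int(np.argmin(t))
    return ("anomaly", None) if t[w] > anomaly_latency else ("normal", w)

centres = np.array([[0.2, 0.2], [0.8, 0.8], [0.2, 0.8]])   # learned cluster centres (hypothetical)
print(classify(np.array([0.22, 0.18]), centres))            # near a centre  -> ('normal', 0)
print(classify(np.array([0.90, 0.10]), centres))            # far from all   -> ('anomaly', None)
```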


VII. DISCUSSION

With the knowledge we are currently gaining of the fundamental importance of spike timings and oscillations to neural processing, 2nd generation ANNs can no longer provide a viable basis for neural modeling. Spiking Neural Networks present many new challenges, but they also afford many new opportunities for breaking entirely new ground in artificial intelligence research. Apart from the various advantages offered by SNNs, a few bottlenecks remain: biologically realistic spiking models have required intensive computation for even small amounts of simulated time, making simulations of large networks or long time periods impractical in most situations. One of the advantages may also be a disadvantage, in that the complex behavior needs to be understood and effectively managed. Also, much less is known about networks of spiking neurons than about the more established ANN paradigms, and many well-accepted methodologies need to be adapted or possibly replaced.

REFERENCES

[1] J. Vreeken, "Spiking Neural Networks, an Introduction", Adaptive Intelligence Laboratory, Institute of Information and Computing Sciences, Utrecht University.
[2] W. Gerstner and W. Kistler, "Spiking Neuron Models", Cambridge University Press, 2002.
[3] T. Ichishita and R. H. Fujii, "Performance Evaluation of a Temporal Sequence Learning Spiking Neural Network", 7th International Conference on Computer and Information Technology, IEEE, pp. 616-620, 2007.
[4] A. Patcha and J.-M. Park, "An overview of anomaly detection techniques: existing solutions and latest technological trends", Computer Networks, 51, pp. 3448-3470, 2007.
[5] J. J. Hopfield, "Pattern recognition computation using action potential timing for stimulus representation", Nature, 376, pp. 33-36, 1995.
[6] W. Maass, "Fast sigmoidal networks via spiking neurons", Neural Computation, 9(2), pp. 279-304, 1997.
[7] W. Maass, "Noisy spiking neurons with temporal coding have more computational power than sigmoidal neurons", in Advances in Neural Information Processing Systems, vol. 9, MIT Press, Cambridge, pp. 211-217, 1997.
[8] B. Ruf and M. Schmitt, "Self-organization of spiking neurons using action potential timing", IEEE Transactions on Neural Networks, 9(3), pp. 319-332, 1998.
[9] S. M. Bohte, J. N. Kok and H. La Poutre, "Error-backpropagation in temporally encoded networks of spiking neurons", Neurocomputing, 48, pp. 17-37, 2002.
[10] B. Ruf and M. Schmitt, "Learning temporally encoded patterns in networks of spiking neurons", Neural Processing Letters, 5(1), pp. 9-18, 1997.
[11] T. Natschlager and B. Ruf, "Spatial and temporal pattern analysis via spiking neurons", Network: Computation in Neural Systems, 9(3), pp. 319-332, 1998.
[12] S. M. Bohte, H. La Poutre and J. N. Kok, "Unsupervised clustering with spiking neurons by sparse temporal coding and multilayer RBF networks", IEEE Transactions on Neural Networks, 13(2), pp. 426-435, 2002.
[13] X. Tao and H. E. Michel, "Data clustering via spiking neural networks through spike timing dependent plasticity", IC-AI, pp. 168-173, 2004.
[14] D. T. Pham, M. S. Packianather and E. Y. A. Charles, "A novel self-organized learning model with temporal coding for spiking neural networks", Intelligent Production Machines and Systems, pp. 307-312, 2006.
[15] D. T. Pham, M. S. Packianather and E. Y. A. Charles, "A self-organizing spiking neural network trained using delay adaptation", MEC, Cardiff University, UK, pp. 3441-3446, 2007.
[16] W. Maass, "Computing with spiking neurons", in Pulsed Neural Networks, MIT Press, Cambridge, pp. 55-85, 2001.
[17] S. M. Bohte, H. La Poutré and J. N. Kok, "Unsupervised Clustering With Spiking Neurons By Sparse Temporal Coding and Multilayer RBF Networks", IEEE Transactions on Neural Networks, vol. 13, no. 2, pp. 426-435, 2002.
