Abstract—This paper examines the neural gas networks proposed by Martinetz and Schulten [1] and Fritzke [2] in an effort to create a more biologically plausible hybrid version. The hybrid algorithm proposed in this work retains most of the advantages of the Growing Neural Gas (GNG) algorithm while adopting a parameterless and more biologically plausible design. It retains the ability to place nodes where needed, as in the GNG algorithm, without actually having to introduce new nodes. Also, by removing the weight and error adjusting parameters, the guesswork required to determine parameters is eliminated. While the hybrid algorithm performs admirably in terms of the quality of results when compared to Fritzke’s algorithm, it is slightly slower due to the greater computational overhead. However, it is more biologically feasible and somewhat more flexible due to its hybrid nature and lack of reliance on adjustment parameters.

I. INTRODUCTION

The neural gas network is a biologically inspired ontogenic network. It is an unsupervised network that operates on a few central principles. A certain number of neurons or nodes (usually two) are selected by how far they are from an input signal, then they are adapted according to a fixed algorithm/schedule determined by the algorithm’s designer. The networks discussed in this paper are referred to as “neural gas” structures because of their similarity to the original neural gas network as it was first proposed by Martinetz and Schulten in 1991 [1], and later expanded upon by Bernd Fritzke [2], [3]. Fritzke has produced several articles detailing many variations of neural gas networks and their properties.

Since then, there has been increasing interest in neural gas related research as an alternative to Kohonen self-organizing maps [4], as evidenced by a number of papers that took advantage of the neural gas flexibility to fit various topologies. For example, Montes et al. presented a hybrid ant-based clustering algorithm that made use of growing neural gas networks [5]. Also, Rodriguez presented an application of the growing neural gas model for automatically building statistical models of non-rigid objects such as hands [6]. Another example of research based on the growing neural gas network is the work done by Cheng and Zell [7], which presents a double growing neural gas network for disease diagnosis. In the double network, growth is accelerated by doubling the number of neurons that are inserted during each insertion stage. Other examples of the applications of neural gas networks include a method for training vector quantizers by Rovetta and Zunino [8] and a method for assessing the stability of electric power systems by Rehtanz and Leder [9].

While the preceding articles give a good overview of the applications of growing neural gas networks, they do not offer comparisons to other neural network types. The paper by Heinke and Hamker [10] addresses this issue by comparing the growing neural gas network with such networks as the growing cell structures network (GCS), fuzzy ARTMAP (fuzzy predictive adaptive resonance theory), and the standard multilayer perceptron network (MLP). They tested four different real-world data sets with the four types of networks and produced varying results. The results showed that no network type performed significantly better than the others on all four of the data sets. However, they concluded that, when taking performance, convergence time, and dependence on parameters into account, the networks could be ranked in the following order: GNG, GCS, MLP, and fuzzy ARTMAP. On the other hand, when there are linear boundaries between classes, fuzzy ARTMAP and MLP networks tend to perform better.

Included in this research are discussions of the topology-finding properties of neural gas networks and their ability to automatically construct radial basis function networks [2], [3], [11]. The basis for the neural gas network is the traditional Self-Organizing Map (SOM) originally proposed by Kohonen [4]. However, it differs from the SOM in the way in which the network topology is established. While in a SOM each neuron’s neighborhood is fixed, in a GNG both the number of neurons and their neighborhood change dynamically. This produces networks capable of fitting any distribution of the training data with great ease and accuracy. Yet, the resulting mechanism is not biologically feasible, and the method requires setting a number of arbitrary parameters, which makes its effective use difficult.
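The defining operation behind this dynamic topology is that, for each input signal, the two nearest units are found and linked, so the neighborhood graph is induced by the data rather than fixed in advance as in a SOM. A minimal sketch in Python (the helper name `two_nearest` and the toy vectors are illustrative, not taken from the paper):

```python
import math

def two_nearest(units, xi):
    """Return indices of the two units closest to the input signal xi."""
    order = sorted(range(len(units)), key=lambda i: math.dist(units[i], xi))
    return order[0], order[1]

# Toy reference vectors and one input signal.
units = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
s1, s2 = two_nearest(units, (0.2, 0.1))

# Connect the two winners: the edge set (the neighborhood)
# is induced by the data rather than fixed beforehand.
edges = set()
edges.add(tuple(sorted((s1, s2))))
print(s1, s2, edges)   # -> 0 1 {(0, 1)}
```

Repeating this over a stream of input signals is what lets the growing variants discussed in this paper learn a topology that follows the training data.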
____________________________________________
Authors are with the School of Electrical Engineering and Computer Science, Ohio University, Athens, OH, USA (e-mail: James.Graham.1@ohio.edu and starzyk@bobcat.ent.ohiou.edu). This work was supported in part by a grant from AFOSR.

This paper proposes a form of hybrid of the standard SOM and Growing Neural Gas (GNG) networks. This is done by taking the general structure of the SOM and adding
properties of the neural gas network. Specifically, it adds the ability to insert nodes/neurons where needed in the SOM. In Fritzke’s neural gas network, the algorithm grows the network every x cycles by inserting a new node between the two nodes with the greatest error. The new algorithm proposed in this work operates in a similar manner; however, rather than adding a new neuron, it takes one from the region of least error, where it is least needed. This may have the effect of balancing the network and potentially reducing the training time. Furthermore, the proposed network adopts a largely parameter-less model similar to the one created by Berglund [12], although it utilizes slightly different methods to achieve the lack of parameters.

The next section presents Fritzke’s algorithm, on which the proposed algorithm is based. This is followed by section three, which presents the proposed hybrid algorithm and compares it to Fritzke’s algorithm in detail. Further comparisons to existing algorithms are presented in section four, while section five presents some testing results regarding the proposed algorithm. The remaining sections consist of a short discussion of future work and conclusions.

II. BACKGROUND

Fritzke’s model differs from the standard neural gas network largely in the fact that it grows itself. His algorithm produced good results and served as a basis for the neural gas algorithm proposed in this work. It was chosen due to its ability to dynamically adjust itself and the relative simplicity of its network structure. In general, the GNG is a useful network type because it can easily depict topological relations and form separate clusters of data as needed. Furthermore, the insertion criterion can be arbitrarily chosen, allowing for use with supervised learning applications.

To make comparisons between the two methods easier, Fritzke’s algorithm is described first. In his paper [2], Fritzke described the growing neural gas method as follows:

Fritzke’s GNG Algorithm
1) Start with a set A of two units a and b at random positions wa and wb in Rn: A = {a, b}.
2) Generate an input signal ξ according to P(ξ) (the probability density function of the training data).
3) Determine s1 and s2 (s1, s2 ∈ A), such that they are the two nearest neighbors to ξ.
4) If it does not already exist, insert a connection between s1 and s2. Regardless, set the age of the connection between the two to zero.
5) Determine the error delta for s1 using:
ΔEs1 = ||ws1 − ξ||²
6) Move s1 and its direct topological neighbors toward ξ by fractions εb and εn, respectively, of the total distance:
Δws1 = εb (ξ − ws1)
Δwi = εn (ξ − wi)
7) Increment the age of all edges emanating from s1.
8) Remove edges with age larger than amax. If this results in units having no edges, remove them as well.
9) If λ signal generation iterations have occurred, execute this step:
a. Determine the unit q with the maximum error.
b. Place a new unit r halfway between q and its nearest neighbor f.
c. Decrease the error of q and f:
ΔEq,f = −α Eq,f
d. Determine the error of r with:
Er = 0.5 (Eq + Ef)
10) Decrease the error variable of all units:
ΔEc = −β Ec
11) If the stopping criterion is not yet fulfilled (number of iterations or some other performance measure), return to step 2.

Such an algorithm produces a network of neurons distributed over the field of training data, and it has properties similar to a SOM network. Namely, such a network spreads its neurons over the areas that contain training data, which facilitates data clustering, labeling, and classification tasks.

III. HYBRID SELF-ORGANIZING NN ALGORITHM

Fritzke’s algorithm was carefully examined and several potentially undesirable features were noted. The following algorithm was created as an attempt to address the aforementioned undesirable features. The actual changes to Fritzke’s approach and the reasons behind them are discussed following the algorithm.

Proposed Hybrid Algorithm
1) Generate a randomly positioned and connected network of neurons of size λ, with η ≥ 1 connections per neuron. Set all edge ages to 1. Set all node usage values to 0. Usage indicators keep track of when a node was last used.
2) Generate an input signal ξ according to P(ξ).
3) Determine s1 and s2 (s1, s2 ∈ A) such that they are the two nearest neighbors to ξ.
4) If it does not already exist, insert a connection between s1 and s2. Regardless, increment the age of the connection between the two, and set the usage indicator to the current cycle. (For a new connection, the age is one.)
Ek = Σi∈Nk Fi Ei², then adjust errors of
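Fritzke’s eleven-step procedure in Section II maps naturally onto a short program. The following Python sketch is an illustration under stated assumptions, not the authors’ implementation: the function and parameter names (`gng`, `sample`, `eps_b`, `eps_n`, `a_max`, `lam`, `alpha`, `beta`) and their default values are chosen for the example, the stopping criterion is a fixed iteration budget, and the removal of units left without edges in step 8 is omitted for brevity.

```python
import math
import random

def gng(sample, steps=3000, eps_b=0.05, eps_n=0.006,
        a_max=50, lam=100, alpha=0.5, beta=0.0005):
    """Sketch of a GNG loop; `sample()` draws one input signal xi."""
    units = [list(sample()), list(sample())]   # 1) two units at random positions
    error = [0.0, 0.0]
    age = {}                                   # edge (i, j), i < j  ->  age

    def edge(i, j):
        return (i, j) if i < j else (j, i)

    for step in range(1, steps + 1):
        xi = sample()                          # 2) input signal from P(xi)
        d = [math.dist(w, xi) for w in units]
        s1, s2 = sorted(range(len(units)), key=d.__getitem__)[:2]   # 3)
        age[edge(s1, s2)] = 0                  # 4) connect winners, reset age
        error[s1] += d[s1] ** 2                # 5) dE_s1 = |w_s1 - xi|^2
        for k, w in enumerate(units):          # 6) move s1 and its neighbors
            if k == s1:
                rate = eps_b
            elif edge(k, s1) in age:
                rate = eps_n
            else:
                continue
            for c in range(len(w)):
                w[c] += rate * (xi[c] - w[c])
        for e in age:                          # 7) age edges emanating from s1
            if s1 in e:
                age[e] += 1
        age = {e: a for e, a in age.items() if a <= a_max}   # 8) drop old edges
        if step % lam == 0:                    # 9) insert a new unit
            q = max(range(len(units)), key=error.__getitem__)
            nbrs = [k for k in range(len(units)) if edge(k, q) in age]
            if nbrs:
                f = min(nbrs, key=lambda k: math.dist(units[k], units[q]))
                r = len(units)                 # 9b) r halfway between q and f
                units.append([(a + b) / 2 for a, b in zip(units[q], units[f])])
                error[q] *= (1 - alpha)        # 9c) dE = -alpha * E
                error[f] *= (1 - alpha)
                error.append(0.5 * (error[q] + error[f]))   # 9d)
                age.pop(edge(q, f))
                age[edge(q, r)] = 0
                age[edge(f, r)] = 0
        error = [(1 - beta) * e for e in error]   # 10) dE_c = -beta * E_c
    return units, sorted(age)                  # 11) fixed iteration budget

# Toy run: signals drawn from a noisy unit circle.
def ring():
    t = random.uniform(0, 2 * math.pi)
    return (math.cos(t) + random.gauss(0, 0.05),
            math.sin(t) + random.gauss(0, 0.05))

random.seed(1)
units, edges = gng(ring)
print(len(units), len(edges))   # network grows by up to one unit per lam signals
```

For the toy ring distribution, the returned `units` should end up spread along the circle, illustrating the behavior the text describes: the network places its neurons over the areas that contain training data.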