SUMMARY
This paper presents a cellular neural network (CNN) scheme employing a new non-linear activation function, called the trapezoidal activation function (TAF). The new CNN structure can classify linearly non-separable data points and realize Boolean operations (including eXclusive OR) using only a single-layer CNN. In order to simplify the stability analysis, a feedback matrix W is defined as a function of the feedback template A and the 2D equations are converted to 1D equations. The stability conditions of CNN with TAF are investigated and a sufficient condition for the existence of a unique equilibrium and global asymptotic stability is derived. By processing several examples of synthetic images, the analytically derived stability condition is also confirmed. Copyright © 2005 John Wiley & Sons, Ltd.
KEY WORDS: cellular neural network; Lyapunov stability criterion; non-linear activation function; XOR operation; linearly separable
1. INTRODUCTION
Cellular neural networks (CNNs) are widely used in image processing and pattern recognition fields [1–5]. A CNN is a large-scale non-linear processing array consisting, unlike most other neural networks, of only locally interconnected cells; this facilitates its analysis, design and implementation with VLSI circuits. In image processing applications, each cell in a CNN represents a pixel in the image. Stability issues of CNNs have been investigated in many papers: (i) CNN is proven stable if the templates are symmetrical in References [4, 5], (ii) the focus is on the dynamic behaviour of two-cell CNNs in References [6, 7], (iii) stability conditions of generalized CNNs are given in Reference [8], (iv) in References [4–8],
∗ Correspondence to: Erdem Bilgili, TUBITAK, Gebze Yerleşkesi, Marmara Araştırma Merkezi, Malzeme Enstitüsü, P.K.: 21, 41470 Gebze, KOCAELI, Turkey.
† E-mail: erdem.bilgili@mam.gov.tr
‡ E-mail: cgoknar@dogus.edu.tr
§ E-mail: uosman@istanbul.edu.tr
CNN stability is analysed for the standard activation function as in References [4, 5], (v) global exponential stability conditions of CNN via a new Lyapunov function are stated in Reference [9]. It is well known that standard uncoupled single-layer CNN structures, extremely useful for realizing Boolean functions, are not capable of classifying linearly non-separable data. The parity is a binary function of the inputs, which returns a high output if the number of inputs set to 1 is odd and a low output if that number is even. Therefore, for n inputs, the parity problem consists of dividing the n-dimensional input space into disjoint decision regions such that all input patterns in the same region yield the same output, and it is thus linearly non-separable. Uncoupled CNNs can only classify linearly separable data, that is, they can only separate the input space with hyper-planes [10]. Recently, a single perceptron-like cell with: (i) double threshold, (ii) implemented using only five MOS transistors, (iii) capable of classifying data which are not linearly separable, has been reported in References [11, 12].
In Section 1 of this paper, the definition of standard CNN with piecewise-linear activation (CNNwPWL) is briefly reviewed, then the proposed trapezoidal activation function (TAF) and CNN with TAF (CNNwTAF), which is a generalization of CNN with double threshold, are introduced. Stability analysis of CNNwTAF is carried out in Section 2: first by converting the 2D template description of the CNNwTAF into a system of vector ordinary differential equations (VODE), second by applying the Brouwer fixed-point theorem to the VODE to prove that CNNwTAF has an equilibrium point, and finally by applying the Lyapunov stability criterion to extract a sufficient condition for global asymptotic stability. In Section 3, the stability conditions derived in the previous section are confirmed by simulating CNNwTAF with several templates and TAF parameters, applying each to several images. Furthermore, it is also shown that a single-layer CNNwTAF can be used for performing non-separable tasks such as exclusive OR (XOR) and the parity problem.
\frac{dX}{dt} = -X + A * Y + B * U + I \quad (1)
with the operation * in conventional form meaning:
\frac{dx_{ij}(t)}{dt} = -x_{ij}(t) + \sum_{k=1}^{2r+1}\sum_{l=1}^{2r+1} \left[ A_{kl}\, y_{i+k-r-1,\,j+l-r-1}(t) + B_{kl}\, u_{i+k-r-1,\,j+l-r-1}(t) \right] + I \quad \text{for } i = 1, 2, \ldots, m;\; j = 1, 2, \ldots, n \quad (2)
where m and n represent the number of rows and columns of the CNN, and u_{ij}, x_{ij} and y_{ij} denote the input, state and output of the cell C(i, j), respectively. The templates A, B, composed of the weights A_{ij} and B_{ij} (as shown in Equation (4)), denote the feedback and feed-forward templates, respectively. These templates have equal size, which depends on the predefined neighbourhood radius of the CNN. The term I, called the offset (bias), is a constant for each cell.
Copyright © 2005 John Wiley & Sons, Ltd. Int. J. Circ. Theor. Appl. 2005; 33:393–417
CELLULAR NEURAL NETWORK WITH TRAPEZOIDAL ACTIVATION FUNCTION
For a CNN with r-neighbourhood, it is clear that A and B have size (2r + 1) × (2r + 1).
Note that y_{i+k-r-1, j+l-r-1} = u_{i+k-r-1, j+l-r-1} = 0 if one of the following conditions, which limit the CNN structure, holds:
(i) i + k − r − 1 > m; (ii) i + k − r − 1 < 1; (iii) j + l − r − 1 > n; (iv) j + l − r − 1 < 1.
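As an illustration, the state equation (2) together with the zero boundary conditions (i)–(iv) can be sketched as a forward-Euler update (a minimal sketch, not the authors' implementation; the step size and the zero templates used below are placeholders):

```python
import numpy as np

def cnn_step(x, u, A, B, I, f, dt=0.01):
    """One forward-Euler step of the CNN state equation (2).

    x, u : m-by-n state and input arrays; A, B : (2r+1)-by-(2r+1) templates;
    cells outside the array contribute 0 (boundary conditions (i)-(iv))."""
    m, n = x.shape
    r = A.shape[0] // 2
    y = f(x)
    # zero-pad so out-of-range neighbours contribute nothing
    yp = np.pad(y, r)
    up = np.pad(u, r)
    dx = -x + I
    for k in range(2 * r + 1):
        for l in range(2 * r + 1):
            dx += A[k, l] * yp[k:k + m, l:l + n]
            dx += B[k, l] * up[k:k + m, l:l + n]
    return x + dt * dx

# standard PWL activation, Equation (5)
pwl = lambda x: 0.5 * (np.abs(x + 1) - np.abs(x - 1))
```

With all-zero templates and zero bias the update reduces to pure state decay, which is a quick sanity check of the implementation.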
Depending on the neighbourhood radius r, the feedback template A can be written as
A = \begin{bmatrix}
A_{1,1} & \cdots & A_{1,r} & A_{1,r+1} & A_{1,r+2} & \cdots & A_{1,2r+1} \\
\vdots & \ddots & \vdots & \vdots & \vdots & \ddots & \vdots \\
A_{r,1} & \cdots & A_{r,r} & A_{r,r+1} & A_{r,r+2} & \cdots & A_{r,2r+1} \\
A_{r+1,1} & \cdots & A_{r+1,r} & A_{r+1,r+1} & A_{r+1,r+2} & \cdots & A_{r+1,2r+1} \\
A_{r+2,1} & \cdots & A_{r+2,r} & A_{r+2,r+1} & A_{r+2,r+2} & \cdots & A_{r+2,2r+1} \\
\vdots & \ddots & \vdots & \vdots & \vdots & \ddots & \vdots \\
A_{2r+1,1} & \cdots & A_{2r+1,r} & A_{2r+1,r+1} & A_{2r+1,r+2} & \cdots & A_{2r+1,2r+1}
\end{bmatrix} \quad (4)
In standard CNN literature, the relation between the output and the state of the cell is defined by a sigmoid activation function [4, 5]; its standard PWL version, defined by Equation (5), is illustrated in Figure 1:

y_{ij}(t) = \frac{1}{2}\left(|x_{ij}(t) + 1| - |x_{ij}(t) - 1|\right) \quad (5)
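The PWL activation (5) is a one-liner; for contrast, a trapezoidal activation with four breakpoints can be sketched alongside it. Note that the paper's exact TAF, Equation (6), is not reproduced in this excerpt, so the shape below (−1 outside [b1, b4], +1 on [b2, b3], linear in between) is an assumption about the trapezoidal form:

```python
import numpy as np

def pwl(x):
    """Standard PWL activation, Equation (5)."""
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def taf(x, b1, b2, b3, b4):
    """A trapezoidal activation with breakpoints b1 < b2 < b3 < b4:
    -1 outside [b1, b4], +1 on [b2, b3], linear in between.
    (Sketch only; the paper's exact Equation (6) is not shown here.)"""
    x = np.asarray(x, dtype=float)
    y = np.full_like(x, -1.0)
    rise = (x >= b1) & (x < b2)
    y[rise] = -1.0 + 2.0 * (x[rise] - b1) / (b2 - b1)
    y[(x >= b2) & (x <= b3)] = 1.0
    fall = (x > b3) & (x <= b4)
    y[fall] = 1.0 - 2.0 * (x[fall] - b3) / (b4 - b3)
    return y
```

The key difference from the PWL case is that the trapezoid maps a bounded band of states to the high output and everything outside it to the low output, which is what enables non-separable classifications with a single layer.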
E. BILGILI, İ. C. GOKNAR AND O. N. UCAN
In the case of an arbitrary r-neighbourhood, 1 ≤ r ≤ min{m, n}, where m, n represent the number of rows and columns of the CNN, Equation (2) can be rewritten as
\frac{dx_{ij}(t)}{dt} = -x_{ij}(t) + \sum_{k=1}^{2r+1}\sum_{l=1}^{2r+1} A_{kl}\, f(x_{i+k-r-1,\,j+l-r-1}(t)) + s_{ij} \quad (7a)
where
s_{ij} = \sum_{k=1}^{2r+1}\sum_{l=1}^{2r+1} B_{kl}\, u_{i+k-r-1,\,j+l-r-1}(t) + I_{ij} \quad (7b)
The term s_{ij} is a different constant for each cell, as the inputs u_{ij} are constant; similarly, the terms B_{kl} and I_{ij} = I are constant. The activation function f used in Equation (7) is the TAF defined by Equation (6). As there are m × n cells in a CNN, there are m × n states in total. For the purpose of describing the behaviour of the CNN by a system of VODE as in Reference [10], a map, which transforms the description of the cells in a 2D space into a 1D space, is given by
\begin{bmatrix}
C_{11} & C_{12} & \cdots & C_{1n} \\
C_{21} & C_{22} & \cdots & C_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
C_{m1} & C_{m2} & \cdots & C_{mn}
\end{bmatrix} \longrightarrow [C_{11}\; C_{12}\; \cdots\; C_{1n}\; \cdots\; C_{ij}\; \cdots\; C_{m1}\; C_{m2}\; \cdots\; C_{mn}] = C \quad (8a)
The elements of the vector C can be represented as C_p where p = j + (i − 1)n. The inverse relation is given by the following expression:

i(p) = \left\lceil \frac{p}{n} \right\rceil, \quad j(p) = p \bmod n \quad (8b)

where ⌈·⌉ means that i(p) is the quotient of the division p/n rounded up, and j(p) is taken as n when the remainder p mod n is zero.
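The 2D-to-1D map and its inverse can be sketched directly from p = j + (i − 1)n and Equation (8b) (1-based indices, as in the paper):

```python
def to_1d(i, j, n):
    """Map cell C(i, j) (1-based) to vector index p, as in Equation (8a)."""
    return j + (i - 1) * n

def to_2d(p, n):
    """Inverse map, Equation (8b): i is the ceiling of p/n and
    j = p mod n, with j = n when the remainder is zero."""
    i = -(-p // n)          # ceiling division
    j = p % n or n
    return i, j
```

The special case j = n (remainder zero) is exactly the last cell of each row, which is why the plain modulus alone does not suffice.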
In the sequel, Equation (7) will be rewritten in the following form using Equations (8a) and (8b):

\frac{dx_p(t)}{dt} = -x_p(t) + \sum_{q=1}^{mn} w_{pq}\, f(x_q(t)) + s_p, \quad p = 1, 2, \ldots, mn \quad (9)
Comparing Equations (7) and (9), a relation between the coefficients w_{pq} and the elements A_{kl} of the template matrix will be established. Define the indices v and z with

v = i + k - r - 1, \quad z = j + l - r - 1 \quad (10a)
k = v - i + r + 1, \quad l = z - j + r + 1 \quad (10b)
p = j + (i - 1)n, \quad q = z + (v - 1)n \quad (10c)
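The construction of W implied by Equations (10a)–(10c) can be sketched as follows; out-of-range neighbours (boundary conditions (i)–(iv)) are simply left at zero. This is an illustrative implementation, not taken from the paper:

```python
import numpy as np

def feedback_matrix(A, m, n):
    """Build the mn-by-mn matrix W of Equation (9) from template A,
    using the index relations (10a)-(10c); out-of-range neighbours
    contribute zero, as in boundary conditions (i)-(iv)."""
    r = A.shape[0] // 2
    W = np.zeros((m * n, m * n))
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            p = j + (i - 1) * n
            for k in range(1, 2 * r + 2):
                for l in range(1, 2 * r + 2):
                    v = i + k - r - 1          # Equation (10a)
                    z = j + l - r - 1
                    if 1 <= v <= m and 1 <= z <= n:
                        q = z + (v - 1) * n    # Equation (10c)
                        W[p - 1, q - 1] = A[k - 1, l - 1]
    return W
```

A template with only a centre element a produces W = aI, and each row sum of W is bounded by the sum of the template elements, consistent with the property stated later in Equation (39).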
Therefore, in order to find the equilibrium states one has to investigate the solutions of the algebraic equation (15) for a given set of constant inputs

X = W f(X) + S \quad \text{where } X = [x_1, x_2, \ldots, x_i]^T,\; i = mn \quad (15)
If we define a function G such that

G(X) = W f(X) + S \quad (16)
one can see that G(•) is continuous as the following conditions hold:
(i) Activation function is continuous and bounded.
(ii) All template elements (A, B and I) are finite.
(iii) All inputs and initial states are bounded.
Substituting Equation (16) for the right side of Equation (15), Equation (17) is obtained
X = G(X ) (17)
which means that the equilibrium state X_e is a fixed point of the function G : \mathbb{R}^{mn} \to \mathbb{R}^{mn}. In order to find bounds on the function G, considering

|(G(x))_p| = \left| \sum_{q=1}^{mn} w_{pq}\, f(x_q) + s_p \right| \quad (18)

and the triangle inequality with the maximum absolute value of the activation function being 1, the inequality

|(G(x))_p| \le \sum_{q=1}^{mn} |w_{pq}\, f(x_q)| + |s_p| \le \sum_{q=1}^{mn} |w_{pq}| \cdot |f(x_q)| + |s_p| \le \sum_{q=1}^{mn} |w_{pq}| + |s_p| \quad (19)

is obtained.
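Numerically, an equilibrium of Equation (17) can be found by simple fixed-point (Picard) iteration when W is small enough to make G a contraction; note that Brouwer's theorem only guarantees existence, not that this iteration converges. A toy sketch with placeholder W and S (not from the paper):

```python
import numpy as np

f = lambda x: 0.5 * (np.abs(x + 1) - np.abs(x - 1))   # PWL activation

# placeholder data, chosen small so that G is a contraction
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((6, 6))
S = 0.2 * rng.standard_normal(6)

def G(X):
    """G(X) = W f(X) + S, Equation (16)."""
    return W @ f(X) + S

# Picard iteration towards the fixed point X = G(X) of Equation (17)
X = np.zeros(6)
for _ in range(200):
    X = G(X)
```

The resulting X satisfies X = G(X) to machine precision, and every component respects the bound of Equation (19) because |f| ≤ 1.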
These theorems can be applied without modification to a CNNwTAF since TAF is also Lipschitz continuous and takes values on the same interval as the PWL function, i.e. |f_{TAF}(x)| ≤ 1 and |f_{SPWL}(x)| ≤ 1, and since CNNwTAF also satisfies A1–A3.
Substitution of the right-hand side of Equation (9) for \dot{x}_i(t) yields

\dot{e}_i(t) = \dot{x}_i(t) = -x_i(t) + \sum_{q=1}^{mn} w_{iq}\, f(x_q(t)) + s_i \quad (27)
An equilibrium point of Equation (29) is e_e = [0, 0, \ldots, 0]^T; to prove the global asymptotic stability of CNNs described by Equation (13), it is sufficient to prove the global asymptotic stability of the trivial solution of Equation (29), as the TAF f(·) satisfies the Lipschitz condition [8, 9].
Let the positive function defined by Equation (30) be selected as a Lyapunov function candidate:

P(x) = \frac{1}{2} \sum_{i=1}^{mn} c_i (x_i - x_i^e)^2, \quad c_i > 0 \text{ for } i = 1, 2, \ldots, mn \quad (30)
Differentiation of this function with respect to t along trajectories yields:

\frac{dP}{dt} = \sum_{i=1}^{mn} c_i\, e_i(t) \left[ -e_i(t) + \sum_{q=1}^{mn} w_{iq} \left( f(e_q(t) + x_q^e) - f(x_q^e) \right) \right] \quad (31)
\frac{dP}{dt} \le -\sum_{i=1}^{mn} \left( c_i - \frac{1}{2} \sum_{q=1}^{mn} |w_{iq}| - \frac{1}{2} \sum_{q=1}^{mn} |w_{qi}|\, c_q \right) e_i^2(t) \quad (37)
From Equations (11a), (11b) one can see that the matrix W has the property:
mn
mn 2r+1
2r+1
wiq = wqi 6 Akl (39)
q=1 q=1 k=1 l=1
\sum_{q=1}^{mn} |w_{iq}| < \frac{2c}{c^2 \Delta^2 + 1} \quad (42)
since there is only one slope, equal to 1, in the PWL activation function, as shown in Figure 1; these results are similar to those in References [9, 17, 18].
3. APPLICATIONS
The templates in Equation (46) are symmetrical, so that a standard CNN with these templates is stable. In order to observe the trajectory of the CNN, the absolute value of the derivative of each cell state has been calculated, and the average absolute derivative value (AADV) of the CNN, defined in Equation (47), has been used as a metric to observe the convergence to equilibrium states:

\text{AADV}(t) = \frac{1}{mn} \sum_{p=1}^{mn} \left| \frac{dx_p}{dt} \right| \quad (47)
Obviously, for an asymptotically stable CNN, AADV(t) should decrease asymptotically and
be zero after the transients.
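Equation (47) is straightforward to compute; the sketch below also demonstrates the expected monotone decay for a trivially stable uncoupled system dx/dt = −x + s (the state, input and step-size values are placeholders):

```python
import numpy as np

def aadv(dx):
    """Average absolute derivative value, Equation (47):
    mean of |dx_p/dt| over all mn cells."""
    return np.abs(dx).mean()

# toy check: for an uncoupled stable system dx/dt = -x + s, the AADV
# decays monotonically towards zero, as expected of a stable CNN
x = np.array([1.0, -2.0, 0.5])
s = np.array([0.1, 0.0, -0.3])
values = []
for _ in range(100):
    dx = -x + s
    values.append(aadv(dx))
    x = x + 0.05 * dx       # forward-Euler step
```

For this linear system each derivative shrinks by a constant factor per step, so the recorded AADV sequence is strictly non-increasing.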
Figure 3. Input images: (a) diamond; (b) Chinese character; (c) square; and (d) Lena.
Figure 4. CNNs outputs with edge detector templates: (a) diamond; (b) Chinese
character; (c) square; and (d) Lena.
Simulation results of the CNN with the above templates for the input images are shown in Figure 4; it can be observed that edge-detecting templates are very powerful for binary input images, but if the input image is in grey-scale, like the Lena image, not all edges can be detected. It has been observed that the AADV for the diamond, Chinese character and square-type input images were similar; hence, only the AADV of the Chinese character and Lena will be illustrated in Figure 5 and the sequel. In these plots, the time axis is scaled depending on the RC time constant, where R is the resistor and C is the capacitor value in the equivalent circuit of the CNN; it can be seen from the figures that the CNN reaches steady state after 5RC, as expected. The templates in Equation (46) yield a stable CNN as they are symmetrical. Now, in order to satisfy criterion (45), the A template in Equation (46) is multiplied by 0.475 so that the sum of the absolute values of the A template elements is smaller than 1. The new template A is
A = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0.95 & 0 \\ 0 & 0 & 0 \end{bmatrix} \quad (48)
The B and I templates do not change in this application, because they do not affect the stability of the CNN. Since the centre element of the template A in Equation (48) is smaller than 1, the output images shown in Figure 6 are not binary, but the CNN is stable as shown in Figure 7. Since A has been changed, the new CNN may not be an edge detector anymore; however, the main aim of these applications being to investigate the stability of the CNN, comparing Figures 5 and 7 one can see that the AADV of the CNN is monotonically decreasing in Figure 7, whereas it can increase for some time during the transient, as in Figure 5.
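The rescaling step above can be written as a small helper. This sketch treats "sum of absolute template values smaller than 1" as the sufficient criterion, per the text; the centre value 2.0 used for the example template is an assumption about the original Equation (46), chosen so that scaling reproduces the 0.95 of Equation (48):

```python
import numpy as np

def scale_to_criterion(A, margin=0.95):
    """Rescale template A so that the sum of the absolute values of its
    elements falls below 1 (the sufficient criterion used in the text).
    Returns A unchanged if it already satisfies the criterion."""
    s = np.abs(A).sum()
    return A if s < 1.0 else A * (margin / s)

# edge-detector-style template with assumed centre 2.0 (cf. Equation (46))
A = np.array([[0, 0, 0], [0, 2.0, 0], [0, 0, 0]])
A_stable = scale_to_criterion(A)   # centre becomes 0.95, as in Equation (48)
```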
In the next application, in order to observe the AADV of an unstable CNN, the templates B and I remain the same, but the template A is replaced by a non-symmetrical one, designed as follows:
A = \begin{bmatrix} -0.25 & 0.50 & -1.00 \\ 1.50 & 2.00 & -1.75 \\ -1.25 & -0.75 & 0.25 \end{bmatrix} \quad (49)
Output images and AADV of the CNN with the new templates are given in Figures 8 and 9, respectively, showing that the AADV of the CNN does not decrease monotonically; after 10RC scaled time units from the beginning, the AADV is not zero and has no tendency to go to zero.
Figure 5. AADV of CNN for (a) Chinese character and (b) Lena.
Figure 6. CNN outputs with template A in Equation (48): (a) diamond; (b) Chinese character; (c) square; and (d) Lena.
In the next application, in order to make the CNN stable with non-symmetrical templates, A in Equation (49) has been divided by 10 to meet condition (45), since the absolute values of its elements sum to 9.25 and the scaled template therefore satisfies the criterion.
Figure 12. Outputs of CNNwTAF by using non-symmetrical template A as given by Equation (49) and
the parameters of TAF by Equation (51): (a) diamond; (b) Chinese character; (c) square; and (d) Lena.
Figure 15. AADV of CNN with TAF by using stable non-symmetrical template A in Equation (52) and the parameters of TAF in Equation (51) for: (a) Chinese character; and (b) Lena.
By using the same approach as in the previous examples, if the template A defined in Equation (52) is used, CNNwTAF is stable with this template since the sufficient stability criterion is met. For this template A, the output images and AADV of CNNwTAF are shown in Figures 18 and 19.
Figure 21. Original images which were used for XOR applications:
(a) CCDMASK; and (b) binary image of Lena.
3.2.1. Parity 2. This kind of CNN performs the XOR on adjacent pixels of a line. 'Given the four possible combinations of two adjacent pixels, the output will be white only if these adjacent pixels are not of the same kind; otherwise the output is black'. This task is linearly non-separable and cannot be realized with a single-layer standard uncoupled CNN. The templates and the activation function parameters are
A = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0.125 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \quad B = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0.25 & 0.1 \\ 0 & 0 & 0 \end{bmatrix}, \quad I = -0.075 \quad (54)

[\beta_1, \beta_2, \beta_3, \beta_4] = [-0.2, -0.1, 0.0, 0.1]
Two new input images, which have not been used in the previous applications of the paper, are illustrated in Figure 21. Using these templates, the outputs of CNNwTAF corresponding to these special input images are shown in Figure 22. Using the transpose of template B, everything else remaining the same, XOR on adjacent pixels of a vertical line is performed. The outputs for this task are shown in Figure 23.
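Why a single threshold fails while a trapezoid succeeds can be seen from the net input s = B∗u + I produced by the templates in Equation (54). The sketch below assumes the usual CNN coding of +1 for black and −1 for white (an assumption here) and ignores the small feedback term A for this back-of-envelope check:

```python
# net input s = 0.25*u_c + 0.1*u_r + I for the horizontal parity-2
# templates of Equation (54); pixel values are +/-1 (+1 = black assumed)
B_c, B_r, I = 0.25, 0.1, -0.075

def net_input(u_center, u_right):
    return B_c * u_center + B_r * u_right + I

same = [net_input(1, 1), net_input(-1, -1)]     # equal adjacent pixels
diff = [net_input(1, -1), net_input(-1, 1)]     # unequal adjacent pixels
# the unequal pairs fall strictly inside the interval spanned by the
# equal pairs, so no single threshold separates the two classes, but a
# trapezoidal activation (high inside a band, low outside) can
```

This interleaving of the two classes on the real line is exactly the linear non-separability the TAF breakpoints are chosen to resolve.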
Figure 22. CNN outputs for horizontal parity-2 applications: (a) square;
(b) diamond; (c) Lena; and (d) CCDMASK.
3.2.2. Parity 2 on three adjacent pixels. This kind of CNN performs the XOR operation on three adjacent pixels of a line according to 'given the three adjacent pixels of a line, the output will be white if the pixel at the centre and at least one of the neighbouring pixels are not of the same kind; otherwise, if all three pixels are of the same kind, the output is black'. This task is also linearly non-separable; choosing the templates and the activation function parameters as in Equation (55), the outputs for the same test input set are shown in Figure 24.
A = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0.125 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \quad B = \begin{bmatrix} 0 & 0 & 0 \\ 0.1 & 0.25 & 0.1 \\ 0 & 0 & 0 \end{bmatrix}, \quad I = -0.075 \quad (55)

[\beta_1, \beta_2, \beta_3, \beta_4] = [-0.3, -0.2, 0.1, 0.2]
If the template B is transposed, one can use these templates and the same activation function to perform the same operation on vertical lines.
Figure 24. CNN outputs for parity-2 considering 3-pixels in the same vertical line: (a) square; (b) diamond; (c) Lena; and (d) CCDMASK.
Figure 25. CNN outputs for a special linearly non-separable task application: (a) square; (b) diamond; (c) Lena; and (d) CCDMASK.
3.2.3. Detecting the location where homogeneity is spoiled. The last linearly non-separable task example is defined as 'given nine neighbouring pixels, one of them located at the centre and the other eight around the centre pixel forming a square, the output is black if all the pixels are of the same kind; otherwise it is white'. The templates and the activation function parameters are chosen as
A = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0.125 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \quad B = \begin{bmatrix} 0.1 & 0 & 0.1 \\ 0.1 & 0.25 & 0.1 \\ 0.1 & 0.1 & 0.1 \end{bmatrix}, \quad I = -0.075 \quad (56)

[\beta_1, \beta_2, \beta_3, \beta_4] = [-0.9, -0.8, 0.7, 0.8]
and the corresponding outputs are illustrated in Figure 25.
4. CONCLUSION
In this paper, a CNN with a new kind of non-linearity, namely the trapezoidal activation function (CNNwTAF), has been introduced; CNNwTAF is a generalization of the single perceptron-like cell with double threshold which was presented in References [11, 12]. A sufficient stability criterion has been obtained for CNNwTAF by transforming the 2D state equations of CNNwTAF into regular 1D state equations and then using fixed-point and Lyapunov stability theorems; that this stability criterion can also be applied to CNN with PWL activation function has also been shown. The theoretical results have been tested on various well-known examples.
Another improvement on the stability criterion is that stable templates can be designed without possessing the symmetry property. In particular, stable non-symmetrical templates will be very useful in texture classification and pattern recognition fields. Another advantage provided by CNNwTAF is the extra design flexibility made possible by the availability of tunable parameters of the activation function, which can be adjusted jointly with or separately from the template parameters. Finally, the advantage of CNNwTAF in providing a single-layer CNN capable of performing linearly non-separable tasks such as XOR and parity problems has been demonstrated with several examples, but it needs further elaboration and application to real-life problems such as anomaly detection in geophysics and early epileptic seizure warning in medicine.
REFERENCES
1. Sziranyi T, Csapodi M. Texture classification and segmentation by cellular neural network using genetic learning.
Research Report, Analogical and Neural Computing Systems Research Laboratory, Computer and Automation
Institute, Hungarian Academy of Sciences (MTA-SZTAKI), Budapest, Hungary, November 1, 1994.
2. Wang L, Pineda de Gyvez J, Sinencio ES. Time multiplexed color image processing based on a CNN with
cell-state outputs. IEEE Transactions on VLSI Systems 1998; 6(2):314–322.
3. Chen CW, Chen L. Cellular neural network architecture for Gibbs random field based image segmentation.
Journal of Electronic Imaging 1998; 7(1):45–51.
4. Chua LO, Yang L. Cellular neural networks: theory. IEEE Transactions on Circuits and Systems 1988;
35(10):1257–1272.
5. Chua LO, Yang L. Cellular neural networks: applications. IEEE Transactions on Circuits and Systems 1988;
35(10):1273–1288.
6. Civalleri PP, Gilli M. Some dynamic phenomena in delayed cellular neural networks. International Journal of
Circuit Theory and Applications 1994; 22(2):77–105.
7. Civalleri PP, Gilli M. On the dynamic behavior of two-cell cellular neural networks. International Journal of
Circuit Theory and Applications 1993; 21(5):451–471.
8. Guzelis C, Chua LO. Stability analysis of generalized cellular neural networks. International Journal of Circuit
Theory and Applications 1993; 21(1):1–33.
9. Zhou D-M, Zhang L-M. Global exponential stability of cellular neural networks. Proceedings of International
Conference on Neural Information Processing, ICONIP 2001, 8, Shanghai, China, November 14–18, 2001.
10. Majorana S, Chua LO. A unified framework for multilayer high order CNN. International Journal of Circuit
Theory and Applications 1998; 26(6):567–592.
11. Aksin YD, Aras S, Goknar IC. CMOS realization of user programmable, single-level, double-threshold
generalized perceptron. Proceedings of Turkish Artificial Intelligence and Neural Networks Conference,
TAINN-2000, Izmir, Turkey, June 2000.
12. Aksin YD, Aras S, Goknar IC. Realization and extensions of user programmable, single-level, double-threshold
generalized perceptron. Istanbul University Journal of Electrical and Electronics Engineering (IU-JEEE) 2001;
1(1):123–129.
13. Vidyasagar M. Nonlinear System Analysis. Electrical Engineering Series. Prentice-Hall: Englewood Cliffs, NJ,
1978.
14. Larsen R. Functional Analysis. Marcel Dekker Inc.: New York, 1973.
15. Hille E. Methods in Classical and Functional Analysis. Addison-Wesley: New Mexico, 1972.
16. Chua L, Roska T. Cellular Neural Networks and Visual Computing. Cambridge University Press: Cambridge,
MA, 2002.
17. Liao X, Yu J, Chen G. Novel stability criteria for bidirectional associative memory neural networks with time
delays. International Journal of Circuit Theory and Applications 2002; 30(5):519–546.
18. Zhang J. Absolutely exponential stability in delayed cellular neural networks. International Journal of Circuit
Theory and Applications 2002; 30(4):395–409.