Neurocomputing
journal homepage: www.elsevier.com/locate/neucom

Article history: Received 6 September 2011; received in revised form 14 June 2012; accepted 17 June 2012; available online 1 July 2012. Communicated by J. Liang.

Keywords: Cellular neural networks; Associative memories; Cloning template

Abstract: It is well known that the prototype patterns in associative memories can be represented by stable equilibrium points of cellular neural networks (CNNs). Therefore, the stability of the equilibrium points of CNNs is critical for associative memories based on CNNs. In this paper, some criteria for the stability of CNNs are established. In fact, these criteria give constraint conditions on the relationships among the parameters of CNNs. Compared with previous works, our results relax the conservatism of these relationships and extend the range of admissible parameter values. Two design procedures for the parameters of CNNs are given to achieve associative memories under our criteria. Finally, an example is given to verify the theoretical results and design procedures.

© 2012 Elsevier B.V. All rights reserved. 0925-2312/$ - see front matter. doi:10.1016/j.neucom.2012.06.017
Q. Han et al. / Neurocomputing 97 (2012) 192–200 193
CNNs to be stable were obtained by constructing Lyapunov functions [17–24], and these conditions generally made the equilibrium point globally asymptotically stable. However, some authors presented conditions under which equilibrium points are locally stable, and there generally were multiple equilibrium points [13,14,25–27].

In previous papers, the conditions for the stability of CNNs are conservative. For example, bias vectors were computed from one of all memory patterns in [7–9]; the relations between cloning templates were stronger and closer in [13–16]. Therefore, the aim of this paper is to relax the conservative relationships among the parameters of CNNs. To obtain our theories, we choose the initial states of the CNNs to be zero. Our theories extend the range of the values of the parameters of CNNs. When the inputs and outputs of a CNN are given, the values of the parameters can be obtained by our methods. In fact, if we get appropriate values of the parameters of CNNs, we can realize associative memories. Therefore, from the above theoretical analysis, we give design procedures for associative memories based on CNNs.

The remaining parts of this paper are organized as follows. In Section 2, a class of CNNs is given. In Section 3, the relationship among the parameters of CNNs is given by a theorem and some corollaries. These theories give some methods for obtaining the values of the parameters A, D and V of a CNN. In Section 4, two design procedures for associative memories and a flow chart for obtaining the parameters of a CNN are given. In Section 5, an example is given to verify the theoretical results and design procedures. Some conclusions are finally drawn in Section 6.

… methods, such as the method in [28]. In order to get the values of these two matrices, we need to transform these two matrices into two vectors. The methods will be shown in Section 3.2.

Let $\alpha=(\alpha_1,\alpha_2,\dots,\alpha_n)^T\in U^n=\{x\in\mathbb{R}^n \mid x_i=1\ \text{or}\ x_i=-1,\ i=1,2,\dots,n\}$ and $C(\alpha)=\{x\in\mathbb{R}^n \mid x_i\alpha_i>1,\ i=1,2,\dots,n\}$. Then, for $x\in C(\alpha)$, Eq. (2) can be rewritten as
$$\dot{x}=-Cx+A\alpha+DU+V. \qquad (3)$$
If $\beta$ is an equilibrium point of (3), then we have
$$\beta=C^{-1}(A\alpha+DU+V)\in C(\alpha). \qquad (4)$$

Lemma 1 ([7]). Suppose $\alpha=(\alpha_1,\alpha_2,\dots,\alpha_n)^T\in U^n$. If $\beta=(\beta_1,\beta_2,\dots,\beta_n)^T=C^{-1}(A\alpha+DU+V)\in C(\alpha)$, then $\beta$ is an asymptotically stable equilibrium point of (2).

Proof. Eq. (3) has a unique equilibrium point at $x_e=C^{-1}(A\alpha+DU+V)$, and $x_e=\beta\in C(\alpha)$ by assumption. Therefore, this equilibrium is also asymptotically stable, since Eq. (3) has all its n eigenvalues at $-c_i$, $i=1,2,\dots,n$. □

3. Main result

In this section, we will first give some theories about the stability of CNNs. Then, some methods are obtained on the basis of these theories for realizing associative memories based on CNNs.

3.1. Stability of CNNs
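The mechanism behind Lemma 1 can be checked numerically: inside the saturation region $C(\alpha)$ the dynamics (3) are affine with Jacobian $-C$, so the equilibrium (4) is asymptotically stable whenever it lies in $C(\alpha)$. A minimal sketch with made-up 3-cell values (not the paper's templates):

```python
import numpy as np

# Made-up 3-cell instance of Eq. (3): x' = -C x + A*alpha + D U + V on C(alpha).
C = np.diag([1.0, 1.0, 1.0])
A = np.array([[2.0, 0.2, 0.0],
              [0.2, 2.0, 0.2],
              [0.0, 0.2, 2.0]])
D = 0.5 * np.eye(3)
alpha = np.array([1.0, -1.0, 1.0])   # a candidate memory pattern in U^n
U = alpha.copy()                     # inputs matched to the pattern
V = np.zeros(3)

# Eq. (4): the unique equilibrium of (3) is beta = C^{-1}(A alpha + D U + V).
beta = np.linalg.solve(C, A @ alpha + D @ U + V)
assert np.all(beta * alpha > 1.0)    # beta lies in C(alpha), so Lemma 1 applies

# In C(alpha) the Jacobian of (3) is -C: all n eigenvalues are -c_i < 0.
assert np.all(np.linalg.eigvals(-C).real < 0)
print(np.round(beta, 4))             # [ 2.3 -2.1  2.3]
```

With these values, $\beta\alpha_i>1$ component-wise, so the pattern $\alpha$ is stored as a stable equilibrium, exactly the property the design procedures below exploit.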
(ii) If $\sum_{j=1}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i<-c_i$, then Eq. (5) converges to a negative stable equilibrium point, and the value of the negative equilibrium point is less than −1.

The proof of Corollary 2 is the same as that of Theorem 1. If we choose $v_i=0$ in Theorem 1, then we can get the following corollary. …

First, we discuss how to get the cloning templates A and D of a CNN. We can make use of the inputs and outputs of the cells in set R to get A and D. Because the values of $v_i$ ($i=1,2,\dots,n$) of the cells in set R are equal to zero, we choose all $v_i$ of the cells in set R equal to zero. Then, by Corollary 3 or 4, the relationship among $\sum_{j=1}^{n}a_{ij}\alpha_j$, $\sum_{j=1}^{n}d_{ij}u_j$ and $c_i$ ($i=1,\dots,n$) can be established. Then, we use Theorem … After transforming Eqs. (9)–(12), from $L_A$ and $L_D$ it is easy to obtain the parameters A and D.

Remark 2. When the matrices $\hat O'$, $\hat O''$ or $\hat X$ are irreversible, the values of $L_A$ or $L_D$ are approximate values.

Next, we discuss how to get the bias $v_i$ for a cell in sets P and Q. In set R, the value of $v_i$ of a cell is equal to zero. However, in sets P and Q, the values of $v_i$ ($O_i\in P,Q$) of a cell are not equal to zero. Therefore, we need to calculate these values. The parameters A and D can be obtained by Corollary 3 or 4. The ranges of $v_i$ ($i=1,\dots,n$) can be obtained by Theorem 1 or Corollary 1. From Theorem 1, we know that when $v_i>c_i-\sum_{j=1}^{n}a_{ij}\alpha_j-\sum_{j=1}^{n}d_{ij}u_j$, Eq. (5) converges to a positive stable equilibrium point, which means that the output of cell $O_i$ is positive. Because the sign of the outputs of all cells in set P is positive, we can choose $\hat v^{+}>\max_{1\le i\le n}\{c_i-\sum_{j=1}^{n}a_{ij}\alpha_j-\sum_{j=1}^{n}d_{ij}u_j\}$ as the bias of all cells in set P. Similarly, $\hat v^{-}<\min_{1\le i\le n}\{-c_i-\sum_{j=1}^{n}a_{ij}\alpha_j-\sum_{j=1}^{n}d_{ij}u_j\}$ is the bias of all cells in set Q. Then, the regions of $\hat v^{+}$ and $\hat v^{-}$ can be obtained by Theorem 1 or Corollary 1. We can consider the regions of $\hat v^{+}$ and $\hat v^{-}$ from two aspects as follows.

(i) If Theorem 1 is used to realize associative memories based on a CNN, we can get the following result:
If $\alpha_i=1$ in Eq. (5), we get $v_i(l)>c_i-\sum_{j=1}^{n}a_{ij}\alpha_j^l-\sum_{j=1}^{n}d_{ij}u_j^l=x_i'(l)$.
If $\alpha_i=-1$ in Eq. (5), we get $v_i(l)<-c_i-\sum_{j=1}^{n}a_{ij}\alpha_j^l-\sum_{j=1}^{n}d_{ij}u_j^l=x_i'(l)$.
Therefore, we choose
$$\hat v^{+}\ge \max_{1\le l\le m,\ O_i\in P}|\{x_i'(l)\}| \qquad (21)$$
as the bias of all cells in set P, and
$$\hat v^{-}\le -\max_{1\le l\le m,\ O_i\in Q}|\{x_i'(l)\}| \qquad (22)$$
as the bias of all cells in set Q.

(ii) If Corollary 1 is used to realize associative memories based on a CNN, we can get the following results:
If $\alpha_i=1$ in Eq. (5), we get $v_i(l)>-\sum_{j=1}^{n}a_{ij}\alpha_j^l-\sum_{j=1}^{n}d_{ij}u_j^l=x_i''(l)$.
If $\alpha_i=-1$ in Eq. (5), we get $v_i(l)<-\sum_{j=1}^{n}a_{ij}\alpha_j^l-\sum_{j=1}^{n}d_{ij}u_j^l=x_i''(l)$.
Therefore, we choose
$$\hat v^{+}\ge \max_{1\le l\le m,\ O_i\in P}|\{x_i''(l)\}| \qquad (23)$$
as the bias of all cells in set P, and
$$\hat v^{-}\le -\max_{1\le l\le m,\ O_i\in Q}|\{x_i''(l)\}| \qquad (24)$$
as the bias of all cells in set Q.

Remark 3. The cloning template D is computed from the cells in set R. Therefore, the values of D are not affected by the cells in sets P and Q, which can make the values of D more accurate than those in [15]. The bias vector is computed from the cells in sets P and Q, which makes the outputs of the cells in sets P and Q always correct in the process of associative memories.

Remark 4. In [15], the authors proved that associative memories based on CNNs are high-capacity (high-capacity means that a large number of memory patterns can be stored by a CNN). The model of this paper is similar to that of [15]; therefore, we do not prove the high-capacity property again.

4. Design procedure of a CNN

In this section, two design procedures for the parameters of a CNN are given by the above theories.

(i) If Theorem 1 and Corollary 3 are chosen to achieve associative memories, we give the following design procedure for the parameters of a CNN. We can use Corollary 3 to get the parameters A and D in set R and use Theorem 1 to get the biases $v_i$ ($i=1,\dots,n$) for the cells in sets P and Q.
Step 1. Denote a matrix $G=(\alpha^1,\alpha^2,\dots,\alpha^m)$ as the memory patterns of the associative memories, where $\alpha^i$ is a set of outputs of all cells in a CNN and m is the number of patterns. Denote an input matrix $U=(U^1,U^2,\dots,U^m)$ corresponding to the matrix of memory patterns G.
Step 2. Divide all cells of the CNN into three sets. If all outputs of a cell in all memory patterns are 1, the cell is classified into set P. If all outputs of a cell in all memory patterns are −1, the cell is classified into set Q. If the outputs of a cell in the memory patterns are both 1 and −1, the cell is classified into set R.
Step 3. Choose the biases $v_i$ ($O_i\in R$) equal to zero.
Step 4. Determine all state constants $c_{ij}$, $1\le i\le N$, $1\le j\le M$. Obtain the coefficient matrix C.
Step 5. Determine the matrix $\Lambda$ such that $\lambda_i>\max_{1\le i\le n}\{c_i\}$, and get the matrix $\Lambda'$.
Step 6. Compute the cloning template A by use of Eq. (14) and the relationship between the inputs and outputs of the cells in set R. Obtain the coefficient matrix A.
Step 7. Compute the cloning template D by use of Eq. (18) and the relationship between the inputs and outputs of the cells in set R. Obtain the coefficient matrix D.
Step 8. Compute $x_i'(l)$ ($O_i\in P$, $1\le l\le m$) in set P. Choose $\hat v^{+}\ge \max_{1\le l\le m,\ O_i\in P}|\{x_i'(l)\}|$ in terms of (21); the biases $v_i$ ($O_i\in P$) are equal to $\hat v^{+}$.
Step 9. Compute $x_i'(l)$ ($O_i\in Q$, $1\le l\le m$) in set Q. Choose $\hat v^{-}\le -\max_{1\le l\le m,\ O_i\in Q}|\{x_i'(l)\}|$ in terms of (22); the biases $v_i$ ($O_i\in Q$) are equal to $\hat v^{-}$.
Step 10. Synthesize the CNN with the connection weight matrices A, C, D and the bias vector V.
From the above ten steps, we give the flow chart in Fig. 1.

(ii) If Corollary 1 and Corollary 4 are chosen to realize associative memories, we give the following design procedure for a CNN:
Steps 1, 2, 3 and 4 of (ii) are the same as those of (i).
Step 5. Determine the matrix $\Lambda$ such that $\lambda_i>0$, and get the matrix $\Lambda'$.
Step 6. Determine $a_{00}$ such that $a_{00}\ge\max_{1\le i\le n}\{c_i\}$ holds.
Step 7. Compute the cloning template A from (16) in set R. Obtain the coefficient matrix A.
Step 8. Compute the cloning template D from (20) in set R. Obtain the coefficient matrix D.
Step 9. Compute $x_i''(l)$ ($O_i\in P$, $1\le l\le m$) in set P. Choose $\hat v^{+}>\max_{1\le l\le m,\ O_i\in P}|\{x_i''(l)\}|$ in terms of (23); the bias $v_i$ of all cells in set P is equal to $\hat v^{+}$.
Step 10. Compute $x_i''(l)$ ($O_i\in Q$, $1\le l\le m$) in set Q. Choose $\hat v^{-}<-\max_{1\le l\le m,\ O_i\in Q}|\{x_i''(l)\}|$ in terms of (24); the bias $v_i$ of all cells in set Q is equal to $\hat v^{-}$.
Step 11. Synthesize the CNN with the connection weight matrices A, C, D and the bias vector V.

5. Numerical examples

In this section, we will give some numerical simulations to verify the theoretical results in this paper.
Consider the same example introduced in [16]. The inputs and the output patterns of a CNN are represented by two pairs of (5×5)-pixel images shown in Figs. 1 and 2 (black pixel = 1, white pixel = −1), where the inputs of the CNN compose the word "MO" in Fig. 2(a), and the patterns to be memorized constitute the word "LS" in Fig. 2(b). We design all parameters of a CNN to realize associative memories for the patterns in Fig. 2 by use of design procedure (i) in Section 4.

Step 1. In terms of Fig. 2, we get the memory patterns $G=((1,1,1,\dots,1)^T,(1,1,1,\dots,1)^T)$ and the input matrix $U=((1,1,1,\dots,1)^T,(1,1,1,\dots,1)^T)$.
Step 2. From the memory patterns, all cells of the CNN can be divided into three sets, $P=\{O_2,O_7,O_{12},O_{22},O_{23},O_{24}\}$, $Q=\{O_1,O_5,O_6,O_8,O_9,O_{10},O_{11},O_{15},O_{16},O_{18},O_{20},O_{21},O_{25}\}$ and $R=\{O_3,O_4,O_{13},O_{14},O_{17},O_{19}\}$.
Step 3. Choose the biases of all cells in set R equal to zero, namely, $v_3=v_4=v_{13}=v_{14}=v_{17}=v_{19}=0$.
Step 4. Let $c_{ij}=1$, $1\le i\le N$, $1\le j\le M$; then we can obtain $C=\mathrm{diag}(1,1,\dots,1)_{n\times n}$.
Step 5. Let $\Lambda=\mathrm{diag}(4,4,\dots,4)_{n\times n}$; then we have $\Lambda'=\mathrm{diag}(4,4,\dots,4)_{nm\times nm}$.
Step 6. From Eq. (14), we get $L_A=(0,0,0,0,2,0,0,0,0)^T$. Then we have $A=\mathrm{diag}(2)_{n\times n}$.
Step 7. From Eq. (18), we get $L_D=(1.5682, 2.9394, 0.6136, 1.6515, 0.6136, 0.1212, 0.3864, 0.3258, 0.25)^T$. Then we can obtain D.
Step 8. Let $\hat v^{+}=20$ in set P. Therefore, $v_2=v_7=v_{12}=v_{22}=v_{23}=v_{24}=20$.
Step 9. Let $\hat v^{-}=-20$ in set Q. Therefore, $v_1=v_5=v_6=v_8=v_9=v_{10}=v_{11}=v_{15}=v_{16}=v_{18}=v_{20}=v_{21}=v_{25}=-20$.
Step 10. Synthesize the CNN with A, C, D and V.

… reduce many limitations for the relationship among cloning templates.

From the above ten steps, we can get a CNN which can realize associative memories from "MO" to "LS". When the inputs of the CNN are "O", we can get the time response curves of all cells in the CNN in Fig. 3. In Fig. 3, we find that the states of all cells become stable after some time. When the states of all cells are stable, the value of the equilibrium point is
$$x^{*}=(22.5671, 20.7718, 4.1968, 3.9240, 19.7945, 20.2415, 18.5446, 26.3927, 29.7259, 27.6048, 20.2415, 19.7718, 4.7119, 3.2877, 24.4686, 20.2415, 3.7271, 18.1355, 4.7119, 23.6959, 20.1657, 20.3248, 26.3700, 28.2790, 21.1051)^{T}.$$
Therefore, we know that all outputs of the CNN corresponding to the equilibrium point are
$$f(x^{*})=(1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1)^{T},$$
where the outputs of the CNN are the same as "S".
When the inputs of the CNN are "M", we can get the time response curves of all cells in the CNN in Fig. 4. Then the value of the equilibrium point is
$$x^{*}=(20.9309, 23.3777, 4.0755, 3.9543, 22.3247, 23.5141, 26.4382, 16.0296, 18.7113, 21.1657, \dots$$
Fig. 1. Flow chart about how to get parameters of a CNN by use of Theorem 1 and Corollary 3.
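The flow of Fig. 1 can be sketched in code. This is a sketch only: the helper names are ours, Steps 5–7 (the template computations via Eqs. (14) and (18)) are omitted, and only Step 2's classification into P, Q, R and the bias bounds (21)–(22) of Steps 3, 8 and 9 are implemented:

```python
import numpy as np

def classify_cells(G):
    """Step 2: split cells into P (+1 in every pattern), Q (-1 in every
    pattern) and R (mixed). G is n x m, one column per memory pattern."""
    P = [i for i in range(G.shape[0]) if np.all(G[i] == 1)]
    Q = [i for i in range(G.shape[0]) if np.all(G[i] == -1)]
    R = [i for i in range(G.shape[0]) if i not in P and i not in Q]
    return P, Q, R

def choose_biases(G, U, A, D, C, P, Q, margin=1.0):
    """Steps 3, 8, 9: v_i = 0 on R, a common v+ on P and v- on Q chosen past
    the bounds x_i'(l) = alpha_i*c_i - sum_j a_ij alpha_j^l - sum_j d_ij u_j^l
    (cf. Eqs. (21)-(22))."""
    v = np.zeros(G.shape[0])
    bounds = np.abs(np.sign(G) * np.diag(C)[:, None] - A @ G - D @ U)
    if P:
        v[P] = bounds[P, :].max() + margin      # v+ >= max |x_i'(l)| over P
    if Q:
        v[Q] = -(bounds[Q, :].max() + margin)   # v- <= -max |x_i'(l)| over Q
    return v

# Toy usage: 3 cells, 2 stored patterns (made-up values, not the "MO"/"LS" data).
G = np.array([[1.0, 1.0], [-1.0, -1.0], [1.0, -1.0]])
P, Q, R = classify_cells(G)
v = choose_biases(G, G.copy(), 2 * np.eye(3), 0.5 * np.eye(3), np.eye(3), P, Q)
print(P, Q, R, v)   # the R-type cell keeps bias 0 (Step 3)
```

The common biases for P and Q mirror the example's choice of a single $\hat v^{+}$ and a single $\hat v^{-}$ for all cells of each set.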
Fig. 3. Time response curves of all cells of the CNN when the inputs are "O".
Fig. 5. When the inputs of a CNN are contaminated by noise, the memory patterns can still be obtained: (a) inputs of the CNN and (b) outputs of the CNN (the memory patterns).
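The recall behaviour shown in Figs. 3–5 can be reproduced in miniature by integrating the CNN state equation directly. The 3-cell instance below is made up; it only mirrors the example's choices ($A=\mathrm{diag}(2)$, $C=I$, biases $\pm 20$ for P/Q cells, zero for an R cell) and assumes the standard piecewise-linear CNN output:

```python
import numpy as np

def f(x):
    """Standard piecewise-linear CNN output: f(x) = (|x+1| - |x-1|) / 2."""
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def simulate(A, D, C, u, v, x0, dt=0.01, steps=20000):
    """Forward-Euler integration of the CNN state equation
    x' = -C x + A f(x) + D u + v (cf. Eqs. (2)/(5))."""
    x = x0.copy()
    for _ in range(steps):
        x += dt * (-C @ x + A @ f(x) + D @ u + v)
    return x

# Made-up 3-cell network: one P-type cell (v = +20), one Q-type cell (v = -20),
# one R-type cell (v = 0), mirroring Steps 3, 8 and 9 of the example.
A = np.diag([2.0, 2.0, 2.0])
D = 0.5 * np.eye(3)
C = np.eye(3)
u = np.array([1.0, 1.0, -1.0])
v = np.array([20.0, -20.0, 0.0])

x_star = simulate(A, D, C, u, v, x0=np.zeros(3))
print(np.sign(f(x_star)))   # P cell -> +1, Q cell -> -1, R cell follows its input

# As in Fig. 5, a small perturbation of the inputs leaves the recall unchanged.
x_noisy = simulate(A, D, C, u + 0.2 * np.array([1.0, -1.0, 1.0]), v,
                   x0=np.zeros(3))
assert np.array_equal(np.sign(f(x_noisy)), np.sign(f(x_star)))
```

The large biases dominate the state equation for P- and Q-type cells, which is why the noisy inputs in the last lines do not change the recalled outputs.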
$$f(x^{*})=(1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1)^{T},$$
where the outputs of the CNN are the same as "L".
Therefore, the simulation results show that the CNN with the inputs ("MO") is able to recall the output patterns ("LS") effectively and efficiently.

Remark 5. We use the values in the example in [16] to design a CNN. When the inputs of the CNN are "MO", we find that the outputs of the CNN are not exactly the same as "LS". Therefore, our design methods are better than the methods of [16].

6. Conclusions

Acknowledgements

This work was supported in part by the National Natural Science Foundation of China under Grants 60973114, 61170249 and 61003247; in part by the Natural Science Foundation project of CQCSTC under Grants 2009BA2024 and 2010BB2284; in part by the State Key Laboratory of Power Transmission Equipment and System Security and New Technology, Chongqing University, under Grant 2007DA10512711206; in part by the Teaching and Research Program of Chongqing Education Committee (KJ110401); in part by the First Batch of Supporting Program for University Excellent Talents in Chongqing; and in part by the Research Project of Chongqing University of Science and Technology under Grant CK2011Z17.
Denote the expression of template A as follows:
$$A=\begin{bmatrix}
a_{-r,-r} & \cdots & a_{-r,0} & \cdots & a_{-r,r}\\
\vdots & \ddots & \vdots & \ddots & \vdots\\
a_{0,-r} & \cdots & a_{0,0} & \cdots & a_{0,r}\\
\vdots & \ddots & \vdots & \ddots & \vdots\\
a_{r,-r} & \cdots & a_{r,0} & \cdots & a_{r,r}
\end{bmatrix}_{(2r+1)\times(2r+1)}.$$
The definition of matrix D is similar to that of A.
The matrix $A=(a_{ij})_{n\times n}$ defined by (2) and composed of the template has the form
$$\begin{bmatrix}
A_1 & A_2 & 0 & 0 & \cdots & 0 & 0\\
A_3 & A_1 & A_2 & 0 & \cdots & 0 & 0\\
0 & A_3 & A_1 & A_2 & \cdots & 0 & 0\\
0 & 0 & A_3 & A_1 & \cdots & 0 & 0\\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
0 & 0 & 0 & 0 & \cdots & A_1 & A_2\\
0 & 0 & 0 & 0 & \cdots & A_3 & A_1
\end{bmatrix}_{n\times n},$$
where
$$A_1=\begin{bmatrix}
a_{0,0} & a_{0,1} & 0 & \cdots & 0 & 0\\
a_{0,-1} & a_{0,0} & a_{0,1} & \cdots & 0 & 0\\
0 & a_{0,-1} & a_{0,0} & \cdots & 0 & 0\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
0 & 0 & 0 & \cdots & a_{0,0} & a_{0,1}\\
0 & 0 & 0 & \cdots & a_{0,-1} & a_{0,0}
\end{bmatrix}_{M\times M},$$
$$A_2=\begin{bmatrix}
a_{1,0} & a_{1,1} & 0 & \cdots & 0 & 0\\
a_{1,-1} & a_{1,0} & a_{1,1} & \cdots & 0 & 0\\
0 & a_{1,-1} & a_{1,0} & \cdots & 0 & 0\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
0 & 0 & 0 & \cdots & a_{1,0} & a_{1,1}\\
0 & 0 & 0 & \cdots & a_{1,-1} & a_{1,0}
\end{bmatrix}_{M\times M}$$
and
$$A_3=\begin{bmatrix}
a_{-1,0} & a_{-1,1} & 0 & \cdots & 0 & 0\\
a_{-1,-1} & a_{-1,0} & a_{-1,1} & \cdots & 0 & 0\\
0 & a_{-1,-1} & a_{-1,0} & \cdots & 0 & 0\\
\vdots & \vdots & \vdots & \ddots & \vdots & \vdots\\
0 & 0 & 0 & \cdots & a_{-1,0} & a_{-1,1}\\
0 & 0 & 0 & \cdots & a_{-1,-1} & a_{-1,0}
\end{bmatrix}_{M\times M}.$$
The definition of the matrix $D=(d_{ij})_{n\times n}$ is similar to that of A.

Appendix B1. Proof of Theorem 1

In Eq. (5), there exists a unique equilibrium point
$$x_i^{*}=\Bigl(\sum_{j=1}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i\Bigr)\Big/c_i. \qquad (6)$$

(i) If $\sum_{j=1}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i>c_i$ and $x_i(0)=0$, we have $x_i^{*}>1$ in Eq. (6). Therefore, when $\sum_{j=1}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i>c_i$, Eq. (5) converges to a positive stable equilibrium point by Lemma 1, and the value of the positive equilibrium point is greater than 1.
(ii) If $\sum_{j=1}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i<-c_i$ and $x_i(0)=0$, we have $x_i^{*}<-1$ in Eq. (6). Therefore, when $\sum_{j=1}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i<-c_i$, Eq. (5) converges to a negative stable equilibrium point by Lemma 1, and the value of the negative equilibrium point is less than −1.
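Eq. (6) and cases (i)–(ii) of the proof are easy to check numerically; the values below are illustrative (chosen so that $a_{ii}/c_i\ge 1$ and the biases push each cell past the $\pm c_i$ threshold), not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
c = np.full(n, 1.5)                                    # state constants c_i > 0
A = rng.uniform(-0.5, 0.5, (n, n)) + 2.0 * np.eye(n)   # so a_ii / c_i >= 1
D = rng.uniform(-0.5, 0.5, (n, n))
alpha = np.array([1.0, -1.0, 1.0, -1.0])               # desired output pattern
u = alpha.copy()
v = 5.0 * alpha                     # biases past the +/- c_i thresholds

s = A @ alpha + D @ u + v           # numerator of Eq. (6)
x_star = s / c                      # the unique equilibrium, Eq. (6)

# Case (i): s_i > c_i gives x_i* > 1; case (ii): s_i < -c_i gives x_i* < -1.
assert np.all(s * alpha > c)        # each cell satisfies its case
assert np.all(x_star * alpha > 1.0) # equilibrium exceeds +/-1 with alpha's sign
print(np.round(x_star, 3))
```

Each component of $x^{*}$ lies beyond $\pm 1$ with the sign of the target pattern, which is exactly the saturation property the theorem asserts.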
(i) When $\alpha_i=1$, Eq. (6) can be rewritten as
$$x_i^{+}=a_{ii}/c_i+\Bigl(\sum_{j=1,j\ne i}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i\Bigr)\Big/c_i. \qquad (7)$$
When $a_{ii}/c_i\ge 1$, $\sum_{j=1,j\ne i}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i>0$ and $x_i(0)=0$, we have $x_i^{+}>1$.
(ii) When $\alpha_i=-1$, Eq. (6) can be rewritten as
$$x_i^{-}=-a_{ii}/c_i+\Bigl(\sum_{j=1,j\ne i}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i\Bigr)\Big/c_i. \qquad (8)$$
If $a_{ii}/c_i\ge 1$, $\sum_{j=1,j\ne i}^{n}a_{ij}\alpha_j+\sum_{j=1}^{n}d_{ij}u_j+v_i<0$ and $x_i(0)=0$, we have $x_i^{-}<-1$.

Appendix C. Notations

Suppose that there exists a set of memory patterns; these memory patterns can be written as a matrix $G=(\alpha^1,\alpha^2,\dots,\alpha^m)$, where $\alpha^i=(\alpha_1^i,\alpha_2^i,\dots,\alpha_n^i)^T\in U^n$ and $\alpha^i$ is a set of outputs of all cells of a CNN. Therefore, G has n rows and m columns. Choose a set of input patterns; these input patterns can be written as a matrix $U=(U^1,U^2,\dots,U^i,\dots,U^m)$, where $U^i=(u_1^i,u_2^i,\dots,u_j^i,\dots,u_n^i)^T$ and $u_j^i$ denotes the input data of the jth cell of a CNN in the ith memory pattern.

Then, we divide all cells of a CNN into three small sets on the basis of the memory patterns G. When all outputs of a cell of a CNN in all memory patterns are 1, we classify the cell into the set $P=\{O_a,O_b,\dots\}$, where $O_a$ denotes the ath cell in Eq. (2). When all outputs of a cell of the CNN in all memory patterns are −1, we classify the cell into the set $Q=\{O_c,O_d,\dots\}$. When the outputs of a cell of the CNN in different $\alpha^i$ are different, namely, the outputs can be 1 and −1, we classify the cell into the set $R=\{O_e,O_f,\dots\}$. We assume that the number of elements in set P is $k_1$, the number of elements in set Q is $k_2$, and the number of elements in set R is $k_3$.

Choose $l\in\{1,2,\dots,m\}$, $q\in\{1,2,\dots,n\}$ and $k\in\{R_1,R_2,\dots,R_{k_3}\}$. Next, we introduce some symbols as follows.

Let $L_A=\mathrm{diag}(a_{ii},a_{ii},\dots,a_{ii})_{n\times n}$ and $\Lambda=\mathrm{diag}(\lambda_1,\lambda_2,\dots,\lambda_n)_{n\times n}$, where $\lambda_i>0$;
$\tilde A=A-L_A$, $\hat A=(A_{R_1},A_{R_2},\dots,A_{R_{k_3}})^T$, $\hat D=(D_{R_1},D_{R_2},\dots,D_{R_{k_3}})^T$ and $\hat\Lambda=(\Lambda_{R_1},\Lambda_{R_2},\dots,\Lambda_{R_{k_3}})^T$, where $Y_{R_i}$ denotes the $R_i$th row of a matrix Y;
$L_D=(d_{-1,-1},d_{-1,0},d_{-1,1},d_{0,-1},d_{0,0},d_{0,1},d_{1,-1},d_{1,0},d_{1,1})$,
$L_A=(a_{-1,-1},a_{-1,0},a_{-1,1},a_{0,-1},a_{0,0},a_{0,1},a_{1,-1},a_{1,0},a_{1,1})$,
$$\Lambda'=\mathrm{diag}(\Lambda,\Lambda,\dots,\Lambda)_{(NMm)\times(NMm)};$$
$\hat F'=(F'_{R_1},F'_{R_2},\dots,F'_{R_{k_3}},F'_{NM+R_1},\dots,F'_{l\cdot NM+R_{k_3}},\dots,F'_{m\cdot NM+R_{k_3}})^T$,
$$x_q''(l)=-\sum_{j=1}^{n}a_{qj}\alpha_j^l-\sum_{j=1}^{n}d_{qj}u_j^l,$$
$F=((\tilde A\alpha^1)^T,(\tilde A\alpha^2)^T,\dots,(\tilde A\alpha^m)^T)^T$,
$\hat F''=(F''_{R_1},F''_{R_2},\dots,F''_{R_{k_3}},F''_{NM+R_1},\dots,F''_{l\cdot NM+R_{k_3}},\dots,F''_{m\cdot NM+R_{k_3}})^T$,
$$X_q^l=\begin{bmatrix}
0 & u^l_{(q-1)M+1} & u^l_{(q-1)M+2}\\
u^l_{(q-1)M+1} & u^l_{(q-1)M+2} & u^l_{(q-1)M+3}\\
u^l_{(q-1)M+2} & u^l_{(q-1)M+3} & u^l_{(q-1)M+4}\\
\vdots & \vdots & \vdots\\
u^l_{qM-2} & u^l_{qM-1} & u^l_{qM}\\
u^l_{qM-1} & u^l_{qM} & 0
\end{bmatrix}_{M\times 3},$$
$$X^l=\begin{bmatrix}
0 & X_1^l & X_2^l\\
X_1^l & X_2^l & X_3^l\\
X_2^l & X_3^l & X_4^l\\
\vdots & \vdots & \vdots\\
X_{N-2}^l & X_{N-1}^l & X_N^l\\
X_{N-1}^l & X_N^l & 0
\end{bmatrix}_{(NM)\times 9},\qquad
X=\begin{bmatrix}X^1\\ X^2\\ \vdots\\ X^m\end{bmatrix}_{(NMm)\times 9},$$
$\hat X=(X_{R_1},X_{R_2},\dots,X_{R_{k_3}},X_{NM+R_1},\dots,X_{l\cdot NM+R_j},\dots,X_{m\cdot NM+R_{k_3}})^T$,
$$O_q^l=\begin{bmatrix}
0 & \alpha^l_{(q-1)M+1} & \alpha^l_{(q-1)M+2}\\
\alpha^l_{(q-1)M+1} & \alpha^l_{(q-1)M+2} & \alpha^l_{(q-1)M+3}\\
\alpha^l_{(q-1)M+2} & \alpha^l_{(q-1)M+3} & \alpha^l_{(q-1)M+4}\\
\vdots & \vdots & \vdots\\
\alpha^l_{qM-2} & \alpha^l_{qM-1} & \alpha^l_{qM}\\
\alpha^l_{qM-1} & \alpha^l_{qM} & 0
\end{bmatrix}_{M\times 3},$$
$$O^l=\begin{bmatrix}
0 & O_1^l & O_2^l\\
O_1^l & O_2^l & O_3^l\\
O_2^l & O_3^l & O_4^l\\
\vdots & \vdots & \vdots\\
O_{N-2}^l & O_{N-1}^l & O_N^l\\
O_{N-1}^l & O_N^l & 0
\end{bmatrix}_{(NM)\times 9},\qquad
O=\begin{bmatrix}O^1\\ O^2\\ \vdots\\ O^m\end{bmatrix}_{(NMm)\times 9},$$
$$LO_q^l=\begin{bmatrix}
0&0&0&0&\alpha^l_{(q-1)M+1}&0&0&0&0\\
0&0&0&0&\alpha^l_{(q-1)M+2}&0&0&0&0\\
\vdots&\vdots&\vdots&\vdots&\vdots&\vdots&\vdots&\vdots&\vdots\\
0&0&0&0&\alpha^l_{qM}&0&0&0&0
\end{bmatrix},\qquad
LO^l=\begin{bmatrix}LO_1^l\\ LO_2^l\\ \vdots\\ LO_N^l\end{bmatrix},\qquad
LO=\begin{bmatrix}LO^1\\ LO^2\\ \vdots\\ LO^m\end{bmatrix},$$
$\bar O=O-LO$,
$\hat O'=(O_{R_1},O_{R_2},\dots,O_{R_{k_3}},O_{NM+R_1},\dots,O_{l\cdot NM+R_k},\dots,O_{m\cdot NM+R_{k_3}})^T$,
$\hat O''=(\bar O_{R_1},\bar O_{R_2},\dots,\bar O_{R_{k_3}},\bar O_{NM+R_1},\dots,\bar O_{l\cdot NM+R_k},\dots,\bar O_{m\cdot NM+R_{k_3}})^T$,
$\bar D=\Lambda'-G'$,
$\hat D=(D_{R_1},D_{R_2},\dots,D_{R_{k_3}},D_{NM+R_1},\dots,D_{l\cdot NM+R_k},\dots,D_{m\cdot NM+R_{k_3}})^T$.

References

[1] L.O. Chua, L. Yang, Cellular neural networks: theory, IEEE Trans. Circuits Syst. 35 (1988) 1257–1272.
[2] L.O. Chua, L. Yang, Cellular neural networks: applications, IEEE Trans. Circuits Syst. 35 (1988) 1273–1290.
[3] K.R. Crounse, L.O. Chua, Methods for image processing and pattern formation in cellular neural networks: a tutorial, IEEE Trans. Circuits Syst. (1995) 583–601.
[4] T. Sziranyi, J. Csicsvari, High-speed character recognition using a dual cellular neural network architecture (CNND), IEEE Trans. Circuits Syst. II 40 (1993) 223–231.
[5] M. Brucoli, L. Carnimeo, G. Grassi, Discrete-time cellular neural networks for associative memories with learning and forgetting capabilities, IEEE Trans. Circuits Syst. I 42 (1995) 396–399.
[6] A. Gionis, P. Indyk, R. Motwani, Similarity search in high dimensions via hashing, in: Proceedings of the 25th VLDB Conference, Edinburgh, Scotland, 1999.
[7] D.R. Liu, A.N. Michel, Cellular neural networks for associative memories, IEEE Trans. Circuits Syst. II 40 (1993) 119–121.
[8] D.R. Liu, A.N. Michel, Sparsely interconnected neural networks for associative memories with applications to cellular neural networks, IEEE Trans. Circuits Syst. II 41 (1994) 295–307.
[9] D.R. Liu, Cloning template design of cellular neural networks for associative memories, IEEE Trans. Circuits Syst. I: Fundam. Theory Appl. 44 (1997) 646–650.
[10] M. Brucoli, L. Carnimeo, G. Grassi, Discrete-time cellular neural networks for associative memories with learning and forgetting capabilities, IEEE Trans. Circuits Syst. I: Fundam. Theory Appl. 42 (1995) 396–399.
[11] G. Grassi, A new approach to design cellular neural networks for associative memories, IEEE Trans. Circuits Syst. I: Fundam. Theory Appl. 44 (1997) 835–838.
[12] G. Grassi, On discrete-time cellular neural networks for associative memories, IEEE Trans. Circuits Syst. I: Fundam. Theory Appl. 48 (2001) 107–111.
[13] Z.G. Zeng, D.S. Huang, Z.F. Wang, Memory pattern analysis of cellular neural networks, Phys. Lett. A 342 (2005) 114–128.
[14] Z.G. Zeng, D.S. Huang, Z.F. Wang, Pattern memory analysis based on stability theory of cellular neural networks, Appl. Math. Modelling 32 (2008) 112–121.
[15] Z.G. Zeng, J. Wang, Design and analysis of high-capacity associative memories based on a class of discrete-time recurrent neural networks, IEEE Trans. Syst. Man Cybern., Part B: Cybern. 38 (2008) 1525–1536.
[16] Z.G. Zeng, J. Wang, Associative memories based on continuous-time cellular neural networks designed using space-invariant cloning templates, Neural Networks 22 (2009) 651–657.
[17] M. Gilli, M. Biey, P. Checco, Equilibrium analysis of cellular neural networks, IEEE Trans. Circuits Syst. I: Regular Papers 51 (2004) 903–912.
[18] N. Takahashi, A new sufficient condition for complete stability of cellular neural networks with delay, IEEE Trans. Circuits Syst. I: Fundam. Theory Appl. 47 (2000) 793–799.
[19] X.M. Li, L.H. Huang, J.H. Wu, Further results on the stability of delayed cellular neural networks, IEEE Trans. Circuits Syst. I: Fundam. Theory Appl. 50 (2003) 1239–1242.
[20] X.X. Liao, J. Wang, Z.G. Zeng, Global asymptotic stability and global exponential stability of delayed cellular neural networks, IEEE Trans. Circuits Syst. II: Express Briefs 52 (2005) 403–409.
[21] X.X. Liao, Q. Luo, Z.G. Zeng, Y.X. Guo, Global exponential stability in Lagrange sense for recurrent neural networks with time delays, Nonlinear Anal. Real World Appl. 9 (2008) 1535–1557.
[22] C.D. Zheng, H.G. Zhang, Z.S. Wang, New delay-dependent global exponential stability criterion for cellular-type neural networks with time-varying delays, IEEE Trans. Circuits Syst. II: Express Briefs 56 (2009) 250–254.
[23] S.P. Xiao, X.M. Zhang, New globally asymptotic stability criteria for delayed cellular neural networks, IEEE Trans. Circuits Syst. II: Express Briefs 56 (2009) 659–663.
[24] C.D. Zheng, H.G. Zhang, Z.S. Wang, Improved robust stability criteria for delayed cellular neural networks via the LMI approach, IEEE Trans. Circuits Syst. II: Express Briefs 57 (2010) 41–45.
[25] W.H. Chen, W.X. Zheng, A new method for complete stability analysis of cellular neural networks with time delay, IEEE Trans. Neural Networks 21 (2010) 1126–1139.
[26] Z.G. Zeng, J. Wang, X.X. Liao, Stability analysis of delayed cellular neural networks described using cloning templates, IEEE Trans. Circuits Syst. I: Regular Papers 51 (2004) 2313–2324.
[27] Z.G. Zeng, J. Wang, Complete stability of cellular neural networks with time-varying delays, IEEE Trans. Circuits Syst. I: Regular Papers 53 (2006) 944–955.
[28] A. Knoblauch, Neural associative memory with optimal Bayesian learning, Neural Comput. 23 (2011) 1393–1451.

Qi Han received the BS degree in Computer Science and Technology from Shandong University (Weihai), China, in 2005, the MS degree in Computer Software and Theory from Chongqing University, China, in 2009, and the PhD degree in Computer Science and Technology from Chongqing University, China, in 2012. His current research interests cover chaos control and synchronization, cellular automata, neural networks, and associative memories.

Jun Peng received a PhD in Computer Software and Theory from Chongqing University in 2003, an MA in Computer System Architecture from Chongqing University in 2000, and a BSc in Applied Mathematics from Northeast University in 1992. From 1992 to the present he has worked at Chongqing University of Science and Technology. He was a visiting scholar in the Laboratory of Cryptography and Information Security at Tsukuba University, Japan, in 2004, and in the Department of Computer Science at California State University, Sacramento, in 2007. He has authored or coauthored over 50 peer-reviewed journal or conference papers and has served as a program committee member or session co-chair for over 10 international conferences. His current research interests are cryptography, chaos and network security.