A Note on Stability of Analog Neural Networks with Time Delays

Y. J. Cao and Q. H. Wu

Abstract—This note presents a generalized sufficient condition which guarantees stability of analog neural networks with time delays. The condition is derived using a Lyapunov functional, and the stability criterion is stated as follows: the equilibrium of analog neural networks with delays is globally asymptotically stable provided that the product of the norm of the connection matrix and the maximum neuronal gain is less than one.

… oscillations, are considered and analyzed. The networks' learning property affected by the time delays has also been studied in [4].

Marcus and Westervelt have investigated the case in which $C_i = C$, $R_i = R$, and the $T_{ij}$ are symmetric in (1). By rescaling time, delay, and $T_{ij}$, the new variables $t' = t/(RC)$, $\tau' = \tau/(RC)$, and $J_{ij} = R T_{ij}$ are obtained. Dropping the primes without losing generality, linearizing $f_j(u_j(t - \tau))$ around the equilibrium gives

$$\frac{du_i}{dt} = -u_i + \sum_{j=1}^{N} J_{ij}\, u_j(t - \tau).$$
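The linearized delayed dynamics above can be checked numerically. Below is a minimal sketch using forward-Euler integration with a single common delay; the random connection matrix, its scaling to spectral norm 0.9, and the step sizes are illustrative assumptions, not values from the note:

```python
import numpy as np

# Forward-Euler simulation of the linearized delayed network
#   du_i/dt = -u_i + sum_j J_ij * u_j(t - tau)
# with one common delay tau (illustrative setup).

rng = np.random.default_rng(0)
N, tau, dt, T = 4, 1.0, 0.01, 40.0

J = rng.standard_normal((N, N))
J *= 0.9 / np.linalg.norm(J, 2)        # scale so that ||J||_2 = 0.9 < 1

steps = int(T / dt)
lag = int(tau / dt)
u = np.zeros((steps + 1, N))
u[: lag + 1] = rng.standard_normal(N)  # constant initial history on [-tau, 0]

for k in range(lag, steps):
    u[k + 1] = u[k] + dt * (-u[k] + J @ u[k - lag])

# With ||J||_2 < 1 the trajectory decays toward the origin.
print(np.linalg.norm(u[lag]), np.linalg.norm(u[-1]))
```

Rescaling the matrix so that $\|J\|_2 \ge 1$ instead can produce trajectories that no longer decay, which is consistent with the sufficiency (not necessity) of the criterion.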
Based on the above lemma, let us consider the continuous Hopfield neural networks with time delays. Assuming $C_i = C$ and $R_i = R$, (1) can be rewritten as system (8).

Furthermore, based on the Cauchy inequality, we have

$$u^T(t)\, J v \le \|u(t)\| \, \|J\|_2 \, \|u(t,\tau)\| \qquad (12)$$

where $\|u(t,\tau)\| = \bigl(\sum_{i=1}^{N} u_i^2(t-\tau_i)\bigr)^{1/2}$.

Since $X_0 = J Y_0$, it follows that $X_0^T X_0 = X_0^T J Y_0$ and $\|X_0\|^2 \le \|X_0\|\,\|J\|_2\,\|Y_0\|$. Because $\|f_\beta(X_0)\| \le \beta\|X_0\|$, we also have $\|Y_0\| \le \beta\|X_0\|$, and therefore

$$\|X_0\|^2 \le \beta\,\|J\|_2\,\|X_0\|^2,$$

which, if $\|X_0\| \ne 0$, results in $\beta\|J\|_2 \ge 1$, contradicting the assumption. Thus $\|X_0\| = 0$, that is, $X_0 = 0$, and the origin is the unique equilibrium of system (8). The equilibrium of system (8) is asymptotically stable.

An example is given as follows to show the above corollary:

$$J = \begin{bmatrix} -1 & 0 & -1 \\ 0 & -1 & -1 \\ -1 & -1 & -1 \end{bmatrix}.$$

The eigenvalues of the matrix $J$ are $\lambda_{1,2} = -1 \pm \sqrt{2}$ and $\lambda_3 = -1$. According to the theorem, if $\beta\|J\|_2 < 1$, then the equilibrium of system (8) is unique and asymptotically stable with respect to arbitrary delays.
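The corollary's condition can be verified numerically for the example matrix; a short sketch, assuming the example matrix's hard-to-read last entry is $-1$ (consistent with the stated eigenvalue $\lambda_3 = -1$):

```python
import numpy as np

# Example connection matrix from the corollary (last entry assumed -1).
J = np.array([[-1.0,  0.0, -1.0],
              [ 0.0, -1.0, -1.0],
              [-1.0, -1.0, -1.0]])

eigvals = np.linalg.eigvalsh(J)       # J is symmetric, so real eigenvalues
spectral_norm = np.linalg.norm(J, 2)  # equals max |eigenvalue| here

print(eigvals)                        # -1-sqrt(2), -1, -1+sqrt(2)
print(spectral_norm)                  # 1 + sqrt(2), about 2.414

# Stability condition of the theorem: beta * ||J||_2 < 1, i.e. the
# maximum neuronal gain must satisfy beta < 1/(1 + sqrt(2)).
beta_max = 1.0 / spectral_norm
print(beta_max)                       # about 0.414
```

The check confirms the eigenvalues quoted in the example and makes the admissible range of the neuronal gain explicit.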
Let $\phi = (\phi_1, \phi_2, \ldots, \phi_N)^T$, let $a(r) = r^2$, and let $V$ be the corresponding Lyapunov functional. Then $a(r)$ tends to $+\infty$ as $r \to \infty$, and obviously $a(\|\phi(0)\|) \le V(\phi)$. Differentiating the $V$ functional with respect to (8) gives (10).

In (10), let $v = (f_1(u_1(t-\tau_1)), f_2(u_2(t-\tau_2)), \ldots, f_N(u_N(t-\tau_N)))^T$ and $u(t) = (u_1(t), u_2(t), \ldots, u_N(t))^T$; then the second term on the right-hand side of (10) becomes $u^T(t) J v$. The neural network (8) will be unconditionally stable if $u^T(t) J v$ is negative. Suppose it is positive; then, linearizing the $f_i$ and writing $\beta$ for the maximum neuronal gain, we obtain the bound in (12).

III. CONCLUSION

This note presents a generalized sufficient condition which guarantees stability of analog neural networks with time delays. The stability criterion can be described as follows: the equilibrium of analog neural networks with time delays is globally asymptotically stable provided that the product of the norm of the connection matrix and the maximum neuronal gain is less than one. The condition provides a handy assessment of the stability of neural networks with time delays and can be used to design stable analog neural networks in practical applications.

REFERENCES

[1] C. M. Marcus and R. M. Westervelt, "Stability of analog neural networks with delay," Phys. Rev. A, vol. 39, no. 2, pp. 347-359, 1989.
[2] J. S. Denker, "Neural network for computing," in Proc. Conf. Neural Networks for Comput., Snowbird, UT, 1986.
[3] C. M. Marcus, F. R. Waugh, and R. M. Westervelt, "Nonlinear dynamics and stability of analog neural networks," Physica D, vol. 51, pp. 234-247, 1991.
[4] P. Baldi and A. F. Atiya, "How delays affect neural dynamics and learning," IEEE Trans. Neural Networks, vol. 5, pp. 612-621, July 1994.
[5] L. Wang, E. E. Pichler, and J. Ross, "Oscillations and chaos in neural networks: An exactly solvable model," Proc. Nat. Academy Sci. USA, vol. 87, Dec. 1990, pp. 9467-9471.
[6] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proc. Nat. Academy Sci. USA, vol. 79, no. 2, 1982, pp. 2554-2558.
[7] J. J. Hopfield and D. W. Tank, "Neural computation of decisions in optimization problems," Biol. Cybern., vol. 52, pp. 141-152, 1985.
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 7, NO. 6, NOVEMBER 1996    1535
[8] J. J. Hopfield, "Neurons with graded response have collective computational properties like those of two-state neurons," Proc. Nat. Academy Sci. USA, vol. 81, no. 10, 1984, pp. 3088-3092.
[9] U. an der Heiden, Analysis of Neural Networks. New York: Springer-Verlag, 1980.
[10] J. K. Hale, "Nonlinear oscillations in equations with delays," in Nonlinear Oscillations in Biology, Lecture Notes in Applied Mathematics, vol. 17, F. C. Hoppensteadt, Ed. Providence, RI: Amer. Math. Soc., 1979.
[11] S. S. Wang, B. S. Chen, and T. P. Lin, "Robust stability of uncertain time-delay systems," Int. J. Contr., vol. 46, no. 3, pp. 963-976, 1987.
[12] E. Niebur, H. G. Schuster, and D. M. Kammen, "Collective frequencies and metastability in networks of limit-cycle oscillators with time delay," Phys. Rev. Lett., vol. 67, no. 20, pp. 2753-2756, 1991.
[13] J. K. Hale, Introduction to Functional Differential Equations. New York: Springer-Verlag, 1993.

Ultimate Performance of QEM Classifiers

Pierre Comon and Georges Bienvenu

Abstract—Supervised learning of classifiers often resorts to the minimization of a quadratic error, even if this criterion is more especially matched to nonlinear regression problems. It is shown that the mapping built by a quadratic error minimization (QEM) tends to output the Bayesian discriminating rules even with nonuniform losses, provided the desired responses are chosen accordingly. This property is for instance shared by the multilayer perceptron (MLP). It is shown that their ultimate performance can be assessed with finite learning sets by establishing links with kernel estimators of density.

… denoted $P_h$. These notations will be subsequently assumed. The Bayesian approach is known to be able to detect new classes, but this will not be debated in the present letter. Also note that the true classes are not assumed to be disjoint, so that the ideal classifier may have a nonzero misclassification rate (it does not bear overfitting).

In the classification context, the quadratic error minimization (QEM) criterion consists of minimizing over the learning set a gap between the desired responses and the outputs of the parameterized mapping $\Phi(W, \cdot)$.

The output space is assumed to be provided with a norm, and it is assumed throughout that $T = \mathbb{R}^m$. Many neural networks dedicated to classification proceed this way, and the numerous algorithms proposed in the literature actually aim at reaching the same goal.

The matter presented in this letter has already been published in a French conference [5]. At the same time, results related to the asymptotical performance of the multilayer perceptron (MLP) have been independently published in this journal [10]. One can also note that, historically, the asymptotical performance of the MLP was derived earlier [1], but the proof relied heavily on the numerical algorithm utilized. It has been established in [9] that the probabilities of misclassification are minimized when data samples are infinite and when losses are uniform. The scope of the paper is to show that similar results hold true for nonuniform losses, and for finite databases when noisy replicates are fed infinitely many times into the network. The statements presented are valid for general QEM classifiers independently of the exact form of the learning algorithm.
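The central claim of the letter, that quadratic error minimization with suitably coded desired responses drives the outputs toward the Bayesian discriminating rule, can be illustrated on a toy problem. The sketch below uses a discrete input, uniform losses, and 0/1 target coding, all illustrative assumptions rather than the letter's construction; with a discrete input, the QEM solution is simply the per-cell mean of the desired responses, which estimates the class posterior:

```python
import numpy as np

# Toy illustration: minimizing a quadratic error with 0/1 desired
# responses makes the fitted output approach the class posterior
# probability P(class = 1 | x), i.e. the Bayesian discriminating rule.

rng = np.random.default_rng(1)
n = 200_000

x = rng.integers(0, 3, size=n)               # input takes values 0, 1, 2
p1 = np.array([0.1, 0.4, 0.9])               # true P(class = 1 | x)
y = (rng.random(n) < p1[x]).astype(float)    # 0/1 class labels

# One free output value per input cell: the quadratic-error minimizer
# is the per-cell mean of the desired responses.
outputs = np.array([y[x == v].mean() for v in range(3)])

print(outputs)        # close to [0.1, 0.4, 0.9]
print(outputs > 0.5)  # thresholding at 1/2 recovers the Bayes decision
```

For nonuniform losses, the letter's point is that the same mechanism applies once the desired responses are rescaled accordingly, so the decision threshold moves away from 1/2.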