Journal
PHYSICA D-NONLINEAR PHENOMENA
Volume 214, Issue 1, Pages 88-99
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.physd.2005.12.006
Keywords
discontinuous neural networks; global exponential stability; convergence in finite time; Lyapunov approach; generalized gradient; M-matrices and H-matrices
Abstract
The paper considers a class of additive neural networks in which the neuron activations are modeled by discontinuous functions or by continuous non-Lipschitz functions. Tools are developed that enable a Lyapunov-like approach to be applied to the differential equations with discontinuous right-hand side that model the neural network dynamics. The tools include a chain rule for computing the time derivative of a nondifferentiable Lyapunov function along the neural network solutions, and a comparison principle for this time derivative that yields conditions for exponential convergence or convergence in finite time. By means of the Lyapunov-like approach, a general result is proved on global exponential convergence of the neural network solutions toward a unique equilibrium point. Moreover, new results on global convergence in finite time are established, which apply to neuron activations with jump discontinuities or to neuron activations modeled by continuous (non-Lipschitz) Hölder functions. (c) 2005 Elsevier B.V. All rights reserved.
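For orientation, a minimal sketch of the model class and of the comparison estimates mentioned in the abstract, written in notation commonly used for additive networks rather than the paper's own symbols; the matrices $D$ and $T$, the input $I$, and the constants $c$ and $\alpha$ below are illustrative assumptions, not values from the paper. The network dynamics take the form

\[
\dot{x}(t) = -D\,x(t) + T\,\gamma(x(t)) + I,
\]

where $D$ is a diagonal matrix of positive self-inhibition rates, $T$ is the interconnection matrix, $I$ is a constant input, and $\gamma$ is a diagonal activation map that may have jump discontinuities or be non-Lipschitz, so solutions are understood in a generalized (differential-inclusion) sense. For a possibly nondifferentiable Lyapunov function $V$ evaluated along a solution, a comparison principle of the kind described converts differential inequalities into convergence rates:

\[
\dot{V}(t) \le -c\,V(t),\quad c>0 \;\Rightarrow\; V(t) \le V(0)\,e^{-ct} \qquad \text{(global exponential convergence)},
\]
\[
\dot{V}(t) \le -c\,V(t)^{\alpha},\quad 0 \le \alpha < 1 \;\Rightarrow\; V(t) = 0 \ \text{for all } t \ge \frac{V(0)^{1-\alpha}}{c\,(1-\alpha)} \qquad \text{(convergence in finite time)}.
\]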