Article

Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations

Journal

Physica D: Nonlinear Phenomena
Volume 214, Issue 1, Pages 88-99

Publisher

Elsevier Science BV
DOI: 10.1016/j.physd.2005.12.006

Keywords

discontinuous neural networks; global exponential stability; convergence in finite time; Lyapunov approach; generalized gradient; M-matrices and H-matrices

Abstract

The paper considers a class of additive neural networks where the neuron activations are modeled by discontinuous functions or by continuous non-Lipschitz functions. Some tools are developed which enable us to apply a Lyapunov-like approach to differential equations with discontinuous right-hand side modeling the neural network dynamics. The tools include a chain rule for computing the time derivative along the neural network solutions of a nondifferentiable Lyapunov function, and a comparison principle for this time derivative, which yields conditions for exponential convergence or convergence in finite time. By means of the Lyapunov-like approach, a general result is proved on global exponential convergence toward a unique equilibrium point of the neural network solutions. Moreover, new results on global convergence in finite time are established, which are applicable to neuron activations with jump discontinuities, or neuron activations modeled by means of continuous (non-Lipschitz) Hölder functions. © 2005 Elsevier B.V. All rights reserved.
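
For readers unfamiliar with the comparison-principle arguments the abstract alludes to, the following is a minimal sketch of the two standard differential inequalities involved; the symbols V, alpha, c, and theta below are illustrative and not taken from the paper. If a (possibly nondifferentiable) Lyapunov function V satisfies, along the network's solutions,

\dot{V}(t) \le -\alpha\, V(t), \qquad \alpha > 0,

then the comparison principle gives V(t) \le V(0)\, e^{-\alpha t}, which is the mechanism behind global exponential convergence. If instead V obeys the Hölder-type bound

\dot{V}(t) \le -c\, V(t)^{\theta}, \qquad c > 0,\ \ 0 \le \theta < 1,

then integrating \frac{d}{dt}\, V(t)^{1-\theta} = (1-\theta)\, V(t)^{-\theta}\, \dot{V}(t) \le -c\,(1-\theta) yields V(t)^{1-\theta} \le V(0)^{1-\theta} - c\,(1-\theta)\, t, so V reaches zero no later than the finite settling time

t^{*} \le \frac{V(0)^{1-\theta}}{c\,(1-\theta)}.

The exponent \theta < 1, reflecting discontinuous or non-Lipschitz (Hölder) activations, is what turns merely exponential decay into convergence in finite time.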
