Article

Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations

Journal

PHYSICA D-NONLINEAR PHENOMENA
Volume 214, Issue 1, Pages 88-99

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.physd.2005.12.006

Keywords

discontinuous neural networks; global exponential stability; convergence in finite time; Lyapunov approach; generalized gradient; M-matrices and H-matrices

Abstract

The paper considers a class of additive neural networks where the neuron activations are modeled by discontinuous functions or by continuous non-Lipschitz functions. Some tools are developed which enable us to apply a Lyapunov-like approach to differential equations with discontinuous right-hand side modeling the neural network dynamics. The tools include a chain rule for computing the time derivative along the neural network solutions of a nondifferentiable Lyapunov function, and a comparison principle for this time derivative, which yields conditions for exponential convergence or convergence in finite time. By means of the Lyapunov-like approach, a general result is proved on global exponential convergence toward a unique equilibrium point of the neural network solutions. Moreover, new results on global convergence in finite time are established, which are applicable to neuron activations with jump discontinuities, or neuron activations modeled by means of continuous (non-Lipschitz) Hölder functions. © 2005 Elsevier B.V. All rights reserved.
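
To make the setting concrete, here is a minimal illustrative sketch (not code from the paper) of the kind of dynamics the abstract refers to: a forward-Euler simulation of an additive neural network dx/dt = -D x + T g(x) + I with the discontinuous activation g(v) = sign(v). The additive form and the sign activation are standard for this class of models; all numerical values below are hypothetical, chosen only so the trajectory visibly settles.

```python
# Illustrative sketch only: forward-Euler simulation of an additive
# neural network  dx/dt = -D x + T g(x) + I  with a discontinuous
# activation g(v) = sign(v). All parameter values are hypothetical.
import numpy as np

D = np.diag([1.0, 1.0])            # positive neuron self-inhibition rates
T = np.array([[-2.0, 0.5],
              [0.5, -2.0]])        # interconnections; strong negative diagonal
I = np.array([0.3, -0.2])          # constant external inputs
g = np.sign                        # discontinuous neuron activation

x = np.array([1.5, -1.0])          # initial state
dt, steps = 1e-3, 20_000
for _ in range(steps):
    x = x + dt * (-D @ x + T @ g(x) + I)   # explicit Euler step

print("state after", steps * dt, "time units:", x)
```

Because the right-hand side is discontinuous, solutions of the continuous-time system are understood in the Filippov sense; the explicit Euler scheme above merely chatters within O(dt) of the discontinuity surface, whereas the finite-time convergence results of the paper concern the exact continuous-time trajectories reaching the equilibrium.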
