Article

New results on global exponential stability of recurrent neural networks with time-varying delays

Journal

PHYSICS LETTERS A
Volume 352, Issues 4-5, Pages 371-379

Publisher

Elsevier Science BV
DOI: 10.1016/j.physleta.2005.12.031

Keywords

exponential stability; linear matrix inequality; recurrent neural networks; time-varying delays

This Letter provides new sufficient conditions for the existence, uniqueness, and global exponential stability of the equilibrium point of recurrent neural networks with time-varying delays, obtained by employing Lyapunov functions together with the Halanay inequality. The time-varying delays are not required to be differentiable. Both Lipschitz-continuous activation functions and monotone nondecreasing activation functions are considered. The derived stability criteria are expressed as linear matrix inequalities (LMIs), which can be checked efficiently with recently developed LMI solvers. Furthermore, the proposed stability results are less conservative than some previous ones in the literature, as demonstrated through numerical examples. (c) 2005 Elsevier B.V. All rights reserved.
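The Letter's specific LMI criteria are not reproduced in this abstract. As a generic illustration of how Lyapunov-type matrix conditions of this kind are verified numerically, the sketch below checks the simplest such condition, A^T P + P A < 0 with P > 0, by solving a Lyapunov equation with SciPy. The example matrices are illustrative assumptions, not taken from the paper's numerical examples.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def lyapunov_certificate(A, Q=None):
    """Return (is_stable, P), where P solves A^T P + P A = -Q.

    A symmetric positive definite solution P certifies exponential
    stability of x' = A x. This is the simplest instance of the LMI
    feasibility checks used in such stability criteria; the delayed
    criteria in the Letter involve larger block LMIs of the same flavor.
    """
    n = A.shape[0]
    Q = np.eye(n) if Q is None else Q
    # solve_continuous_lyapunov(a, q) solves a X + X a^T = q,
    # so passing A.T yields A^T P + P A = -Q.
    P = solve_continuous_lyapunov(A.T, -Q)
    P = (P + P.T) / 2  # symmetrize against round-off
    is_stable = bool(np.all(np.linalg.eigvalsh(P) > 0))
    return is_stable, P

# Illustrative 2x2 system (hypothetical, not from the paper): A is Hurwitz.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
stable, P = lyapunov_certificate(A)
```

For the full delay-dependent criteria, the same feasibility question would be posed to a semidefinite-programming solver over the block LMI variables rather than solved in closed form.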
