Article

On global asymptotic stability of neural networks with discrete and distributed delays

Journal

PHYSICS LETTERS A
Volume 345, Issues 4-6, Pages 299-308

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.physleta.2005.07.025

Keywords

neural networks; distributed delays; discrete delays; Lyapunov-Krasovskii functional; global asymptotic stability; linear matrix inequality

In this Letter, the global asymptotic stability analysis problem is investigated for a class of neural networks with discrete and distributed time-delays. The aim is to establish easy-to-test conditions under which the networks are asymptotically stable. It is shown, via Lyapunov-Krasovskii stability theory, that the class of neural networks under consideration is globally asymptotically stable if a quadratic matrix inequality involving several parameters is feasible. Furthermore, a linear matrix inequality (LMI) approach is exploited to transform the stability analysis problem into a convex optimization problem, and sufficient conditions for the neural networks to be globally asymptotically stable are then derived in terms of an LMI, which can be readily solved with the Matlab LMI toolbox. Two numerical examples are provided to show the usefulness of the proposed global stability condition. (c) 2005 Elsevier B.V. All rights reserved.
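
For readers who want a concrete feel for how such an LMI-based test is checked numerically, the short sketch below poses a standard delay-independent Lyapunov-Krasovskii feasibility problem for a simple delayed neural network x'(t) = -A x(t) + W g(x(t - tau)) with activations satisfying |g_i(s)| <= |s| and g_i(0) = 0. This is only an illustration: the model, the matrices A and W, and the block inequality are generic textbook choices, not the specific model or conditions derived in the Letter, and it uses Python with CVXPY as a freely available stand-in for the Matlab LMI toolbox mentioned in the abstract.

    import numpy as np
    import cvxpy as cp

    # Illustrative data (not from the Letter): x'(t) = -A x(t) + W g(x(t - tau)),
    # with activations satisfying |g_i(s)| <= |s| and g_i(0) = 0.
    n = 2
    A = np.diag([3.0, 4.0])                  # positive diagonal self-feedback rates
    W = np.array([[0.5, -0.3],
                  [0.2,  0.4]])              # delayed connection weights

    P = cp.Variable((n, n), symmetric=True)  # Lyapunov matrix, P > 0
    q = cp.Variable(n)                       # diagonal entries of Q > 0
    Q = cp.diag(q)
    eps = 1e-6                               # small margin to enforce strict inequalities

    # Standard delay-independent block LMI:
    #   [ -A'P - P A + Q    P W ]
    #   [       W'P         -Q  ]  <  0,   with P > 0 and Q diagonal > 0.
    M = cp.bmat([[-A.T @ P - P @ A + Q, P @ W],
                 [W.T @ P,              -Q   ]])

    constraints = [P >> eps * np.eye(n),
                   q >= eps,
                   M << -eps * np.eye(2 * n)]

    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve(solver=cp.SCS)
    print("LMI feasible:", prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE))

If the solver reports feasibility, the matrices P and Q define a Lyapunov-Krasovskii functional V(x_t) = x(t)'P x(t) + integral over [t - tau, t] of g(x(s))'Q g(x(s)) ds whose derivative is negative along trajectories, which certifies global asymptotic stability for this illustrative model; the conditions derived in the Letter play the same role but additionally account for both the discrete and the distributed delay terms.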
