4.1 Article

Global asymptotic stability of recurrent neural networks with multiple time-varying delays

Journal

IEEE Transactions on Neural Networks
Volume 19, Issue 5, Pages 855-873

Publisher

Institute of Electrical and Electronics Engineers (IEEE) Inc.
DOI: 10.1109/TNN.2007.912319

Keywords

recurrent neural networks; global asymptotic stability; multiple time-varying delays; linear matrix inequality (LMI); Lyapunov-Krasovskii functional

Funding

  1. National Natural Science Foundation of China [60534010, 60572070, 60521003, 60774048, 60728307]
  2. National High Technology Research and Development Program of China [2006AA04ZI83]
  3. Program for Changjiang Scholars and Innovative Research Groups of China [60521003]

Abstract

In this paper, several sufficient conditions are established for the global asymptotic stability of recurrent neural networks with multiple time-varying delays. The Lyapunov-Krasovskii stability theory for functional differential equations and the linear matrix inequality (LMI) approach are employed in the analysis. The obtained conditions are shown to generalize some previously published results and to be less conservative than existing criteria. They are also applied to recurrent neural networks with constant time delays.
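For readers unfamiliar with the LMI approach mentioned above, the sketch below shows how a stability condition of this general kind is checked numerically. It is not the paper's actual criteria: it applies the classical Lyapunov-Krasovskii functional V = x(t)'Px(t) + ∫_{t-τ(t)}^{t} x(s)'Qx(s) ds to a linear system with a single time-varying delay, ẋ(t) = A0 x(t) + A1 x(t-τ(t)), and tests feasibility of the resulting LMI with CVXPY. The matrices A0, A1 and the delay-derivative bound mu are made-up example values, not data from the paper.

```python
import numpy as np
import cvxpy as cp

# Illustrative system data (hypothetical values, not taken from the paper)
A0 = np.array([[-2.0,  0.3],
               [ 0.1, -1.5]])   # non-delayed dynamics
A1 = np.array([[ 0.4, -0.2],
               [ 0.3,  0.1]])   # delayed dynamics
mu = 0.5                        # assumed bound on the delay derivative: tau_dot(t) <= mu < 1

n = A0.shape[0]
P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# LMI obtained from V = x'Px + int_{t-tau(t)}^{t} x(s)'Q x(s) ds:
# [ A0'P + P A0 + Q      P A1        ]
# [ A1'P                -(1 - mu) Q  ]  <  0,   with P > 0, Q > 0.
lmi = cp.bmat([[A0.T @ P + P @ A0 + Q, P @ A1],
               [A1.T @ P,              -(1 - mu) * Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve(solver=cp.SCS)
print("LMI feasible (global asymptotic stability certified):",
      prob.status == cp.OPTIMAL)
```

Feasibility of this LMI is a delay-independent (but delay-rate-dependent) sufficient condition; the paper's criteria for recurrent networks with multiple time-varying delays are considerably more general than this linear single-delay example.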
