Article

Global exponential stability in Lagrange sense for neutral type recurrent neural networks

Journal

NEUROCOMPUTING
Volume 74, Issue 4, Pages 638-645

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2010.10.001

Keywords

Recurrent neural networks; Lagrange stability; Global exponential attractivity; Delays

Funding

  1. National Natural Science Foundation of China [60874110, 60974021]


In this paper, the global exponential stability in the Lagrange sense for continuous neutral-type recurrent neural networks (NRNNs) with multiple time delays is studied. Three types of activation functions are considered: general bounded activation functions and two classes of sigmoid activation functions. By constructing appropriate Lyapunov functions, some easily verifiable criteria for the ultimate boundedness and global exponential attractivity of NRNNs are obtained. These results can be applied to monostable and multistable neural networks, as well as to chaos control and chaos synchronization. (c) 2010 Elsevier B.V. All rights reserved.
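
For context, a typical neutral-type delayed recurrent neural network of the kind the abstract refers to (a generic sketch of the model class; the precise system and notation in the paper may differ) can be written as

\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t-\tau_{ij})) + \sum_{j=1}^{n} e_{ij} \dot{x}_j(t-\tau_{ij}) + u_i, \qquad i = 1, \dots, n,

where the coefficients e_{ij} multiplying the delayed state derivatives \dot{x}_j(t-\tau_{ij}) are what make the network neutral-type. Global exponential stability in the Lagrange sense then asks for a compact, globally exponentially attractive set \Omega: for every solution x(t) there exist constants M(x_0) > 0 and \alpha > 0 such that

\operatorname{dist}(x(t), \Omega) \le M(x_0)\, e^{-\alpha t}, \qquad t \ge 0,

so all trajectories are ultimately bounded and converge to \Omega at an exponential rate, without requiring a unique equilibrium, which is why such criteria also cover multistable networks.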

