Article

Global exponential stability in Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions via LMI approach

Journal

NONLINEAR ANALYSIS-REAL WORLD APPLICATIONS
Volume 12, Issue 4, Pages 2174-2182

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.nonrwa.2010.12.031

Keywords

Neural network; Lagrange exponential stability; Globally exponentially attractive set; Halanay delay differential inequality; LMI approach

Funding

  1. Graduate Scientific Research Creative Foundation of China Three Gorges University [200946, 200945]
  2. National Natural Science Foundation, China [61074091, 60974136]
  3. Hubei Provincial Department of Education [T200809]
  4. Hubei Province Natural Science Foundation [2008CDB316, 2008CDZ046]


In this paper, we study the global exponential stability in the Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions. Under the assumption that the activation functions need not be bounded, monotonic, or differentiable, several algebraic criteria in linear matrix inequality (LMI) form for the global exponential stability in the Lagrange sense of the neural networks are obtained by means of Lyapunov functions and the Halanay delay differential inequality. Meanwhile, estimates of the globally exponentially attractive sets are given. The results derived here are more general than those in the existing references. Finally, two examples are given and analyzed to demonstrate our results. (C) 2011 Elsevier Ltd. All rights reserved.
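For context, the Halanay delay differential inequality named in the abstract is a standard tool for deriving exponential decay estimates for delayed systems. A sketch of its classical form (the textbook statement, not the specific variant used in the paper) is:

```latex
% Classical Halanay delay differential inequality (standard form).
% Assumptions (not taken from the paper): constants a > b > 0, delay \tau > 0,
% and V a nonnegative function satisfying
\dot V(t) \le -a\,V(t) + b \sup_{t-\tau \le s \le t} V(s), \qquad t \ge t_0.
% Then V decays exponentially:
V(t) \le \Big( \sup_{t_0-\tau \le s \le t_0} V(s) \Big)\, e^{-\lambda (t - t_0)},
% where \lambda > 0 is the unique positive root of \lambda = a - b\, e^{\lambda\tau}.
```

In Lagrange-stability arguments of the kind described here, an inequality of this type is typically established for a Lyapunov function of the network state, which yields both the exponential convergence rate and an estimate of the globally exponentially attractive set.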
