Article

Global output convergence of a class of continuous-time recurrent neural networks with time-varying thresholds

Journal

IEEE Transactions on Circuits and Systems II: Express Briefs
Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TCSII.2004.824041

Keywords

global output convergence; Lipschitz continuity; Lyapunov diagonal semistability; neural networks; time-varying threshold

Abstract

This paper discusses the global output convergence of a class of continuous-time recurrent neural networks (RNNs) with globally Lipschitz continuous, monotone nondecreasing activation functions and locally Lipschitz continuous time-varying thresholds. We establish a sufficient condition that guarantees the global output convergence of this class of neural networks. The result does not require symmetry of the connection weight matrix. The convergence result is useful in the design of recurrent neural networks with time-varying thresholds.
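
The abstract does not reproduce the paper's equations. As an illustrative sketch only, the LaTeX below writes out the standard additive RNN model that convergence results of this kind typically concern; the matrices D and W, the activation g, and the threshold θ(t) are assumed notation, not necessarily the paper's own. The keyword list suggests the sufficient condition involves Lyapunov diagonal semistability of the weight matrix.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Assumed standard additive model; the paper's exact formulation may differ.
Consider the state $x(t)\in\mathbb{R}^{n}$ governed by
\begin{equation}
  \dot{x}(t) = -Dx(t) + W\,g\bigl(x(t)\bigr) + \theta(t),
  \qquad y(t) = g\bigl(x(t)\bigr),
\end{equation}
where $D$ is a positive diagonal matrix, $W$ is the connection weight
matrix (not assumed symmetric), each component of $g$ is globally
Lipschitz continuous and monotone nondecreasing, and the threshold
$\theta(t)$ is locally Lipschitz continuous in $t$. Global output
convergence means that $\lim_{t\to\infty} y(t)$ exists for every
initial state $x(0)$.
\end{document}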
