Journal
ADVANCES IN DIFFERENCE EQUATIONS
Volume 2019, Issue 1
Publisher
SPRINGER
DOI: 10.1186/s13662-019-2443-3
Keywords
Recurrent neural networks (RNNs); Exponential stability; Graph theory; Young's inequality; Discrete time-varying delays; Infinite distributed time-varying delays
Funding
- National Natural Science Foundation of China [11971076, 51839002]
- RUSA-Phase 2.0 Grant, Policy (TN Multi-Gen), Dept. of Edn. Govt. of India [F 24-51/2014-U]
- UGC-SAP (DRS-I) Grant [F.510/8/DRS-I/2016(SAP-I)]
- DST (FIST-level I) Grant [657876570, SR/FIST/MS-I/2018/17]
- Prince Sultan University through research group Nonlinear Analysis Methods in Applied Mathematics (NAMAM) [RG-DES-2017-01-17]
- Thailand Research Grant Fund [RSA6280004]
In this work, the exponential stability of impulsive recurrent neural networks is investigated, with discrete time-varying delays, continuously distributed time-varying delays, and stochastic noise taken into account simultaneously. Two distinct types of sufficient conditions guaranteeing exponential stability are derived on the basis of the Lyapunov functional and the coefficients of the given system. To construct a Lyapunov function for the large-scale system, a novel graph-theoretic approach is adopted that combines the Lyapunov functional method with graph theory; in this approach a global Lyapunov functional is constructed that reflects the topological structure of the given system. A numerical example and simulation figures are presented to show the effectiveness of the proposed work.
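As a rough intuition for the kind of stability the abstract describes, the following minimal sketch (not the paper's model: a single scalar neuron with one discrete delay, no impulses, noise, or distributed delays; all parameter values are illustrative) simulates x'(t) = -a*x(t) + b*tanh(x(t - tau)) and shows the trajectory decaying toward the origin when the self-decay rate a dominates the delayed feedback gain |b|:

```python
import numpy as np

# Illustrative sketch, not the paper's system: a scalar delayed recurrent
# neuron x'(t) = -a*x(t) + b*tanh(x(t - tau)), integrated by forward Euler.
# Since tanh is 1-Lipschitz, a > |b| is a simple sufficient condition for
# exponential decay of solutions.
def simulate(a=2.0, b=0.5, tau=1.0, x0=1.0, T=20.0, dt=0.01):
    n_delay = int(tau / dt)            # delay measured in time steps
    xs = [x0] * (n_delay + 1)          # constant initial history on [-tau, 0]
    for _ in range(int(T / dt)):
        x = xs[-1]                     # current state x(t)
        x_del = xs[-1 - n_delay]       # delayed state x(t - tau)
        xs.append(x + dt * (-a * x + b * np.tanh(x_del)))
    return np.array(xs)

traj = simulate()
print(abs(traj[-1]))   # close to zero: the trajectory has decayed
```

The paper's graph-theoretic construction handles coupled large-scale systems with impulses and stochastic noise; this scalar example only illustrates the delayed-decay mechanism underlying the stability conditions.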