Journal
NEURAL PROCESSING LETTERS
Volume 42, Issue 3, Pages 763-784
Publisher
SPRINGER
DOI: 10.1007/s11063-014-9397-y
Keywords
Finite-time stability; Convergence time; Conservative; Sylvester equation; Recurrent neural network
Funding
- National Natural Science Foundation of China [61374028, 61273183, 51177088]
- Natural Science Foundation of Hubei Province [2013CFA050]
- Graduate Scientific Research Foundation of China Three Gorges University [2014PY069]
This paper investigates finite-time stability and its application to solving the time-varying Sylvester equation with a recurrent neural network. First, a new finite-time stability criterion is given, and a less conservative upper bound on the convergence time is derived. Second, a sign-bi-power activation function with a linear term is presented for the recurrent neural network, which makes the estimated upper bound on the convergence time even less conservative. Third, a tunable activation function with three tunable positive parameters is proposed for the recurrent neural network. These parameters not only help to reduce the conservatism of the upper bound on the convergence time and accelerate convergence, but also reduce sensitivity to additive noise. The effectiveness of our methods is shown by both theoretical analysis and numerical simulations.
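The abstract's recurrent-neural-network design can be illustrated with a minimal numerical sketch. The following is not the paper's exact model: it assumes a constant-coefficient Sylvester equation A X + X B = C, a Zhang-type error dynamics dE/dt = -gamma * Phi(E), and a sign-bi-power activation with a linear term phi(e) = |e|^r sign(e) + |e|^(1/r) sign(e) + e; the parameter values (gamma, r, step size) and the helper name `sbp` are illustrative assumptions, and the implicit update for dX/dt is resolved with SciPy's Sylvester solver.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def sbp(E, r=0.5):
    """Sign-bi-power activation with a linear term, applied elementwise.
    Assumed form: |e|^r sign(e) + |e|^(1/r) sign(e) + e, with 0 < r < 1."""
    return np.sign(E) * np.abs(E)**r + np.sign(E) * np.abs(E)**(1.0 / r) + E

# Constant-coefficient Sylvester equation A X + X B = C (simplified case;
# the paper treats time-varying coefficients).
A = np.array([[2.0, 0.0], [0.0, 3.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])
C = np.array([[1.0, 2.0], [3.0, 4.0]])

X = np.zeros((2, 2))          # initial state of the network
gamma, dt = 5.0, 1e-3         # design gain and Euler step (assumed values)

for _ in range(5000):
    E = A @ X + X @ B - C     # residual error of the Sylvester equation
    # Zhang-type design: choose Xdot so that A Xdot + Xdot B = -gamma * Phi(E),
    # i.e. the error evolves as Edot = -gamma * Phi(E).
    Xdot = solve_sylvester(A, B, -gamma * sbp(E))
    X += dt * Xdot            # forward-Euler integration of the network

residual = np.max(np.abs(A @ X + X @ B - C))
```

The linear term in `sbp` drives exponential decay when the error is large, while the |e|^r term dominates near zero and yields finite-time convergence of the continuous dynamics; the discrete simulation leaves only a small residual set by the Euler step size.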