Journal
ADVANCES IN DIFFERENCE EQUATIONS
Volume 2021, Issue 1
Publisher
SPRINGER
DOI: 10.1186/s13662-021-03415-8
Keywords
Clifford-valued neural network; Exponential stability; Lyapunov functional; Lagrange stability
Funding
- Rajamangala University of Technology Suvarnabhumi, Thailand
This paper considers Clifford-valued recurrent neural network (RNN) models, as a generalization of real-valued, complex-valued, and quaternion-valued neural network models, and investigates their global exponential stability in the Lagrange sense. To address the non-commutativity of Clifford-number multiplication, we decompose the original n-dimensional Clifford-valued RNN model into 2^m n real-valued models. On the basis of Lyapunov stability theory and some analytical techniques, several sufficient conditions are obtained for the considered Clifford-valued RNN models to achieve global exponential stability in the Lagrange sense. Two examples are presented to illustrate the applicability of the main results, along with a discussion of their implications.
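The decomposition idea in the abstract can be illustrated in a small special case. Quaternions form the Clifford algebra with m = 2 generators, so each quaternion-valued neuron splits into 2^m = 4 real components, and a quaternion weight acting by left multiplication becomes a 4x4 real matrix. The sketch below is illustrative only (the function names and the choice of quaternions rather than a general Clifford algebra are this example's, not the paper's) and checks that the real-valued representation reproduces the non-commutative product:

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def real_matrix(q):
    """Left multiplication by q as a 4x4 real matrix: real_matrix(q) @ p == q * p."""
    w, x, y, z = q
    return np.array([
        [w, -x, -y, -z],
        [x,  w, -z,  y],
        [y,  z,  w, -x],
        [z, -y,  x,  w],
    ])

a = np.array([1.0, 2.0, -0.5, 0.3])   # quaternion weight
b = np.array([0.4, -1.0, 2.0, 0.7])   # quaternion state
# The real-valued block acts on the 4 real components exactly as the
# quaternion weight acts on the quaternion state.
assert np.allclose(quat_mult(a, b), real_matrix(a) @ b)
```

Applied entrywise to an n-dimensional quaternion-valued weight matrix, this turns the model into a 4n-dimensional real-valued one, to which standard Lyapunov arguments apply; the paper carries out the analogous construction for general Clifford-valued models.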