Article

Exponential stability in the Lagrange sense for Clifford-valued recurrent neural networks with time delays

Journal

Advances in Difference Equations
Volume 2021, Issue 1

Publisher

Springer
DOI: 10.1186/s13662-021-03415-8

Keywords

Clifford-valued neural network; Exponential stability; Lyapunov functional; Lagrange stability

Funding

  1. Rajamangala University of Technology Suvarnabhumi, Thailand


This paper considers Clifford-valued recurrent neural network (RNN) models, as a generalization of real-valued, complex-valued, and quaternion-valued neural network models, and investigates their global exponential stability in the Lagrange sense. To address the non-commutativity of Clifford-number multiplication, the original n-dimensional Clifford-valued RNN model is decomposed into 2^m n real-valued models. On the basis of Lyapunov stability theory and some analytical techniques, several sufficient conditions are obtained under which the considered Clifford-valued RNN models achieve global exponential stability in the Lagrange sense. Two examples are presented to illustrate the applicability of the main results, together with a discussion of their implications.
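The decomposition step mentioned in the abstract can be illustrated with a minimal sketch (not taken from the paper; sizes m and n and the coefficient layout are illustrative assumptions). A Clifford algebra generated by m elements has 2^m real basis coefficients per number, so an n-dimensional Clifford-valued state is equivalent to a real-valued state of dimension 2^m * n:

```python
import numpy as np

# Illustrative sketch only: each entry of an n-dimensional Clifford-valued
# state is represented by its 2**m real basis coefficients, so the whole
# state flattens into a single real vector of length 2**m * n.

m, n = 2, 3                 # assumed sizes for illustration
dim = 2 ** m                # real coefficients per Clifford number

rng = np.random.default_rng(0)
clifford_state = rng.standard_normal((n, dim))   # n Clifford numbers, coefficient form

# Equivalent real-valued model's state vector (row-major flattening).
real_state = clifford_state.reshape(dim * n)
assert real_state.shape == (2 ** m * n,)
```

Stability conditions derived for the 2^m n real-valued models then transfer back to the original Clifford-valued model through this one-to-one correspondence.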

