Article

Global exponential convergence and stability of gradient-based neural network for online matrix inversion

Journal

APPLIED MATHEMATICS AND COMPUTATION
Volume 215, Issue 3, Pages 1301-1306

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.amc.2009.06.048

Keywords

Gradient-based neural network; Online matrix inversion; Lyapunov stability theory; Asymptotical convergence; Global exponential convergence

Funding

  1. National Natural Science Foundation of China [60643004]
  2. Science and Technology Office of Sun Yat-Sen University


Wang proposed a gradient-based neural network (GNN) for online matrix inversion, and global asymptotic convergence was shown for this network when applied to inverting nonsingular matrices. Going beyond the previously presented asymptotic convergence, this paper investigates more desirable properties of the gradient-based neural network, e.g., global exponential convergence for nonsingular matrix inversion and global stability even in the singular-matrix case. Illustrative simulation results further substantiate the theoretical analysis of the gradient-based neural network for online matrix inversion. (C) 2009 Elsevier Inc. All rights reserved.
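The GNN model referred to in the abstract is conventionally derived by applying gradient descent to the scalar energy E(X) = ||AX - I||_F^2 / 2, which yields the matrix-valued ODE dX/dt = -gamma * A^T (AX - I). The sketch below simulates these dynamics with a simple Euler discretization; the function name, step size, and iteration count are illustrative assumptions, not from the paper.

```python
import numpy as np

def gnn_invert(A, gamma=10.0, dt=1e-3, steps=20000):
    """Hypothetical sketch: simulate the GNN ODE dX/dt = -gamma * A.T @ (A @ X - I),
    i.e., gradient descent on E(X) = ||A X - I||_F^2 / 2, via forward-Euler steps."""
    n = A.shape[0]
    I = np.eye(n)
    X = np.zeros((n, n))                          # zero initial state (any X(0) works for nonsingular A)
    for _ in range(steps):
        X = X - dt * gamma * (A.T @ (A @ X - I))  # one Euler step of the GNN dynamics
    return X

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
X = gnn_invert(A)
```

For nonsingular A the residual ||AX - I||_F decays exponentially (the property the paper establishes for the continuous-time system), provided the Euler step gamma*dt is small relative to the largest eigenvalue of A^T A.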


