Article

From Zhang Neural Network to Newton Iteration for Matrix Inversion

Journal

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCSI.2008.2007065

Keywords

Activation function; initial state; matrix inversion; Newton iteration; recurrent neural network (RNN); step size

Funding

  1. National Science Foundation of China [60643004, 60775050]


Different from gradient-based neural networks, a special kind of recurrent neural network (RNN) was recently proposed by Zhang et al. for online matrix inversion. This RNN is designed from a matrix-valued error function rather than a scalar-valued one, and it is described by an implicit dynamics rather than an explicit dynamics. In this paper, we develop and investigate a discrete-time model of the Zhang neural network (abbreviated as ZNN for presentation convenience), which is described by a system of difference equations. Comparing it with Newton iteration for matrix inversion, we find that the discrete-time ZNN model incorporates Newton iteration as a special case. Exploiting this relation, we perform numerical comparisons of ZNN and Newton iteration for matrix inversion under different conditions. Different kinds of activation functions and different step-size values are examined for superior convergence and better stability of ZNN. Numerical examples demonstrate the efficacy of both ZNN and Newton iteration for online matrix inversion.
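The relation stated in the abstract can be illustrated with a short sketch. The discrete-time ZNN iteration for inverting a matrix A has the general form X_{k+1} = X_k - h * X_k * F(A X_k - I), where F is an elementwise activation function and h is the step size; with a linear activation F(E) = E and step size h = 1, this reduces to the classical Newton iteration X_{k+1} = 2 X_k - X_k A X_k. The function name `znn_inverse`, the test matrix, and the default parameters below are illustrative choices, not taken from the paper:

```python
import numpy as np

def znn_inverse(A, X0, steps=50, h=1.0, activation=np.tanh):
    """Discrete-time ZNN-style iteration for matrix inversion (sketch).

    Iterates X_{k+1} = X_k - h * X_k @ F(A @ X_k - I), where F is an
    elementwise activation function. With the linear activation
    F(E) = E and step size h = 1, this reduces to Newton iteration:
    X_{k+1} = 2*X_k - X_k @ A @ X_k.
    """
    n = A.shape[0]
    I = np.eye(n)
    X = X0.copy()
    for _ in range(steps):
        E = A @ X - I                     # matrix-valued error function
        X = X - h * X @ activation(E)     # ZNN update step
    return X

# Example: invert a well-conditioned 3x3 matrix.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
# Classical starting value guaranteeing Newton convergence:
X0 = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
# Linear activation and h = 1 recover Newton iteration.
X = znn_inverse(A, X0, steps=100, h=1.0, activation=lambda E: E)
print(np.linalg.norm(A @ X - np.eye(3)))  # residual near machine precision
```

The quadratic convergence of the Newton special case depends on a sufficiently good initial state X0; the paper's broader point is that other activation functions and step sizes trade off convergence speed against stability.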

Authors

