Article

A convergence-accelerated Zhang neural network and its solution application to Lyapunov equation

Journal

NEUROCOMPUTING
Volume 193, Pages 213-218

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2016.02.021

Keywords

Nonlinear activation function; Finite-time convergence; Lyapunov equation; Gradient neural network; Zhang neural network

Funding

  1. National Natural Science Foundation of China [61503152, 61563017, 61561022, 61363073, 61363033]
  2. Research Foundation of Education Bureau of Hunan Province, China [15B192, 15C1119]
  3. Research Foundation of Jishou University, China [2015SYJG034, JDLF2015013]

Abstract

The Lyapunov equation is widely encountered in scientific and engineering fields, and is used in particular in the control community to analyze the stability of a control system. In this paper, a convergence-accelerated Zhang neural network (CAZNN) is proposed and investigated for the online solution of the Lyapunov equation. Different from the conventional gradient neural network (GNN) and the original Zhang neural network (ZNN), the proposed CAZNN model adopts a sign-bi-power activation function and thus achieves better convergence performance than both. Furthermore, we prove that the CAZNN model converges to the theoretical solution of the Lyapunov equation within finite time, rather than merely exponentially with time. Simulation results also verify the effectiveness and superiority of the CAZNN model for solving the Lyapunov equation online, as compared with the GNN model and the ZNN model. (C) 2016 Elsevier B.V. All rights reserved.
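
For readers unfamiliar with ZNN-type solvers, the sketch below illustrates the kind of dynamics the abstract describes, assuming the Lyapunov equation in the form A^T X + X A + Q = 0 and a sign-bi-power activation of the form 0.5(|e|^r + |e|^(1/r))sgn(e). The matrices A and Q, the gain gamma, the exponent r, and the forward-Euler integration are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): ZNN-style dynamics with a
# sign-bi-power activation for the Lyapunov equation A^T X + X A + Q = 0.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def sign_bi_power(E, r=0.5):
    """Element-wise sign-bi-power activation: 0.5*(|e|^r + |e|^(1/r))*sign(e)."""
    return 0.5 * (np.abs(E) ** r + np.abs(E) ** (1.0 / r)) * np.sign(E)

def znn_lyapunov(A, Q, gamma=10.0, r=0.5, dt=1e-4, T=1.0):
    """Forward-Euler integration of the implicit ZNN dynamics
       A^T Xdot + Xdot A = -gamma * Phi(A^T X + X A + Q), starting from X(0) = 0.
       gamma, r, dt and T are illustrative parameter choices."""
    n = A.shape[0]
    I = np.eye(n)
    # Kronecker form: (I (x) A^T + A^T (x) I) vec(Xdot) = -gamma * vec(Phi(E))
    M = np.kron(I, A.T) + np.kron(A.T, I)
    X = np.zeros((n, n))
    for _ in range(int(T / dt)):
        E = A.T @ X + X @ A + Q                 # residual error E(t)
        rhs = -gamma * sign_bi_power(E, r).ravel(order="F")
        Xdot = np.linalg.solve(M, rhs).reshape((n, n), order="F")
        X += dt * Xdot
    return X

if __name__ == "__main__":
    A = np.array([[-2.0, 1.0], [0.0, -3.0]])    # Hurwitz, so a unique solution exists
    Q = np.eye(2)
    X_znn = znn_lyapunov(A, Q)
    X_ref = solve_continuous_lyapunov(A.T, -Q)  # theoretical solution for comparison
    print("error norm:", np.linalg.norm(X_znn - X_ref))
```

The Kronecker reformulation here is only a convenient way to solve the implicit dynamics for Xdot at each integration step; it is not part of the neural model itself. With a finite-time activation such as sign-bi-power, the residual A^T X + X A + Q is driven to zero in finite time, in contrast to the exponential decay obtained with a linear activation.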
