Article

A Recurrent Neural Network With Explicitly Definable Convergence Time for Solving Time-Variant Linear Matrix Equations

Journal

IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS
Volume 14, Issue 12, Pages 5289-5298

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TII.2018.2817203

Keywords

Activation function; fixed-time convergence; time-variant linear matrix equation; zeroing neural network

Abstract

Time-variant linear matrix equations (TVLMEs) are ubiquitous in engineering. To solve TVLMEs, various zeroing neural network (ZNN) models have been developed. These ZNNs globally converge to the solution of TVLMEs either in infinitely long time or in finite time. However, even for a finite-time convergent ZNN, the convergence time is implicit and depends closely on the initial condition of the problem, which may limit its applicability to time-critical applications in practice. To overcome this problem, this paper, for the first time, accelerates a ZNN to fixed-time convergence by means of a novel activation function. The convergence time of the proposed ZNN can be defined in advance as an explicit parameter. Theoretically, its fixed-time convergence and robustness properties are rigorously proved. Comparative numerical results substantiate the superior convergence and robustness of the fixed-time convergent ZNN for solving TVLMEs. Additionally, the fixed-time convergent ZNN is applied to the motion planning of a redundant robotic arm.
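
For context, the sketch below illustrates the generic ZNN design formula underlying this line of work: define the residual E(t) = A(t)X(t) - B(t) and impose dE/dt = -gamma * Phi(E(t)), which yields the implicit dynamics A(t) dX/dt = dB/dt - dA/dt X - gamma * Phi(E). This is a minimal illustration under stated assumptions, not the paper's model: the sign-power activation phi, the gain gamma, the forward-Euler integrator, and the example coefficient matrices are placeholders, and phi is not the novel fixed-time activation proposed in the paper.

```python
# Minimal sketch of a generic zeroing neural network (ZNN) for the
# time-variant linear matrix equation A(t) X(t) = B(t).
# Design formula:  dE/dt = -gamma * Phi(E),  E(t) = A(t) X(t) - B(t),
# which gives:     A(t) dX/dt = dB/dt - dA/dt X - gamma * Phi(E).
import numpy as np

def phi(E, p=0.5):
    """Placeholder elementwise sign-power activation (NOT the paper's
    fixed-time activation)."""
    return np.sign(E) * np.abs(E) ** p

def znn_step(X, t, dt, A, B, dA, dB, gamma=10.0):
    """One forward-Euler step of the ZNN dynamics (assumes A(t) invertible)."""
    At, Bt = A(t), B(t)
    E = At @ X - Bt                        # residual error E(t)
    rhs = dB(t) - dA(t) @ X - gamma * phi(E)
    dX = np.linalg.solve(At, rhs)          # solve A(t) dX/dt = rhs
    return X + dt * dX

# Illustrative time-variant coefficients (always nonsingular) and B(t) = I.
A  = lambda t: np.array([[2 + np.sin(t),  np.cos(t)],
                         [-np.cos(t),     2 + np.sin(t)]])
dA = lambda t: np.array([[np.cos(t), -np.sin(t)],
                         [np.sin(t),  np.cos(t)]])
B  = lambda t: np.eye(2)
dB = lambda t: np.zeros((2, 2))

X, dt = np.zeros((2, 2)), 1e-3
for k in range(5000):                      # integrate from t = 0 to t = 5
    X = znn_step(X, k * dt, dt, A, B, dA, dB)
print("residual norm at t = 5:", np.linalg.norm(A(5.0) @ X - B(5.0)))
```

With a sign-power or linear activation, the residual decays in finite or infinite time depending on the initial error; replacing phi with a suitably designed activation is what gives the paper's model its fixed-time, initial-condition-independent convergence bound.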
