Article

Weak and strong convergence analysis of Elman neural networks via weight decay regularization

Journal

OPTIMIZATION
Volume 72, Issue 9, Pages 2287-2309

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/02331934.2022.2057852

Keywords

Elman neural networks; L-2 regularization; gradient method; convergence


In this paper, we propose a novel variant of the gradient algorithm to improve the generalization performance of Elman neural networks (ENN). A weight decay term, also called L-2 regularization, is added to the error function; it effectively controls excessive weight growth and thereby prevents over-fitting. The main contribution of this work is a rigorous theoretical analysis of the proposed approach: both weak and strong convergence results are obtained. Comparison experiments on function approximation and on classification with real-world data have been performed to verify the theoretical results.
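The weight-decay idea described in the abstract can be sketched as follows. This is a minimal illustration under assumptions, not the paper's exact algorithm: the network sizes, the sample `x`, the target, the learning rate `lr`, and the penalty coefficient `lam` are all invented for the demo, and only the output weights are trained here.

```python
import numpy as np

# Minimal sketch of gradient descent with a weight-decay (L-2) penalty in
# the style the abstract describes. All names, sizes, and the learning
# rate / penalty values are illustrative assumptions. Error function for
# the output weights:
#   E(W_out) = 0.5 * ||y - t||^2 + 0.5 * lam * ||W_out||^2

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 3, 4, 1
W_in  = rng.normal(scale=0.5, size=(n_hid, n_in))   # input -> hidden
W_rec = rng.normal(scale=0.5, size=(n_hid, n_hid))  # context -> hidden (Elman recurrence)
W_out = rng.normal(scale=0.5, size=(n_out, n_hid))  # hidden -> output

x       = np.array([1.0, -0.5, 0.2])   # one input sample (assumed)
context = np.zeros(n_hid)              # initial Elman context units
target  = np.array([0.3])              # desired output (assumed)

# Elman forward pass: the hidden state mixes the current input with the
# previous context (hidden state fed back through W_rec).
h = np.tanh(W_in @ x + W_rec @ context)

lr, lam = 0.1, 0.01
losses = []
for _ in range(50):
    y = W_out @ h                       # linear output layer
    err = y - target
    # Gradient of E: data term plus the L-2 (weight-decay) term lam * W_out,
    # which shrinks the weights at every step and curbs excessive growth.
    W_out -= lr * (np.outer(err, h) + lam * W_out)
    losses.append(float(0.5 * err @ err + 0.5 * lam * np.sum(W_out**2)))

print(losses[0], losses[-1])  # the regularized error decreases
```

The shrinking term `lam * W_out` is exactly what keeps the weight sequence bounded, which is the starting point for the weak and strong convergence arguments the paper develops.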
