Article

Shallow and deep neural network training by water wave optimization

Journal

SWARM AND EVOLUTIONARY COMPUTATION
Volume 50, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.swevo.2019.100561

Keywords

Artificial neural networks (ANNs); Deep neural networks (DNNs); Evolutionary neural networks; Structure and parameter optimization; Water wave optimization (WWO)

Funding

  1. National Natural Science Foundation [61872123, 61473263]
  2. Zhejiang Provincial Natural Science Foundation [LY18F030023]

Abstract

It is well known that the performance of artificial neural networks (ANNs) depends heavily on their structure design and parameter selection, for which traditional training methods suffer from drawbacks such as long training times, over-fitting, and premature convergence. Evolutionary algorithms (EAs) provide an effective tool for ANN parameter optimization; however, simultaneously optimizing ANN structures and parameters remains a difficult problem. In this study, we adapt water wave optimization (WWO), a relatively new EA, to optimize both the parameters and structures of ANNs, including classical shallow ANNs and deep neural networks (DNNs). We use a variable-dimensional solution encoding to represent both the structure and the parameters of an ANN, and adapt the WWO propagation, refraction, and breaking operators to efficiently evolve these variable-dimensional solutions for the resulting complex network optimization problems. Computational experiments on a variety of benchmark datasets show that the WWO algorithm achieves highly competitive performance compared with popular gradient-based algorithms and other EAs.
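As a rough illustration of how a variable-dimensional encoding and the WWO operators might interact, below is a minimal Python sketch that evolves both the hidden-layer size and the weights of a one-hidden-layer network on a toy regression task. The per-neuron block encoding, the grow/shrink structural mutation, and all numeric constants are our own simplifying assumptions for illustration; the paper's actual operator adaptations are more elaborate.

```python
# Minimal sketch: WWO evolving both hidden-layer size and weights of a
# one-hidden-layer network. Encoding, mutation rule, and constants are
# illustrative assumptions, not the paper's actual operator definitions.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 4, 1                  # fixed input/output sizes (assumed)
BLOCK = N_IN + N_OUT                # weights contributed by one hidden neuron
BOUND = 2.0                         # weight search range [-BOUND, BOUND]
H_MAX, ALPHA, BETA = 6, 1.01, 0.01  # wave height, wavelength base, breaking range

def fitness(x, X, y):
    """MSE of the network decoded from flat vector x (hidden size = len(x)//BLOCK)."""
    blocks = x.reshape(-1, BLOCK)                  # one row per hidden neuron
    W1, W2 = blocks[:, :N_IN].T, blocks[:, N_IN:]  # (N_IN, n_hid), (n_hid, N_OUT)
    return float(np.mean((np.tanh(X @ W1) @ W2 - y) ** 2))

def wwo(X, y, pop_size=10, iters=200):
    waves = [rng.uniform(-BOUND, BOUND, BLOCK * rng.integers(2, 8))
             for _ in range(pop_size)]
    fits = [fitness(w, X, y) for w in waves]
    lam, hgt = [0.5] * pop_size, [H_MAX] * pop_size
    b = int(np.argmin(fits))
    best, best_f = waves[b].copy(), fits[b]
    for _ in range(iters):
        fmin, fmax = min(fits), max(fits)
        for i in range(pop_size):
            # Wavelength update: fitter waves search with shorter wavelengths.
            lam[i] *= ALPHA ** (-(fmax - fits[i] + 1e-12) / (fmax - fmin + 1e-12))
            # Propagation: perturb every weight within the current wavelength.
            cand = waves[i] + rng.uniform(-1, 1, len(waves[i])) * lam[i] * 2 * BOUND
            cand = np.clip(cand, -BOUND, BOUND)
            # Structural move (our stand-in for variable-dimensional search):
            # occasionally append or drop one hidden-neuron block.
            if rng.random() < 0.1:
                if rng.random() < 0.5 and len(cand) > BLOCK:
                    cand = cand[:-BLOCK]
                else:
                    cand = np.concatenate([cand, rng.uniform(-BOUND, BOUND, BLOCK)])
            cf = fitness(cand, X, y)
            if cf < fits[i]:                       # improved: accept, reset height
                waves[i], fits[i], hgt[i] = cand, cf, H_MAX
                if cf < best_f:                    # breaking: local bursts near a new best
                    best, best_f = cand.copy(), cf
                    for _ in range(2):
                        s = cand + rng.normal(0, BETA * 2 * BOUND, len(cand))
                        sf = fitness(s, X, y)
                        if sf < best_f:
                            best, best_f = s.copy(), sf
                    waves[i], fits[i] = best.copy(), best_f
            else:
                hgt[i] -= 1
                if hgt[i] == 0:                    # refraction: regenerate toward the best
                    n = min(len(waves[i]), len(best))  # align differing dimensions
                    mu = (best[:n] + waves[i][:n]) / 2
                    sd = np.abs(best[:n] - waves[i][:n]) / 2 + 1e-12
                    waves[i] = np.clip(rng.normal(mu, sd), -BOUND, BOUND)
                    fits[i], hgt[i] = fitness(waves[i], X, y), H_MAX
    return best, best_f

# Toy usage: fit y = sin(x1 + ... + x4) on random inputs.
X = rng.uniform(-1, 1, (64, N_IN))
y = np.sin(X.sum(axis=1, keepdims=True))
sol, err = wwo(X, y)
print(f"hidden neurons: {len(sol) // BLOCK}, training MSE: {err:.4f}")
```

One design note on this sketch: the per-neuron block layout keeps the flat vector decodable at any length, so propagation, refraction, and breaking can act on waves of differing dimensionality without any bookkeeping beyond truncating to a common prefix.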
