Article

A comparison of evolution strategies and backpropagation for neural network training

Journal

NEUROCOMPUTING
Volume 42, Issue -, Pages 87-117

Publisher

ELSEVIER
DOI: 10.1016/S0925-2312(01)00596-3

Keywords

neural networks; evolution strategy; backpropagation; learning; gradient search; weight space

Abstract

This report investigates evolution strategies (ESs), a subclass of evolutionary algorithms, as an alternative to gradient-based neural network training. Based on an empirical comparison of population-based and gradient-based search, we derive hints for parameterization and draw conclusions about the usefulness of evolution strategies for this task. We show that ESs compete with gradient-based search only on small problems, but that they are well suited to training networks with non-differentiable activation functions, where backpropagation is not applicable. We also offer insights into how evolution strategies behave in the search spaces generated by neural networks. For this class of objective functions, the dimensionality of the problem is critical: as the number of decision variables grows, learning becomes increasingly difficult for ESs, and an efficient parameterization becomes crucial. (C) 2002 Elsevier Science B.V. All rights reserved.
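The abstract's central claim can be illustrated with a minimal sketch (not the paper's actual experimental setup): a tiny 2-2-1 network with a hard-threshold activation, which backpropagation cannot train because the activation has no useful gradient, is instead trained by a simple (1+5) evolution strategy that mutates the flat weight vector with Gaussian noise and keeps the best candidate. All sizes, the XOR task, and the mutation strength `sigma` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(x):
    # Hard-threshold activation: non-differentiable, so gradient-based
    # training is not applicable here (illustrative assumption).
    return (x > 0).astype(float)

def forward(w, X):
    # Unpack a flat 9-element weight vector into a toy 2-2-1 network.
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8:9]
    h = step(X @ W1 + b1)
    return step(h @ W2 + b2).ravel()

# XOR as a small stand-in task (assumption, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def loss(w):
    return float(np.mean((forward(w, X) - y) ** 2))

# (1+5)-ES: the parent survives; each generation produces 5 Gaussian
# mutants, and the parent is replaced only if a mutant is at least as good.
w = rng.normal(size=9)
loss0 = loss(w)
sigma = 0.5
for gen in range(500):
    candidates = [w + sigma * rng.normal(size=9) for _ in range(5)]
    best = min(candidates, key=loss)
    if loss(best) <= loss(w):
        w = best
    if loss(w) == 0.0:
        break

print(loss(w))
```

Because of plus-selection, the loss is non-increasing over generations, so even this crude parameterization makes progress on a 9-dimensional weight space; the abstract's caveat is that this stops being competitive as the number of decision variables grows.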

