Article

A comparison of evolution strategies and backpropagation for neural network training

Journal

NEUROCOMPUTING
Volume 42, Pages 87-117

Publisher

ELSEVIER
DOI: 10.1016/S0925-2312(01)00596-3

Keywords

neural networks; evolution strategy; backpropagation; learning; gradient search; weight space


This report investigates evolution strategies (ESs, a subclass of evolutionary algorithms) as an alternative to gradient-based neural network training. Based on an empirical comparison of population- and gradient-based search, we derive hints for parameterization and draw conclusions about the usefulness of evolution strategies for this purpose. We will see that ESs can compete with gradient-based search only on small problems, but that they are well suited to training neural networks with non-differentiable activation functions. Insights into how evolution strategies behave in the search spaces generated by neural networks are offered. Here, we see that for this class of objective functions the dimensionality of the problem is critical: as the number of decision variables grows, learning becomes increasingly difficult for ESs, and an efficient parameterization becomes crucial. (C) 2002 Elsevier Science B.V. All rights reserved.
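The abstract's central point, that an ES can train a network whose activation function has no useful gradient, can be illustrated with a minimal sketch. This is not the paper's experimental setup; it is a hypothetical (1+lambda)-ES with a simple, illustrative step-size rule, applied to a tiny 2-2-1 network with a non-differentiable step activation (for which backpropagation is inapplicable) on the XOR task:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, x):
    """Tiny 2-2-1 network with a step (threshold) activation.

    The step function has zero gradient almost everywhere, so
    backpropagation cannot be used, but an ES searches weight
    space directly and does not need derivatives.
    """
    W1 = w[:6].reshape(2, 3)   # hidden layer: 2 units, 2 inputs + bias
    W2 = w[6:].reshape(1, 3)   # output unit: 2 hidden inputs + bias
    h = np.where(W1 @ np.append(x, 1.0) > 0, 1.0, 0.0)
    return 1.0 if (W2 @ np.append(h, 1.0)) > 0 else 0.0

# XOR training set
X = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
Y = [0.0, 1.0, 1.0, 0.0]

def error(w):
    return sum((forward(w, np.array(x)) - y) ** 2 for x, y in zip(X, Y))

# (1+lambda)-ES: one parent, 20 offspring per generation, elitist
# selection, and a crude multiplicative step-size adaptation
# (an illustrative choice, not the paper's parameterization).
w = rng.normal(size=9)
sigma = 1.0
e0 = error(w)
for _ in range(300):
    offspring = [w + sigma * rng.normal(size=9) for _ in range(20)]
    best = min(offspring, key=error)
    if error(best) <= error(w):
        w = best
        sigma *= 1.2    # success: widen the search
    else:
        sigma *= 0.85   # failure: narrow the search
```

Because selection is elitist, the training error is monotonically non-increasing; the sketch also hints at the abstract's dimensionality remark, since the cost of mutating and evaluating offspring scales with the number of weights.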

