Journal
NEUROCOMPUTING
Volume 42, Pages 87-117
Publisher
ELSEVIER
DOI: 10.1016/S0925-2312(01)00596-3
Keywords
neural networks; evolution strategy; backpropagation; learning; gradient search; weight space
Abstract
This report investigates evolution strategies (ESs), a subclass of evolutionary algorithms, as an alternative to gradient-based neural network training. Based on an empirical comparison of population-based and gradient-based search, we derive guidelines for parameterization and draw conclusions about the usefulness of evolution strategies for this task. We show that ESs can compete with gradient-based search only on small problems, but that they are well suited to training neural networks with non-differentiable activation functions. We also offer insights into how evolution strategies behave in the search spaces generated by neural networks: for this class of objective functions, the dimensionality of the problem is critical. As the number of decision variables grows, learning becomes increasingly difficult for ESs, and an efficient parameterization becomes crucial. (C) 2002 Elsevier Science B.V. All rights reserved.
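The abstract's claim that ESs can train networks with non-differentiable activation functions can be illustrated with a minimal sketch: a tiny 2-2-1 network with a hard threshold activation (no usable gradient) trained on XOR by a simple elitist (mu + lambda) evolution strategy. All names, the network size, and the hyperparameters (mu=5, lambda=30, sigma=0.5) are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training data
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

def step(z):
    # Non-differentiable threshold activation: backpropagation cannot
    # be applied here, but an ES only needs fitness evaluations.
    return (z > 0.0).astype(float)

def forward(w, x):
    # Unpack a flat parameter vector into a 2-2-1 network (9 parameters).
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8]
    return step(step(x @ W1 + b1) @ W2 + b2).ravel()

def fitness(w):
    # Mean squared error; with binary outputs this equals the error rate.
    return float(np.mean((forward(w, X) - y) ** 2))

mu, lam, sigma, n = 5, 30, 0.5, 9  # illustrative hyperparameters
parents = [rng.normal(0.0, 1.0, n) for _ in range(mu)]

for gen in range(200):
    # (mu + lambda) selection: mutate randomly chosen parents, then keep
    # the best mu individuals from parents and offspring combined.
    offspring = [parents[rng.integers(mu)] + sigma * rng.normal(0.0, 1.0, n)
                 for _ in range(lam)]
    parents = sorted(parents + offspring, key=fitness)[:mu]
    if fitness(parents[0]) == 0.0:
        break

best = parents[0]
print("error rate:", fitness(best))
```

In line with the abstract, the same scheme scales poorly as the number of decision variables grows: with only isotropic mutation and a fixed sigma, larger weight vectors would need self-adaptive step-size control to remain efficient.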