Article

Meta learning evolutionary artificial neural networks

Journal

NEUROCOMPUTING
Volume 56, Pages 1-38

Publisher

ELSEVIER
DOI: 10.1016/S0925-2312(03)00369-2

Keywords

global optimization; local search; evolutionary algorithm; meta-learning


In this paper, we present the meta-learning evolutionary artificial neural network (MLEANN), an automatic computational framework for the adaptive optimization of artificial neural networks (ANNs), wherein the neural network architecture, activation function, connection weights, learning algorithm and its parameters are adapted according to the problem. We compared the performance of MLEANN against conventionally designed ANNs on function approximation problems, using three well-known chaotic time series as benchmarks. We also review state-of-the-art neural network learning algorithms and report experimental results on their convergence speed and generalization performance, evaluating the backpropagation, conjugate gradient, quasi-Newton and Levenberg-Marquardt algorithms on the three chaotic time series. The performance of each learning algorithm was evaluated as the activation functions and architecture were varied. We further present the theoretical background, algorithm and design strategy, and demonstrate how effective the proposed MLEANN framework is for designing a neural network that is smaller, faster and has better generalization performance. (C) 2003 Elsevier B.V. All rights reserved.
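To make the idea concrete, the following is a minimal, hedged sketch of the kind of evolutionary search over network hyperparameters (hidden-layer size, learning rate, activation function) that the abstract describes. It is NOT the authors' MLEANN algorithm: the genome layout, mutation operators, and especially the fitness function here are illustrative stand-ins (a real implementation would train a network and use its validation error as fitness), shown only to convey the global-search loop.

```python
import random

# Hedged sketch of evolving ANN hyperparameters, in the spirit of the
# paper's framework. The fitness function below is a STAND-IN: it
# pretends the "best" network has 16 hidden units, a learning rate
# near 1e-2, and tanh activation. In practice, fitness would be the
# validation error of a trained network.

random.seed(0)

ACTIVATIONS = ["tanh", "sigmoid", "relu"]  # candidate activation functions

def random_genome():
    """One candidate network configuration (the 'genome')."""
    return {
        "hidden": random.randint(2, 32),        # hidden-layer size
        "lr": 10 ** random.uniform(-4, -1),     # learning rate
        "act": random.choice(ACTIVATIONS),      # activation function
    }

def fitness(g):
    """Stand-in objective; lower is better."""
    return (abs(g["hidden"] - 16) / 16
            + abs(g["lr"] - 1e-2) * 10
            + (0.0 if g["act"] == "tanh" else 0.5))

def mutate(g):
    """Perturb one randomly chosen gene."""
    child = dict(g)
    key = random.choice(["hidden", "lr", "act"])
    if key == "hidden":
        child["hidden"] = max(2, g["hidden"] + random.choice([-2, -1, 1, 2]))
    elif key == "lr":
        child["lr"] = max(1e-5, g["lr"] * random.uniform(0.5, 2.0))
    else:
        child["act"] = random.choice(ACTIVATIONS)
    return child

def evolve(pop_size=20, generations=30):
    """Truncation-selection evolutionary loop over genomes."""
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]          # keep the better half
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return min(pop, key=fitness)

best = evolve()
print(best)
```

The key design point carried over from the paper is the separation of levels: the evolutionary loop performs a global search over the design space, while (in the full framework) each fitness evaluation would invoke a local learning algorithm to train the candidate network.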

