Article

Evolving neural networks through augmenting topologies

Journal

EVOLUTIONARY COMPUTATION
Volume 10, Issue 2, Pages 99-127

Publisher

MIT PRESS
DOI: 10.1162/106365602320169811

Keywords

genetic algorithms; neural networks; neuroevolution; network topologies; speciation; competing conventions

Abstract

An important question in neuroevolution is how to gain an advantage from evolving neural network topologies along with weights. We present a method, NeuroEvolution of Augmenting Topologies (NEAT), which outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task. We claim that the increased efficiency is due to (1) employing a principled method of crossover of different topologies, (2) protecting structural innovation using speciation, and (3) incrementally growing from minimal structure. We test this claim through a series of ablation studies that demonstrate that each component is necessary to the system as a whole and to each other. What results is significantly faster learning. NEAT is also an important contribution to GAs because it shows how it is possible for evolution to both optimize and complexify solutions simultaneously, offering the possibility of evolving increasingly complex solutions over generations, and strengthening the analogy with biological evolution.
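To make the speciation idea in the abstract concrete, the sketch below (Python, not taken from the paper) groups genomes into species using a compatibility distance built from excess genes, disjoint genes, and the average weight difference of matching genes, following the commonly cited NEAT formulation; the coefficient values, the threshold, and the `Gene` and `speciate` helpers are illustrative assumptions rather than the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class Gene:
    innovation: int   # historical marking; lines up genes across different topologies
    weight: float

def compatibility(genome_a, genome_b, c1=1.0, c2=1.0, c3=0.4):
    """Compatibility distance delta = c1*E/N + c2*D/N + c3*W_bar, where
    E = excess genes, D = disjoint genes, W_bar = mean weight difference
    of matching genes, and N = size of the larger genome."""
    a = {g.innovation: g for g in genome_a}
    b = {g.innovation: g for g in genome_b}
    max_a, max_b = max(a), max(b)
    matching = a.keys() & b.keys()
    # Excess genes lie beyond the other genome's highest innovation number;
    # disjoint genes fall inside that range but have no counterpart.
    excess = sum(i > max_b for i in a) + sum(i > max_a for i in b)
    disjoint = len(a.keys() ^ b.keys()) - excess
    w_bar = (sum(abs(a[i].weight - b[i].weight) for i in matching) / len(matching)
             if matching else 0.0)
    n = max(len(a), len(b))
    return c1 * excess / n + c2 * disjoint / n + c3 * w_bar

def speciate(population, threshold=3.0):
    """Assign each genome to the first species whose representative is
    within the compatibility threshold; otherwise start a new species."""
    species = []  # each species is a list; its first member acts as representative
    for genome in population:
        for members in species:
            if compatibility(genome, members[0]) < threshold:
                members.append(genome)
                break
        else:
            species.append([genome])
    return species

# Example: two small genomes sharing two historical markings end up in one species.
g1 = [Gene(1, 0.5), Gene(2, -0.3), Gene(4, 0.8)]
g2 = [Gene(1, 0.4), Gene(2, 0.1), Gene(3, -0.7), Gene(5, 0.2)]
print(compatibility(g1, g2))
print(len(speciate([g1, g2])))
```

Grouping genomes this way lets a newly added structure compete primarily within its own species, which is how the paper argues structural innovation is protected long enough for its weights to be optimized.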

Authors

Kenneth O. Stanley; Risto Miikkulainen
