Article

Fast-DENSER: Fast Deep Evolutionary Network Structured Representation

Journal

SoftwareX
Volume 14, Issue -, Pages -

Publisher

Elsevier
DOI: 10.1016/j.softx.2021.100694

Keywords

Artificial Neural Networks; Automated machine learning; NeuroEvolution

Funding

  1. FCT - Foundation for Science and Technology, I.P. [CISUC-UID/CEC/00326/2020, SFRH/BD/114865/2016]
  2. European Social Fund, through the Regional Operational Program Centro 2020
  3. Fundação para a Ciência e a Tecnologia [SFRH/BD/114865/2016] Funding Source: FCT

Abstract

This paper introduces a grammar-based, general-purpose framework for the automatic search and deployment of potentially Deep Artificial Neural Networks (DANNs). The approach, known as Fast Deep Evolutionary Network Structured Representation (Fast-DENSER), simultaneously optimises the topology, learning strategy, and any other required hyper-parameters (e.g., data pre-processing or augmentation). Fast-DENSER has been successfully applied to numerous object recognition tasks through the generation of Convolutional Neural Networks (CNNs). The code is developed and tested in Python 3 and made available as a library. A simple, easy-to-follow example is described for the automatic search of CNNs for the Fashion-MNIST benchmark. © 2021 The Authors. Published by Elsevier B.V.
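The abstract describes a grammar-based encoding in which candidate CNNs are represented as genotypes constrained by a grammar and then mapped to trainable networks. The sketch below is illustrative only: the grammar, layer names, and hyper-parameter ranges are assumptions made for this example and do not reflect the actual Fast-DENSER grammar, genotype, or API; it merely shows what a toy grammar-based genotype-to-phenotype mapping for CNN topologies can look like.

# Illustrative sketch only: a toy grammar-based genotype-to-phenotype mapping
# in the spirit of (but not taken from) Fast-DENSER. All names and ranges below
# are assumptions for demonstration purposes.
import random

# Toy grammar: each non-terminal expands to a layer type with hyper-parameter ranges.
GRAMMAR = {
    "features": [("conv", {"filters": (8, 64), "kernel": (2, 5)}),
                 ("pool", {"kernel": (2, 3)})],
    "classification": [("dense", {"units": (32, 256)})],
}

def random_layer(non_terminal, rng):
    """Sample one production of the grammar and instantiate its hyper-parameters."""
    layer_type, params = rng.choice(GRAMMAR[non_terminal])
    return {"type": layer_type,
            **{name: rng.randint(lo, hi) for name, (lo, hi) in params.items()}}

def random_individual(rng, n_features=3, n_classification=1):
    """Genotype: feature-extraction layers followed by classification layers."""
    return ([random_layer("features", rng) for _ in range(n_features)] +
            [random_layer("classification", rng) for _ in range(n_classification)])

def to_phenotype(individual):
    """Map the genotype to a human-readable network description (the phenotype)."""
    return " -> ".join(
        ",".join(f"{k}:{v}" for k, v in layer.items()) for layer in individual)

if __name__ == "__main__":
    rng = random.Random(42)
    candidate = random_individual(rng)
    print(to_phenotype(candidate))
    # In a full neuroevolution loop the phenotype would be compiled into a CNN
    # (e.g. with Keras), trained on Fashion-MNIST, and its validation accuracy
    # used as the fitness driving selection and mutation.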

