Proceedings Paper

Novelty Driven Evolutionary Neural Architecture Search

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/3520304.3528889

Keywords

Neural architecture search; supernet; novelty search; multi-objective optimization

Funding

  1. Ministry of Science and Technology of Taiwan [MOST 110-2628-E-A49-012-, MOST 110-2634-F-A49-006-, MOST 111-2420-H-369-001-]

Abstract

This paper proposes a method called NEvoNAS, which formulates the NAS problem as a multi-objective problem and avoids local optima by maintaining a diverse set of solutions and by using a supernet for architecture evaluation.
Evolutionary algorithm (EA) based neural architecture search (NAS) involves evaluating each architecture by training it from scratch, which is extremely time-consuming. This cost can be reduced by using a supernet to estimate the fitness of an architecture, since weights are shared among all architectures in the search space. However, the estimated fitness is very noisy due to the co-adaptation of the operations in the supernet, which causes NAS methods to get trapped in local optima. In this paper, we propose a method called NEvoNAS wherein the NAS problem is posed as a multi-objective problem with two objectives: (i) maximize architecture novelty, (ii) maximize architecture fitness/accuracy. Novelty search is used to maintain a diverse set of solutions at each generation, which helps avoid local optima, while the architecture fitness is calculated using the supernet. NSGA-II is used to find the Pareto-optimal front for the NAS problem, and the best architecture in the Pareto front is returned as the searched architecture. Experimentally, NEvoNAS gives better results on two different search spaces while using significantly fewer computational resources than previous EA-based methods. The code for our paper can be found here.
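The two ingredients of the formulation above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes architectures are encoded as flat integer vectors (a common simplification), measures novelty as the mean distance to the k nearest neighbours in an archive of previously seen encodings, and selects the non-dominated set under the two objectives (novelty, fitness), as NSGA-II's first front would.

```python
def novelty(candidate, archive, k=3):
    """Novelty = mean L1 distance to the k nearest neighbours in the
    archive. Encoding architectures as flat vectors is an assumption
    made for illustration."""
    if not archive:
        return float("inf")  # nothing seen yet: maximally novel
    dists = sorted(sum(abs(a - b) for a, b in zip(candidate, other))
                   for other in archive)
    return sum(dists[:k]) / min(k, len(dists))

def dominates(p, q):
    """p dominates q if p is >= q on every objective and > on at least one."""
    return (all(a >= b for a, b in zip(p, q))
            and any(a > b for a, b in zip(p, q)))

def pareto_front(population, objectives):
    """Return the non-dominated members, i.e. the first Pareto front
    over (novelty, fitness) pairs."""
    return [ind for i, ind in enumerate(population)
            if not any(dominates(objectives[j], objectives[i])
                       for j in range(len(population)) if j != i)]
```

In the full method, NSGA-II would additionally sort the remaining fronts and apply crowding-distance selection each generation; here only the first front is extracted to show how the two objectives interact.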
