4.7 Article

AS-NAS: Adaptive Scalable Neural Architecture Search With Reinforced Evolutionary Algorithm for Deep Learning

Journal

IEEE Transactions on Evolutionary Computation

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/TEVC.2021.3061466

Keywords

Deep learning; I-Ching divination evolutionary algorithm (IDEA); neural architecture search (NAS); reinforced operator controller; variable-architecture encoding

Funding

  1. National Key Research and Development Program of China [2019YFB1703600, 2019YFA0706200]
  2. National Natural Science Foundation of China [62076102, U1813203, U1801262, 62006081]
  3. National Natural Science Foundation of Guangdong for Distinguished Young Scholar [2020B1515020041]
  4. Science and Technology Major Project of Guangzhou [202007030006]
  5. Science and Technology Program of Guangzhou [202002030250]
  6. China Postdoctoral Science Foundation [2020M672630]
  7. Guangdong-Hong Kong-Macao Greater Bay Area Center for Brain Science and Brain-Inspired Intelligence Fund [2019016]

Abstract

AS-NAS is an adaptive, scalable NAS method that enhances search efficiency and makes the deep neural architecture scalable through a simplified RL algorithm and a variable-architecture encoding strategy.
Neural architecture search (NAS) is a challenging problem in the design of deep learning models due to its nonconvexity. To address this problem, an adaptive scalable NAS method (AS-NAS) is proposed based on the reinforced I-Ching divination evolutionary algorithm (IDEA) and a variable-architecture encoding strategy. First, unlike typical reinforcement learning (RL)-based and evolutionary algorithm (EA)-based NAS methods, a simplified RL algorithm is developed and used as a reinforced operator controller that adaptively selects efficient IDEA operators. Without complex actor-critic components, the reinforced IDEA based on simplified RL improves the search efficiency of the original EA at lower computational cost. Second, a variable-architecture encoding strategy is proposed to encode a neural architecture as a fixed-length binary string. By simultaneously considering variable layers, channels, and connections between different convolution layers, the deep neural architecture becomes scalable. Through the integration of the reinforced IDEA and the variable-architecture encoding strategy, the design of the deep neural architecture becomes adaptively scalable. Finally, the proposed AS-NAS is integrated with L1/2 regularization to increase the sparsity of the optimized neural architecture. Experiments and comparisons demonstrate the effectiveness and superiority of the proposed method.
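The "reinforced operator controller" described above uses a simplified RL rule, without actor-critic components, to choose among IDEA's evolutionary operators. The following is a minimal sketch of the idea, assuming a bandit-style epsilon-greedy policy with fitness improvement as the reward; the class name, policy, and update rule are illustrative assumptions, not the paper's exact formulation.

```python
import random

class OperatorController:
    """Bandit-style controller that keeps a running value estimate per
    evolutionary operator and selects operators epsilon-greedily.
    A hypothetical sketch of a 'simplified RL' operator controller."""

    def __init__(self, operators, epsilon=0.1, lr=0.1):
        self.operators = list(operators)           # e.g. IDEA variation operators
        self.values = {op: 0.0 for op in self.operators}
        self.epsilon = epsilon                     # exploration probability
        self.lr = lr                               # value-estimate step size

    def select(self):
        # Explore a random operator with probability epsilon; otherwise
        # exploit the operator with the highest running value estimate.
        if random.random() < self.epsilon:
            return random.choice(self.operators)
        return max(self.operators, key=self.values.get)

    def update(self, operator, reward):
        # Move the estimate toward the observed reward, e.g. the fitness
        # improvement of offspring produced by this operator.
        self.values[operator] += self.lr * (reward - self.values[operator])
```

In this setting the reward would come from the fitness change each operator produces, so operators that keep improving candidate architectures are selected more often as the search proceeds.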
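The variable-architecture encoding maps a fixed-length binary string to a network with a variable number of layers, channel widths, and inter-layer connections. Below is a hypothetical decoder under an assumed bit layout (per layer: one on/off bit, two channel-selection bits, and one skip-connection bit per preceding layer); the paper's actual layout may differ.

```python
def decode_architecture(bits, max_layers=6, channel_options=(16, 32, 64, 128)):
    """Decode a fixed-length binary genome into a variable architecture.
    Assumed layout: for layer i, bit 0 toggles the layer on/off, bits 1-2
    index the channel count, and the next i bits mark connections to
    earlier layers. For max_layers=6 the genome is 33 bits long."""
    arch, idx = [], 0
    for layer in range(max_layers):
        active = bits[idx]
        channels = channel_options[2 * bits[idx + 1] + bits[idx + 2]]
        idx += 3
        # One connection bit per preceding layer (skip connections).
        inputs = [j for j in range(layer) if bits[idx + j]]
        idx += layer
        if active:
            arch.append({"layer": layer, "channels": channels, "inputs": inputs})
    return arch
```

Because inactive layers still consume their share of bits, the string length stays fixed while the decoded depth, widths, and connectivity all vary, which is what lets standard fixed-length EA operators search over architectures of different sizes.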
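The L1/2 penalty sums the square roots of the absolute weights, which promotes sparser solutions than L1 regularization. A sketch of such a term, assuming a PyTorch model; `lam` and `eps` are illustrative hyperparameters, not values from the paper:

```python
import torch  # assumes model parameters are PyTorch tensors

def l_half_penalty(model, lam=1e-4, eps=1e-8):
    """L1/2 regularization term: lam * sum(sqrt(|w|)) over all weights.
    eps keeps the sqrt backward pass finite at exactly-zero weights."""
    return lam * sum((p.abs() + eps).sqrt().sum() for p in model.parameters())
```

The term is simply added to the task loss during training, e.g. `total_loss = task_loss + l_half_penalty(model)`, pushing small weights toward zero and thereby sparsifying the optimized architecture.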

