Article

Action Command Encoding for Surrogate-Assisted Neural Architecture Search

Journal

IEEE Transactions on Cognitive and Developmental Systems

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TCDS.2021.3107555

Keywords

computer architecture; encoding; topology; optimization; evolutionary computation; task analysis; reinforcement learning; encoding scheme; neural architecture search; optimizer; performance evaluator

Funding

  1. National Key Research and Development Program of China [2018AAA0100100]
  2. National Natural Science Foundation of China [61822301, 61876123, 61906001]
  3. Hong Kong Scholars Program [XJ2019035]
  4. Anhui Provincial Natural Science Foundation [1908085QF271]
  5. Research Grants Council of the Hong Kong Special Administrative Region, China [PolyU11202418, PolyU11209219]
  6. Royal Society International Exchanges Program [IEC\NSFC\170279]

Abstract

This article proposes a novel encoding scheme called ACEncoding for neural architecture search, and introduces a tailored performance evaluator called Seq2Rank. Experimental results show that ACEncoding and Seq2Rank can improve the performance of neural architecture search and achieve competitive results in image classification tasks.
With the development of neural architecture search, the performance of deep neural networks has been considerably enhanced with less human expertise. While existing work mainly focuses on the development of optimizers, the design of encoding schemes is still in its infancy. This article therefore proposes a novel encoding scheme for neural architecture search, termed action command encoding (ACEncoding). Inspired by the gene expression process, ACEncoding defines several action commands that indicate the addition and cloning of layers, connections, and local modules, so that an architecture grows from an empty network according to a sequence of action commands. ACEncoding provides a compact yet rich search space that can be explored efficiently by various optimizers. Furthermore, a surrogate-assisted performance evaluator is tailored to ACEncoding, termed sequence-to-rank (Seq2Rank). By integrating a Seq2Seq model with RankNet, Seq2Rank embeds the variable-length encodings of ACEncoding into a continuous space and then predicts the rankings of architectures from this continuous representation. In the experiments, ACEncoding improves neural architecture search in comparison with existing encoding schemes, and Seq2Rank achieves higher accuracy than existing performance evaluators. The neural architectures obtained by ACEncoding and Seq2Rank achieve competitive test errors and complexities on image classification tasks, and also show high transferability across different data sets.
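
The abstract describes ACEncoding only at a high level, so the sketch below illustrates the general idea of growing an architecture from an empty network by replaying a sequence of action commands. The command names (ADD_LAYER, ADD_CONN, CLONE_MODULE) and their exact semantics are assumptions made for illustration; the paper defines its own command set and decoding rules.

```python
# Toy illustration of the idea behind ACEncoding: an architecture is not a
# fixed-length vector but a sequence of action commands that grow a network
# from empty. The command names and fields below are illustrative assumptions,
# not the paper's actual command vocabulary.

def decode(commands):
    """Grow a (layers, connections) graph from a list of action commands."""
    layers, connections = [], set()
    for cmd, *args in commands:
        if cmd == "ADD_LAYER":           # e.g. ("ADD_LAYER", "conv3x3")
            layers.append(args[0])
        elif cmd == "ADD_CONN":          # e.g. ("ADD_CONN", 0, 1)
            src, dst = args
            if 0 <= src < len(layers) and 0 <= dst < len(layers):
                connections.add((src, dst))
        elif cmd == "CLONE_MODULE":      # e.g. ("CLONE_MODULE", 0, 1): copy layers 0..1
            start, end = args
            offset = len(layers)
            layers.extend(layers[start:end + 1])
            # duplicate the connections that lie inside the cloned module
            connections |= {(s - start + offset, d - start + offset)
                            for (s, d) in connections
                            if start <= s <= end and start <= d <= end}
    return layers, connections

if __name__ == "__main__":
    genome = [("ADD_LAYER", "conv3x3"), ("ADD_LAYER", "conv3x3"),
              ("ADD_CONN", 0, 1), ("CLONE_MODULE", 0, 1)]
    print(decode(genome))
    # layers: ['conv3x3', 'conv3x3', 'conv3x3', 'conv3x3'],
    # connections: {(0, 1), (2, 3)} (set order may vary)
```

Because the genome is just a command sequence, it stays compact and variable in length, which is exactly what a performance evaluator such as Seq2Rank has to accommodate.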
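
The description of Seq2Rank likewise suggests two ingredients: an encoder that maps a variable-length command sequence to a fixed-length embedding, and a RankNet-style pairwise loss over predicted scores. The following PyTorch sketch mirrors only that ranking idea; it is not the authors' implementation (which integrates a full Seq2Seq model), and the plain GRU encoder, layer sizes, and token vocabulary are placeholder assumptions.

```python
# Minimal sketch of a surrogate ranker: embed variable-length command
# sequences into fixed-length vectors, score them, and train the scorer
# with a RankNet-style pairwise loss. Hyperparameters are placeholders.
import torch
import torch.nn as nn

class RankSurrogate(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.score = nn.Sequential(nn.Linear(hidden_dim, 32), nn.ReLU(),
                                   nn.Linear(32, 1))

    def forward(self, tokens):               # tokens: (batch, seq_len) int ids
        _, h = self.encoder(self.embed(tokens))
        return self.score(h[-1]).squeeze(-1)  # one scalar score per sequence

def ranknet_loss(score_i, score_j, i_better_than_j):
    """RankNet pairwise loss: P(i ranks above j) = sigmoid(s_i - s_j)."""
    return nn.functional.binary_cross_entropy_with_logits(
        score_i - score_j, i_better_than_j.float())

if __name__ == "__main__":
    model = RankSurrogate(vocab_size=10)
    a = torch.randint(1, 10, (4, 7))     # four encoded architectures, length 7
    b = torch.randint(1, 10, (4, 5))     # another batch with a different length
    label = torch.tensor([1, 0, 1, 1])   # 1 if the architecture in `a` ranks higher
    loss = ranknet_loss(model(a), model(b), label)
    loss.backward()
    print(float(loss))
```

Training on pairwise comparisons rather than absolute accuracies is the usual motivation for a RankNet-style loss: the optimizer only needs to know which of two candidate architectures is better, not their exact validation accuracies.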

