4.2 Article

SP-DARTS: Synchronous Progressive Differentiable Neural Architecture Search for Image Classification

Journal

IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS
Volume E104D, Issue 8, Pages 1232-1238

Publisher

IEICE-INST ELECTRONICS INFORMATION COMMUNICATION ENGINEERS
DOI: 10.1587/transinf.2020BDP0009

Keywords

NAS; DARTS; skip operation; synchronous progressive; image classification

Funding

  1. National Key Research and Development Plan of China [2017YFB1400301]

Abstract

This paper proposes a synchronous progressive approach to solve the discontinuity problem of network depth and width, uses a 0-1 loss function to alleviate the discontinuity caused by discretizing the operations, and reduces the computational overhead.
Differentiable neural architecture search (DARTS) is a widely used weight-sharing neural architecture search method consisting of two stages: search and evaluation. However, the original DARTS suffers from some well-known shortcomings. First, the network width and depth, as well as the candidate operations, are discontinuous between the two stages, which causes a performance collapse. Second, DARTS incurs a high computational overhead. In this paper, we propose a synchronous progressive approach to solve the discontinuity problem of network depth and width, and we use a 0-1 loss function to alleviate the discontinuity caused by discretizing the operations. The computational overhead is reduced by using the partial channel connection. We also discuss the aggregation of skip operations during the DARTS search process and propose a solution to it. We conduct extensive experiments on the CIFAR-10 and WANFANG datasets; our approach reduces the search time significantly (from 1.5 to 0.1 GPU days) and improves image recognition accuracy.
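The abstract names several ingredients: a softmax relaxation over candidate operations (as in DARTS), a partial channel connection to cut the search cost, and a 0-1 loss that narrows the gap between the continuous search network and its discretized child. The PyTorch sketch below illustrates these pieces under stated assumptions; the class `PartialChannelMixedOp`, the ratio `k`, and the `zero_one_loss` formulation are illustrative guesses, not the authors' published implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PartialChannelMixedOp(nn.Module):
    """DARTS-style mixed edge with a partial channel connection (illustrative sketch).

    Only 1/k of the input channels pass through the weighted mixture of
    candidate operations; the rest bypass it and are concatenated back,
    which reduces memory use and search time.
    """
    def __init__(self, channels, candidate_ops, k=4):
        super().__init__()
        self.active = channels // k                 # channels routed through the candidate ops
        self.ops = nn.ModuleList(candidate_ops)     # each op must map `active` -> `active` channels
        # one architecture parameter (alpha) per candidate operation
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(candidate_ops)))

    def forward(self, x):
        x_active, x_bypass = x[:, :self.active], x[:, self.active:]
        weights = F.softmax(self.alpha, dim=0)      # continuous relaxation of the discrete choice
        mixed = sum(w * op(x_active) for w, op in zip(weights, self.ops))
        return torch.cat([mixed, x_bypass], dim=1)  # a channel shuffle would normally follow

def zero_one_loss(alpha):
    """Penalty that is zero for one-hot weights and maximal for uniform ones,
    pushing the relaxed architecture weights toward a discrete 0/1 selection."""
    w = F.softmax(alpha, dim=-1)
    return (w * (1.0 - w)).sum()
```

A mixed edge built this way can be dropped into the usual bilevel DARTS loop, with `zero_one_loss(edge.alpha)` added to the architecture objective; the weighting of that term and the schedule for growing depth and width synchronously are choices specified by the paper, not by this sketch.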
