Article

One-Shot Neural Architecture Search by Dynamically Pruning Supernet in Hierarchical Order

Journal

International Journal of Neural Systems

Publisher

WORLD SCIENTIFIC PUBL CO PTE LTD
DOI: 10.1142/S0129065721500295

Keywords

Neural architecture search; one-shot NAS; hierarchically-ordered pruning

Funding

1. National Natural Science Foundation of China [62025601, 61772353]

Neural Architecture Search (NAS) has attracted growing research interest in automatically designing neural architectures. One-shot NAS methods improve search efficiency by training a single supernet that synthesizes all candidate architectures, but the weight-sharing evaluation they rely on is not predictive enough. This paper proposes a Hierarchically-Ordered Pruning NAS algorithm that dynamically prunes the supernet along a proper pruning direction, achieving competitive results on CIFAR10 and ImageNet.
Neural Architecture Search (NAS), which aims at automatically designing neural architectures, has recently drawn growing research interest. Unlike conventional NAS methods, in which a large number of neural architectures need to be trained for evaluation, one-shot NAS methods only have to train one supernet that synthesizes all the possible candidate architectures. As a result, search efficiency can be significantly improved by sharing the supernet's weights during the evaluation of candidate architectures. Although this strategy greatly speeds up the search process, it suffers from the problem that evaluation based on shared weights is not sufficiently predictive. Recently, pruning the supernet during the search has been proven an efficient way to alleviate this problem. However, the pruning direction in a complex-structured search space remains unexplored. In this paper, we revisit the role of the path dropout strategy, which drops neural operations rather than neurons, in supernet training, and find several interesting characteristics of supernets trained with dropout. Based on these observations, a Hierarchically-Ordered Pruning Neural Architecture Search (HOPNAS) algorithm is proposed, which dynamically prunes the supernet along a proper pruning direction. Experimental results indicate that our method is competitive with state-of-the-art approaches on CIFAR10 and ImageNet.
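
The abstract's central mechanism, path dropout, can be made concrete with a short sketch. Below is a minimal PyTorch illustration (not the paper's actual code) of a supernet edge whose candidate operations are dropped as whole paths during training and can later be pruned away. The class name MixedEdge, the drop_prob parameter, and the fixed operation pool are illustrative assumptions; the hierarchical criterion HOPNAS uses to decide the order of prune() calls is not reproduced here.

import torch
import torch.nn as nn

class MixedEdge(nn.Module):
    """One supernet edge holding all candidate operations with shared weights.

    Hypothetical sketch: path dropout removes whole operations (paths),
    not individual neurons, during supernet training.
    """

    def __init__(self, channels: int, drop_prob: float = 0.3):
        super().__init__()
        self.drop_prob = drop_prob
        # A small, illustrative candidate-operation pool.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Identity(),
        ])
        self.alive = list(range(len(self.ops)))  # indices not yet pruned

    def prune(self, op_index: int) -> None:
        # Permanently remove one candidate path; in HOPNAS the order of
        # these removals would follow the hierarchically-ordered direction.
        self.alive.remove(op_index)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training and len(self.alive) > 1:
            # Path dropout: each surviving op is kept with prob 1 - drop_prob.
            keep = [i for i in self.alive
                    if torch.rand(()).item() >= self.drop_prob]
            if not keep:
                # Keep at least one path so the edge stays connected.
                keep = [self.alive[int(torch.randint(len(self.alive), ()))]]
        else:
            keep = self.alive
        # Average the surviving paths so activations keep a stable scale.
        return sum(self.ops[i](x) for i in keep) / len(keep)

A short usage example under the same assumptions:

edge = MixedEdge(channels=16)
x = torch.randn(2, 16, 32, 32)
edge.train()
y = edge(x)    # a random subset of candidate operations is active
edge.prune(1)  # drop one path, e.g. as dictated by the pruning order
edge.eval()
y = edge(x)    # all surviving operations are averaged

The sketch only shows the mechanism of dropping and pruning paths; the paper's contribution lies in how the supernet's dropout-trained behavior reveals a proper direction for ordering those pruning decisions.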
