Article

Quantum-inspired evolutionary algorithm applied to neural architecture search

Journal

APPLIED SOFT COMPUTING
Volume 120

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2022.108674

Keywords

Quantum-inspired algorithms; Neural architecture search; Deep learning; Convolutional networks

Funding

  1. CNPq
  2. FAPERJ
  3. CAPES

Abstract

This paper introduces the Q-NAS algorithm, which can automatically generate network architectures that outperform hand-designed models on CIFAR-10 and CIFAR-100. Compared to other neural architecture search methods, Q-NAS achieves a promising balance between performance, runtime efficiency, and automation.
The success of machine learning over the last few years is largely due to the significant progress of deep neural networks. These powerful and flexible models can even surpass human-level performance in tasks such as image recognition and strategy games, but experts must spend considerable time and resources designing their network structure. The demand for new architectures drives interest in automating this design process, and researchers have proposed new algorithms to address the neural architecture search (NAS) problem, including efforts to reduce its high computational cost. A common way to improve efficiency is to shrink the search space with expert knowledge, searching for cells rather than entire networks.

Motivated by the faster convergence of quantum-inspired evolutionary methods, the Q-NAS algorithm was proposed to address the NAS problem without relying on cell search. In this work, we consolidate Q-NAS by adding a new penalization feature, enhancing its retraining scheme, and investigating more challenging search spaces than before. On CIFAR-10, we reached 93.85% test accuracy in 67 GPU days after adding an early-stopping mechanism. We also applied Q-NAS to CIFAR-100 without modifying its parameters, and our best accuracy was 74.23%, comparable to ResNet164.

The enhancements and results presented in this work show that Q-NAS can automatically generate network architectures that outperform hand-designed models on CIFAR-10 and CIFAR-100. Compared to other NAS methods, Q-NAS also strikes a promising balance between performance, runtime efficiency, and automation. We believe these results enrich the discussion on this balance by considering alternatives to the cell search approach.
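To make the quantum-inspired evolutionary idea concrete, below is a minimal, self-contained Python sketch of such a search loop. It is not the authors' Q-NAS implementation: the layer choices, fixed depth, population size, update rate, and the placeholder fitness (standing in for accuracy minus a penalization term after partial training) are all illustrative assumptions. It only shows the core mechanism of sampling classical architectures from probabilistic genes and nudging those distributions toward the best individual found so far.

```python
# Toy sketch of a quantum-inspired evolutionary search over layer choices.
# All names and constants below (LAYER_CHOICES, evaluate, LEARNING_RATE, ...)
# are illustrative assumptions, not the actual Q-NAS implementation.
import random

LAYER_CHOICES = ["conv3x3", "conv5x5", "maxpool", "identity"]  # toy search space
NUM_LAYERS = 8          # fixed-depth chromosome, one gene per layer
POP_SIZE = 10
GENERATIONS = 30
LEARNING_RATE = 0.05    # how fast gene distributions shift toward the best individual

def evaluate(architecture):
    """Placeholder fitness: stands in for validation accuracy minus a size
    penalty after partial training; the real algorithm trains each candidate."""
    score = sum(1.0 for op in architecture if op.startswith("conv"))
    return score - 0.1 * architecture.count("conv5x5")  # crude penalization term

def sample(probabilities):
    """Collapse the probabilistic ('quantum') genes into one classical architecture."""
    return [random.choices(LAYER_CHOICES, weights=p)[0] for p in probabilities]

# Each gene starts as a uniform distribution over the layer choices.
probs = [[1.0 / len(LAYER_CHOICES)] * len(LAYER_CHOICES) for _ in range(NUM_LAYERS)]
best_arch, best_fit = None, float("-inf")

for gen in range(GENERATIONS):
    population = [sample(probs) for _ in range(POP_SIZE)]
    fitnesses = [evaluate(arch) for arch in population]
    gen_best = max(range(POP_SIZE), key=lambda i: fitnesses[i])
    if fitnesses[gen_best] > best_fit:
        best_arch, best_fit = population[gen_best], fitnesses[gen_best]
    # "Rotation"-like update: pull each gene's distribution toward the best architecture.
    for layer, op in enumerate(best_arch):
        idx = LAYER_CHOICES.index(op)
        for j in range(len(LAYER_CHOICES)):
            target = 1.0 if j == idx else 0.0
            probs[layer][j] += LEARNING_RATE * (target - probs[layer][j])
        total = sum(probs[layer])
        probs[layer] = [p / total for p in probs[layer]]

print("best architecture:", best_arch, "fitness:", round(best_fit, 3))
```

In this sketch the gene distributions play the role of the quantum individuals, and the per-generation update replaces the rotation-gate step; swapping the placeholder `evaluate` for partial training of the sampled network is what makes the real search expensive and motivates the early-stopping and penalization features discussed in the abstract.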
