Article

Hyperparameters optimization of convolutional neural network based on local autonomous competition harmony search algorithm

Journal

JOURNAL OF COMPUTATIONAL DESIGN AND ENGINEERING
Volume 10, Issue 4, Pages 1280-1297

Publisher

OXFORD UNIV PRESS
DOI: 10.1093/jcde/qwad050

Keywords

harmony search algorithm; convolutional neural network; optimization speed; hyperparameters optimization


This paper proposes a method to automatically optimize CNN hyperparameters based on the local autonomous competitive harmony search (LACHS) algorithm. By using a parameter dynamic adjustment strategy, an autonomous decision-making search strategy, and a local competition mechanism, it effectively improves the performance of the CNN and the efficiency of hyperparameter configuration. In addition, the feasibility of the LACHS algorithm for configuring CNN hyperparameters is verified through experiments on the Fashion-MNIST and CIFAR10 datasets and on an expression recognition task.
Because of its good performance, the convolutional neural network (CNN) has been extensively used in many fields, such as image, speech, and text processing. However, its performance is easily affected by hyperparameters, and how to configure hyperparameters effectively within a reasonable time has always been a difficult problem. To solve this problem, this paper proposes a method to automatically optimize CNN hyperparameters based on the local autonomous competitive harmony search (LACHS) algorithm. To avoid the influence of complicated parameter tuning on the performance of the LACHS algorithm, a parameter dynamic adjustment strategy is adopted, which makes the pitch adjustment probability PAR and the step factor BW adjust dynamically according to the actual search state. To strengthen the fine search of the neighborhood space and reduce the possibility of being trapped in local optima for a long time, an autonomous decision-making search strategy based on the optimal state is designed. To help the algorithm escape local fitting, a local competition mechanism is proposed that makes the new harmony compete with the worst harmony in a locally selected subset of the harmony memory. In addition, an evaluation function is proposed that integrates the training time and the recognition accuracy; to save computation cost without affecting the search result, it makes the training time of each model depend on the learning rate and batch size. To demonstrate the feasibility of the LACHS algorithm for configuring CNN hyperparameters, classification experiments are conducted on the Fashion-MNIST and CIFAR10 datasets, comparing CNNs with empirically configured hyperparameters against CNNs whose hyperparameters are automatically optimized by classical algorithms. The results show that the performance of the CNN optimized by the LACHS algorithm is effectively improved, so the algorithm has clear advantages in hyperparameter optimization. In addition, this paper applies the LACHS algorithm to expression recognition; experiments show that the CNN optimized by the LACHS algorithm outperforms artificially designed CNNs of the same type. Therefore, the method proposed in this paper is feasible in practical application.
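To make the harmony-search terminology in the abstract concrete, the sketch below shows a plain harmony search loop over two of the CNN hyperparameters mentioned (learning rate and batch size). The search space, the surrogate objective function, and all parameter values are illustrative assumptions, not the authors' LACHS implementation, which additionally adjusts PAR and BW dynamically, adds an autonomous decision-making search step, and restricts replacement to a local competition mechanism.

```python
import random

# Hypothetical search space for two of the hyperparameters named in the abstract.
SEARCH_SPACE = {
    "learning_rate": (1e-4, 1e-1),          # continuous range
    "batch_size": [16, 32, 64, 128, 256],   # discrete choices
}

def evaluate(harmony):
    """Placeholder objective: stands in for training a CNN with the given
    hyperparameters and returning validation accuracy (illustration only)."""
    lr, bs = harmony["learning_rate"], harmony["batch_size"]
    # Toy surrogate whose maximum lies near lr = 1e-2, bs = 64.
    return 1.0 - abs(lr - 1e-2) * 10 - abs(bs - 64) / 512

def random_harmony():
    lo, hi = SEARCH_SPACE["learning_rate"]
    return {
        "learning_rate": random.uniform(lo, hi),
        "batch_size": random.choice(SEARCH_SPACE["batch_size"]),
    }

def harmony_search(hms=10, iterations=100, hmcr=0.9, par=0.3, bw=0.01):
    # Initialize the harmony memory with random hyperparameter vectors.
    memory = [random_harmony() for _ in range(hms)]
    scores = [evaluate(h) for h in memory]
    lo, hi = SEARCH_SPACE["learning_rate"]
    for _ in range(iterations):
        new = {}
        if random.random() < hmcr:
            # Memory consideration: reuse a value stored in the harmony memory.
            new["learning_rate"] = random.choice(memory)["learning_rate"]
            if random.random() < par:
                # Pitch adjustment: perturb within bandwidth BW, clipped to range.
                new["learning_rate"] = min(hi, max(lo, new["learning_rate"]
                                                   + random.uniform(-bw, bw)))
        else:
            new["learning_rate"] = random.uniform(lo, hi)
        new["batch_size"] = (random.choice(memory)["batch_size"]
                             if random.random() < hmcr
                             else random.choice(SEARCH_SPACE["batch_size"]))
        # Replace the worst harmony if the new one scores better.
        # (The paper's local competition mechanism instead compares against
        # the worst harmony of a locally selected subset.)
        score = evaluate(new)
        worst = min(range(hms), key=lambda i: scores[i])
        if score > scores[worst]:
            memory[worst], scores[worst] = new, score
    best = max(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

if __name__ == "__main__":
    best_cfg, best_score = harmony_search()
    print(best_cfg, best_score)
```

In practice the placeholder `evaluate` would train the candidate CNN for a budgeted number of epochs and return its validation accuracy, which is where the paper's evaluation function ties training time to the learning rate and batch size.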

