Article

A novel adaptive learning deep belief network based on automatic growing and pruning algorithms

Journal

APPLIED SOFT COMPUTING
Volume 104

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2021.107248

Keywords

Deep learning; Deep belief network; Information entropy; Normal distribution; Adaptive learning

Funding

  1. National Natural Science foundation of China [62076110, 61673193, 62072216]
  2. Natural Science Foundation of Jiangsu Province, China [BK20181341]
  3. Postdoctoral Science Foundation of China [2017M621625]


This study proposes a novel adaptive learning deep belief network (ALDBN) that dynamically adjusts its structure during feature extraction via a series of growing and pruning algorithms. It also reveals the relationships between network depth, information entropy, and weight distribution, provides a theoretical proof of convergence, and compares performance with other methods on benchmark datasets. The results show that ALDBN outperforms its competitors in accuracy across all tests.
In this study, a novel adaptive learning deep belief network (ALDBN) with a series of growing and pruning algorithms is proposed to dynamically adjust its structure when ALDBN is used for feature extraction. Specifically, a neuron growing algorithm is designed that considers the individual and macroscopic impacts on each neuron to detect unstable hidden neurons; a new hidden neuron is then added around each unstable neuron to compensate for the inadequacy of the local structure for feature extraction. Moreover, the relations between network depth, information entropy, and the normal distribution of the weights between hidden layers are revealed. On the basis of these relations, a layer growing algorithm is designed that uses the obedience rate of the normal distribution to control the number of hidden layers. In addition, a neuron pruning algorithm based on the standard deviation of neuron activation probabilities is integrated into ALDBN to prune redundant neurons with low discriminative ability. We first give a theoretical proof of the convergence of ALDBN, which is crucial to its stability. A parameter sensitivity analysis is then provided to investigate the effects of two key parameters in ALDBN. Finally, we compare ALDBN with five state-of-the-art methods on three benchmark datasets; the comparative experimental results demonstrate that ALDBN outperforms the other five competitors in the accuracies of the common, cross-validated, and holdout tests. (c) 2021 Elsevier B.V. All rights reserved.
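The abstract names two concrete criteria without giving formulas: pruning neurons whose activation probabilities have low standard deviation, and growing layers based on how closely inter-layer weights follow a normal distribution. The paper's exact definitions are not reproduced here; the following is a minimal Python sketch, assuming a simple standard-deviation threshold for pruning and a Jarque-Bera-style statistic as a stand-in for the "obedience rate" of the normal distribution (both the threshold value and the choice of normality test are illustrative assumptions, not the authors' method):

```python
import numpy as np

def prune_low_variance_neurons(activation_probs, std_threshold=0.02):
    """Flag hidden neurons whose activation probabilities barely vary
    across samples; such neurons carry little discriminative information.

    activation_probs: (n_samples, n_hidden) array of activation probabilities.
    Returns a boolean mask of neurons to keep.
    NOTE: std_threshold is a hypothetical value, not from the paper."""
    stds = activation_probs.std(axis=0)
    return stds >= std_threshold

def weight_normality_rate(weights, jb_critical=5.99):
    """Fraction of hidden units whose incoming weight vector passes a
    Jarque-Bera-style normality check (5.99 is roughly the 5% critical
    value of chi^2 with 2 degrees of freedom).

    weights: (n_visible, n_hidden) weight matrix between two layers.
    NOTE: the paper's 'obedience rate' may be defined differently."""
    n = weights.shape[0]
    passed = 0
    for col in weights.T:
        centered = col - col.mean()
        m2 = np.mean(centered ** 2)
        m3 = np.mean(centered ** 3)
        m4 = np.mean(centered ** 4)
        skew = m3 / m2 ** 1.5
        excess_kurt = m4 / m2 ** 2 - 3.0
        jb = n / 6.0 * (skew ** 2 + excess_kurt ** 2 / 4.0)
        passed += jb < jb_critical
    return passed / weights.shape[1]
```

A layer growing rule in this spirit would stop adding hidden layers once the normality rate of the newest layer's weights falls below (or rises above) a chosen cutoff; the cutoff is tuned per dataset.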

