Article

Convolutional neural network pruning based on misclassification cost

Journal

JOURNAL OF SUPERCOMPUTING
Volume -, Issue -, Pages -

Publisher

SPRINGER
DOI: 10.1007/s11227-023-05487-7

Keywords

Convolutional neural networks; Pruning; Misclassification cost


In a convolutional neural network (CNN), overparameterization increases the risk of overfitting, slows inference, and impedes edge computing. One possible solution to these challenges is to prune CNN parameters. The essence of pruning is to identify and eliminate unimportant filters so as to yield the highest speedup with the lowest accuracy loss. In contrast with other pruning methods, and in conformity with real-world requirements, this paper does not treat overall accuracy as the sole measure of a CNN's performance but instead analyzes the different misclassification costs. This modification accelerates the pruning process and improves the pruning ratio. The proposed algorithm determines the expected specificity/sensitivity for each class and finds the smallest CNN consistent with them. Layer-wise relevance propagation is employed to measure the contribution of each filter to the discrimination of every class. The importance of each filter is determined by integrating its local usefulness (within its layer) and its global usefulness (contribution to the network output). Because the proposed algorithm alternates between pruning and recovery, further fine-tuning is unnecessary. According to simulation results, the proposed algorithm is efficient both at pruning a CNN and at attaining the desired sensitivity/specificity for each class.
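The abstract outlines the mechanics clearly enough to sketch the idea in code. Below is a minimal NumPy sketch of the two core ingredients: combining a filter's local (within-layer) and global (network-wide) relevance into a single importance score, and a prune-then-recover loop constrained by per-class sensitivity targets. Everything here is illustrative, not the paper's method: the relevance matrix stands in for a real layer-wise relevance propagation pass, mock_sensitivity is a stand-in for evaluating the pruned network, and the multiplicative combination of local and global scores is an assumption rather than the authors' exact formula.

```python
import numpy as np

# Hypothetical per-filter, per-class relevance scores, standing in for values
# obtained from a layer-wise relevance propagation (LRP) pass over a
# validation set. relevance[l] has shape (n_filters_in_layer_l, n_classes).
rng = np.random.default_rng(0)
relevance = [np.abs(rng.normal(size=(16, 3))), np.abs(rng.normal(size=(32, 3)))]

def filter_importance(relevance):
    """Integrate local (within-layer) and global (network-wide) usefulness.
    The multiplicative combination below is an illustrative choice."""
    scores = []
    global_total = sum(r.sum() for r in relevance)
    for r in relevance:
        per_filter = r.sum(axis=1)             # contribution across all classes
        local = per_filter / per_filter.sum()  # usefulness within its own layer
        glob = per_filter / global_total       # contribution to the network output
        scores.append(local * glob)
    return scores

def prune_with_recovery(relevance, sensitivity_fn, targets):
    """Tentatively remove filters in order of increasing importance; restore
    ("recover") a filter whenever any class sensitivity drops below target."""
    keep = [np.ones(r.shape[0], dtype=bool) for r in relevance]
    scores = filter_importance(relevance)
    order = sorted((s, l, f) for l, sc in enumerate(scores)
                   for f, s in enumerate(sc))
    for s, l, f in order:
        keep[l][f] = False                          # tentatively prune
        if np.any(sensitivity_fn(keep) < targets):  # constraint violated?
            keep[l][f] = True                       # recover the filter
    return keep

def mock_sensitivity(keep):
    """Stand-in evaluation: pretend per-class sensitivity degrades in
    proportion to the relevance removed by pruning."""
    kept = sum(relevance[l][keep[l]].sum(axis=0) for l in range(len(relevance)))
    total = sum(r.sum(axis=0) for r in relevance)
    return kept / total

targets = np.array([0.7, 0.8, 0.75])  # desired per-class sensitivity
keep = prune_with_recovery(relevance, mock_sensitivity, targets)
print([int(k.sum()) for k in keep], "filters kept per layer")
```

In a real setting, sensitivity_fn would run the pruned network on a validation set and report per-class sensitivity/specificity, which is what makes the recovery step meaningful.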


