Article

KRR-CNN: kernels redundancy reduction in convolutional neural networks

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 34, Issue 3, Pages 2443-2454

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-021-06540-3

Keywords

Convolutional neural networks; Convolution kernel; Binary optimization; Genetic algorithm; Image classification


KRR-CNN, a new optimization model for reducing kernel redundancy in CNNs, uses an evolutionary genetic algorithm to eliminate redundant kernels while improving classification performance.
Convolutional neural networks (CNNs) are a promising tool for solving real-world problems. However, successful CNNs often require a large number of parameters, which demands significant memory and computational cost and may produce undesirable phenomena, notably overfitting. Indeed, many kernels in a CNN are usually redundant and can be eliminated from the network while preserving its performance. In this work, we propose a new optimization model for kernel redundancy reduction in CNNs, named KRR-CNN. It consists of a minimization (training) phase and an optimization phase. In the first, a dataset is used to train a specific CNN, yielding a learned network with optimal parameters. These parameters are then combined with a decision optimization model to remove kernels that did not contribute to the learning task. The optimization phase is carried out by an evolutionary genetic algorithm. The efficiency of KRR-CNN has been demonstrated in several experiments: the suggested model reduces kernel redundancy and improves classification performance compared to state-of-the-art CNNs.
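The optimization phase described above searches over binary decisions (keep or prune each kernel) with a genetic algorithm. The sketch below is a minimal, illustrative GA over binary kernel masks; the fitness function is a toy surrogate invented for this example (the set of "useful" kernels and the cost weight are assumptions, not the paper's actual objective, which would evaluate the pruned CNN's accuracy):

```python
import random

# Toy sketch: GA searching for a binary mask over a layer's kernels.
# mask[i] == 1 keeps kernel i, mask[i] == 0 prunes it.
N_KERNELS = 16
USEFUL = {0, 2, 3, 7, 11}   # hypothetical kernels that matter (assumption)
POP_SIZE = 30
GENERATIONS = 60
MUTATION_RATE = 0.05

def fitness(mask):
    # Surrogate objective: reward keeping the useful kernels,
    # penalize the number of kernels kept (redundancy cost).
    kept_useful = sum(mask[i] for i in USEFUL) / len(USEFUL)
    cost = sum(mask) / N_KERNELS
    return kept_useful - 0.3 * cost

def crossover(a, b):
    # Single-point crossover of two parent masks.
    point = random.randrange(1, N_KERNELS)
    return a[:point] + b[point:]

def mutate(mask):
    # Flip each bit independently with small probability.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in mask]

def evolve(seed=0):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(N_KERNELS)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP_SIZE // 2]          # truncation selection (elitist)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In a real pipeline, `fitness` would retrain or evaluate the CNN with the pruned kernels, which is why the paper performs this search after the network has already been trained.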
