Article

Representation and compression of Residual Neural Networks through a multilayer network based approach

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 215

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2022.119391

Keywords

Residual Neural Networks; Convolutional Neural Networks; Complex networks; Multilayer networks; Compression algorithm; Convolutional layer pruning


In recent years, different types of Residual Neural Networks (ResNets, for short) have been introduced to improve the performance of deep Convolutional Neural Networks. To cope with the possible redundancy of the layer structure of ResNets, and to allow their use on devices with limited computational capabilities, several tools for exploring and compressing such networks have been proposed. In this paper, we provide a contribution in this setting. In particular, we propose an approach for the representation and compression of a ResNet based on the use of a multilayer network, a structure powerful enough to represent and manipulate a ResNet, as well as other families of deep neural networks. Our compression approach uses a multilayer network to represent a ResNet and to identify its possibly redundant convolutional layers. Once such layers are identified, it prunes them, along with some related ones, obtaining a new compressed ResNet. Experimental results demonstrate the suitability and effectiveness of the proposed approach.
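The pipeline described in the abstract (represent the network's convolutional layers, score redundancy between them, then prune the redundant ones) can be sketched minimally. This is a hedged illustration, not the authors' algorithm: the cosine-similarity redundancy criterion, the `find_redundant_layers` helper, and the 0.95 threshold are all assumptions introduced here for illustration only.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two flattened weight tensors."""
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_redundant_layers(weights, threshold=0.95):
    """Flag a conv layer as redundant when its weight tensor is nearly
    parallel to the preceding layer's (a hypothetical criterion, not the
    paper's multilayer-network-based one)."""
    redundant = []
    for i in range(1, len(weights)):
        if (weights[i].shape == weights[i - 1].shape
                and cosine_similarity(weights[i], weights[i - 1]) >= threshold):
            redundant.append(i)
    return redundant

def prune(layers, redundant):
    """Return a new layer list with the flagged indices removed."""
    drop = set(redundant)
    return [layer for i, layer in enumerate(layers) if i not in drop]
```

Usage on toy weights: two nearly parallel layers followed by a dissimilar one, so only the second layer is flagged and removed.

```python
w = [np.ones((2, 2)), 2 * np.ones((2, 2)), np.eye(2)]
idx = find_redundant_layers(w)   # -> [1]
compressed = prune(w, idx)       # 2 layers remain
```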

