Article

MixedNet: Network Design Strategies for Cost-Effective Quantized CNNs

Journal

IEEE ACCESS
Volume 9, Pages 117554-117564

Publisher

IEEE (Institute of Electrical and Electronics Engineers), Inc.
DOI: 10.1109/ACCESS.2021.3106658

Keywords

Quantization (signal); Convolution; Network architecture; Hardware; Degradation; Convolutional neural networks; System-on-chip; Deep neural network; Memory access number; Memory cost; On-chip memory size; Quantized neural networks


Abstract

This paper proposes design strategies for a low-cost quantized neural network. To prevent the classification accuracy from degrading under quantization, a structure-design strategy that uses a large number of channels rather than deep layers is proposed. In addition, a squeeze-and-excitation (SE) layer is adopted to enhance the performance of the quantized network. Through a quantitative analysis and simulations of the quantized key convolution layers of ResNet and MobileNets, a low-cost layer-design strategy for building a neural network is proposed. With this strategy, a low-cost network referred to as MixedNet is constructed. A 4-bit quantized MixedNet example achieves a 60% reduction in on-chip memory size and 53% fewer memory accesses, with negligible classification accuracy degradation compared with conventional networks, while achieving classification accuracy of approximately 73% on CIFAR-100 and 93% on CIFAR-10.
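The two core ideas in the abstract, low-bit weight quantization and SE channel recalibration, can be sketched in NumPy. This is an illustrative sketch only: the abstract does not specify the paper's exact quantization scheme, bit allocation, or SE placement, so the uniform symmetric quantizer and the standard squeeze-and-excitation formulation below are assumptions, with tensor shapes and the reduction ratio chosen arbitrarily for the example.

```python
import numpy as np

def quantize_uniform(x, num_bits=4):
    """Uniform symmetric quantization -- an illustrative sketch,
    not the paper's exact scheme."""
    qmax = 2 ** (num_bits - 1) - 1              # 7 for signed 4-bit codes
    scale = np.abs(x).max() / qmax              # map the largest weight to qmax
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)
    return q.astype(np.int8), scale             # integer codes + one fp scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

def se_recalibrate(feat, w_reduce, w_expand):
    """Squeeze-and-excitation channel recalibration for one (C, H, W) map."""
    z = feat.mean(axis=(1, 2))                  # squeeze: global average pool
    h = np.maximum(w_reduce @ z, 0.0)           # excitation: FC reduce + ReLU
    gate = 1.0 / (1.0 + np.exp(-(w_expand @ h)))  # FC expand + sigmoid, in (0, 1)
    return feat * gate[:, None, None]           # per-channel rescaling

# 4-bit quantization of a conv weight tensor (64 filters of shape 3x3x3, assumed)
rng = np.random.default_rng(0)
weights = rng.standard_normal((64, 3, 3, 3)).astype(np.float32)
q, s = quantize_uniform(weights, num_bits=4)
w_hat = dequantize(q, s)                        # reconstruction error <= s / 2

# SE gating on a 64-channel feature map (reduction ratio 4, assumed)
feat = rng.standard_normal((64, 8, 8)).astype(np.float32)
w_reduce = 0.1 * rng.standard_normal((16, 64))
w_expand = 0.1 * rng.standard_normal((64, 16))
out = se_recalibrate(feat, w_reduce, w_expand)
```

Storing 4-bit codes plus one scale per tensor is what drives the on-chip memory saving the abstract reports: each weight shrinks from 32 bits to 4, and the sigmoid gate only rescales channels, so it adds little memory of its own.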

