3.8 Proceedings Paper

Explicit Model Size Control and Relaxation via Smooth Regularization for Mixed-Precision Quantization

Journal

Computer Vision - ECCV 2022, Part XII
Volume 13672, Pages 1-16

Publisher

Springer International Publishing AG
DOI: 10.1007/978-3-031-19775-8_1

Keywords

Neural network quantization; Mixed-precision quantization; Regularization for quantization


Summary

While deep neural network quantization reduces computational and storage costs, it also leads to a drop in model accuracy. Using different quantization bit-widths for different layers is one way to mitigate this. This work introduces a technique for explicit complexity control of mixed-precision quantized DNNs that relies on smooth optimization and can be applied to any neural network architecture.

Abstract

While quantization of Deep Neural Networks (DNNs) leads to a significant reduction in computational and storage costs, it reduces model capacity and therefore usually causes an accuracy drop. One possible way to overcome this issue is to use different quantization bit-widths for different layers. The main challenge of this mixed-precision approach is choosing the bit-width for each layer while staying within memory and latency budgets. Motivated by this challenge, we introduce a novel technique for explicit complexity control of DNNs quantized to mixed precision, which uses smooth optimization on a surface containing neural networks of constant size. Furthermore, we introduce a family of smooth quantization regularizers that can be used jointly with our complexity control method for both post-training mixed-precision quantization and quantization-aware training. Our approach can be applied to any neural network architecture. Experiments show that the proposed techniques reach state-of-the-art results.
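
As a rough illustration of the kind of mechanism the abstract describes, the sketch below (assuming PyTorch) combines a smooth, periodic grid-attraction penalty on weights with a soft model-size constraint over continuously relaxed per-layer bit-widths. The sin^2 penalty, the helper names periodic_quant_penalty and model_size_penalty, and all hyperparameters are illustrative assumptions and not the exact regularizer family or complexity-control formulation used in the paper.

# Hypothetical sketch of a smooth quantization regularizer plus a soft
# model-size term over relaxed per-layer bit-widths. Illustrative only;
# not the paper's exact formulation.

import torch
import torch.nn as nn


def periodic_quant_penalty(w: torch.Tensor, bits: torch.Tensor, w_range: float = 1.0):
    # Smooth penalty that is zero exactly on the uniform b-bit grid over
    # [-w_range, w_range] and differentiable everywhere (assumed sin^2 form).
    levels = 2.0 ** bits - 1.0
    delta = 2.0 * w_range / levels          # grid step for the current bit-width
    return torch.sin(torch.pi * w / delta).pow(2).mean()


def model_size_penalty(bit_params, layer_sizes, target_bits: float = 4.0):
    # Soft constraint keeping the parameter-weighted average bit-width near a
    # target, so the total model size stays approximately constant.
    bits = torch.stack(list(bit_params))
    sizes = torch.tensor(layer_sizes, dtype=bits.dtype)
    avg_bits = (bits * sizes).sum() / sizes.sum()
    return (avg_bits - target_bits).pow(2)


if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy two-layer model with learnable (continuously relaxed) bit-widths.
    layers = [nn.Linear(16, 16), nn.Linear(16, 4)]
    bit_params = [nn.Parameter(torch.tensor(6.0)) for _ in layers]
    params = [p for l in layers for p in l.parameters()] + bit_params
    opt = torch.optim.Adam(params, lr=1e-2)

    x = torch.randn(64, 16)
    y = torch.randn(64, 4)
    for step in range(200):
        out = x
        for l in layers:
            out = l(out)
        task_loss = nn.functional.mse_loss(out, y)
        reg = sum(periodic_quant_penalty(l.weight, b)
                  for l, b in zip(layers, bit_params))
        size = model_size_penalty(bit_params,
                                  [l.weight.numel() for l in layers])
        loss = task_loss + 0.1 * reg + 1.0 * size   # weights chosen arbitrarily
        opt.zero_grad()
        loss.backward()
        opt.step()
    print([round(float(b), 2) for b in bit_params])

Note that the paper describes enforcing the size constraint explicitly, by optimizing on a surface of networks of constant size, whereas this sketch only approximates that behavior with a quadratic penalty on the average bit-width.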
