Article

Knowledge Distillation: A Survey

Journal

INTERNATIONAL JOURNAL OF COMPUTER VISION
Volume 129, Issue 6, Pages 1789-1819

Publisher

SPRINGER
DOI: 10.1007/s11263-021-01453-z

Keywords

Deep neural networks; Model compression; Knowledge distillation; Knowledge transfer; Teacher–student architecture

Funding

  1. Australian Research Council [FL-170100117, IH-180100002, IC-190100031]
  2. National Natural Science Foundation of China [61976107]

Abstract

In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its scalability to encode large-scale data and to maneuver billions of model parameters. However, it is a challenge to deploy these cumbersome deep models on devices with limited resources, e.g., mobile phones and embedded devices, not only because of their high computational complexity but also because of their large storage requirements. To this end, a variety of model compression and acceleration techniques have been developed. As a representative type of model compression and acceleration, knowledge distillation effectively learns a small student model from a large teacher model, and it has received rapidly increasing attention from the community. This paper provides a comprehensive survey of knowledge distillation from the perspectives of knowledge categories, training schemes, teacher-student architecture, distillation algorithms, performance comparison, and applications. Furthermore, challenges in knowledge distillation are briefly reviewed, and comments on future research are discussed.
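
As a concrete illustration of the teacher-student setup described above, the following is a minimal sketch of the classic soft-target distillation loss popularized by Hinton et al. (2015), assuming PyTorch; the function name distillation_loss and the hyperparameters T and alpha are illustrative choices, not an implementation taken from the survey.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: both distributions are softened by the temperature T,
    # and the KL term is rescaled by T*T so its gradients stay comparable
    # to the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Usage sketch: run a frozen teacher and a trainable student on the same
# batch, then optimize the student with the combined loss.
teacher_logits = torch.randn(8, 10)                      # outputs of a large, frozen teacher
student_logits = torch.randn(8, 10, requires_grad=True)  # outputs of the small student
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()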

