Article

Knowledge Distillation: A Survey

Journal

INTERNATIONAL JOURNAL OF COMPUTER VISION
Volume 129, Issue 6, Pages 1789-1819

Publisher

SPRINGER
DOI: 10.1007/s11263-021-01453-z

Keywords

Deep neural networks; Model compression; Knowledge distillation; Knowledge transfer; Teacher-student architecture

Funding

  1. Australian Research Council [FL-170100117, IH-180100002, IC-190100031]
  2. National Natural Science Foundation of China [61976107]

Abstract

In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its scalability to encode large-scale data and to maneuver billions of model parameters. However, it is a challenge to deploy these cumbersome deep models on devices with limited resources, e.g., mobile phones and embedded devices, not only because of their high computational complexity but also their large storage requirements. To this end, a variety of model compression and acceleration techniques have been developed. As a representative type of model compression and acceleration, knowledge distillation effectively learns a small student model from a large teacher model. It has received rapidly increasing attention from the community. This paper provides a comprehensive survey of knowledge distillation from the perspectives of knowledge categories, training schemes, teacher-student architectures, distillation algorithms, performance comparison, and applications. Furthermore, challenges in knowledge distillation are briefly reviewed, and directions for future research are discussed.
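
The teacher-student setup described in the abstract is commonly instantiated as the vanilla soft-target distillation loss of Hinton et al. (2015), which the survey takes as its starting point. Below is a minimal PyTorch sketch of that loss (temperature-scaled KL divergence between teacher and student outputs, blended with cross-entropy on the hard labels); the function name and the hyperparameters T and alpha are illustrative choices, not taken from the paper.

```python
# Minimal sketch of vanilla soft-target knowledge distillation.
# Hyperparameters T (temperature) and alpha (loss weight) are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with a KL term that pushes the
    student's temperature-softened outputs toward the teacher's."""
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    soft_student = F.log_softmax(student_logits / T, dim=1)
    # T^2 keeps the gradient magnitude of the soft term comparable as T grows.
    kd_term = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (T * T)
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Toy usage: random "teacher" and "student" logits over 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```

In practice the teacher is frozen during student training, and the knowledge categories, training schemes, and distillation algorithms surveyed in the paper generalize this basic response-based formulation.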

