Article

Teacher-student knowledge distillation based on decomposed deep feature representation for intelligent mobile applications

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 202

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2022.117474

Keywords

Knowledge distillation; Deep feature representation; Convolutional neural network; Lightweight classification; Mobile intelligence


This paper studies feature-based knowledge distillation and proposes a method that improves student-model performance and knowledge comprehension by decomposing the knowledge in the inner layers of a teacher model and distilling it into the inner layers of a student model.

According to recent studies on feature-based knowledge distillation (KD), a student model cannot properly imitate a teacher's behavior when there is a large mismatch in spatial shape between the teacher's inner layers and the student's. This paper hypothesizes that breaking down the knowledge in the feature maps of a teacher's inner layers and then distilling this knowledge into a student's inner layers can bridge the gap between an advanced teacher and a student. Besides improving the student's performance, this process can also help the student model comprehend the knowledge better. Hence, this paper embeds feature-based KD modules between a teacher model and a student model. In addition to extracting a tensor of feature maps from the teacher's inner layers, these modules are responsible for breaking down this high-dimensional tensor through higher-order singular value decomposition and then distilling the useful knowledge from the teacher's feature maps into the student. According to evaluations on two benchmark datasets, including paired t-tests, adding the tensor decomposition approach to the feature-based KD module played a major role in enhancing the performance of the student model, which produced competitive results compared with state-of-the-art methods.
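The decomposition step described above can be illustrated with a minimal NumPy sketch. This is a hedged, simplified illustration, not the paper's actual module: the function names (`unfold`, `hosvd_core`, `decomposed_feature_loss`), the chosen ranks, and the plain MSE between cores are all assumptions introduced here for demonstration; the paper's exact KD module design is not reproduced.

```python
import numpy as np

def unfold(tensor, mode):
    """Matricize a tensor along the given mode (axis)."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd_core(tensor, ranks):
    """Truncated higher-order SVD: for each mode, take the leading left
    singular vectors of that mode's unfolding, then contract the tensor
    with their transposes to obtain a small core tensor."""
    core = tensor
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        u_r = u[:, :r]  # leading r factors for this mode
        # mode-n product: contract `core` along `mode` with u_r^T
        core = np.moveaxis(np.tensordot(u_r.T, core, axes=(1, mode)), 0, mode)
    return core

def decomposed_feature_loss(teacher_maps, student_maps, ranks=(4, 4, 4)):
    """Hypothetical distillation loss: mean-squared error between the
    truncated HOSVD cores of teacher and student feature maps (C x H x W)."""
    core_t = hosvd_core(teacher_maps, ranks)
    core_s = hosvd_core(student_maps, ranks)
    return float(np.mean((core_t - core_s) ** 2))

rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 6, 6))   # toy feature-map tensor
student = rng.normal(size=(8, 6, 6))
print(decomposed_feature_loss(teacher, teacher))  # identical maps -> 0.0
print(decomposed_feature_loss(teacher, student))  # mismatched maps -> > 0
```

Note one simplification: comparing cores directly ignores the sign and basis ambiguity of the two independent decompositions, which a full implementation would need to handle (e.g. by projecting the student's maps with the teacher's factor matrices).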

