4.7 Article

Online knowledge distillation with elastic peer

Related references

Note: only a partial list of references is shown; see the original article for the complete list.
Article Computer Science, Artificial Intelligence

Improving knowledge distillation via an expressive teacher

Chao Tan et al.

Summary: Knowledge distillation is a network compression technique in which a teacher network guides a student network to mimic its behavior. This study investigates how to train a network to be a good teacher, proposing an inter-class correlation regularization. Experiments show the method performs well on image classification benchmarks. (A minimal sketch of the standard distillation objective these methods build on is given after this reference list.)

KNOWLEDGE-BASED SYSTEMS (2021)

Article Computer Science, Artificial Intelligence

Neural Compatibility Modeling With Probabilistic Knowledge Distillation

Xianjing Han et al.

IEEE TRANSACTIONS ON IMAGE PROCESSING (2020)

Proceedings Paper Computer Science, Information Systems

Neural Compatibility Modeling with Attentive Knowledge Distillation

Xuemeng Song et al.

ACM SIGIR CONFERENCE PROCEEDINGS (2018)
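
For readers unfamiliar with the mechanism these papers build on: in standard knowledge distillation, the student is trained to match the teacher's temperature-softened class distribution in addition to the ground-truth labels. Below is a minimal PyTorch sketch of that objective; the function name and the choices of temperature T and mixing weight alpha are illustrative assumptions, not the formulation of any specific paper listed above.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard knowledge-distillation loss (a sketch, not any cited paper's method).

    The student mimics the teacher's temperature-softened output
    distribution (KL term) while still fitting the ground-truth
    labels (cross-entropy term). T and alpha are illustrative values.
    """
    # KL divergence between temperature-softened distributions;
    # the T*T factor keeps gradient magnitudes comparable across T.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Ordinary cross-entropy against the hard labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```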