4.7 Article

Customizing a teacher for feature distillation

Related references

Note: only a subset of the references is listed; see the original article for the complete list.
Article (Computer Science, Artificial Intelligence)

Improving knowledge distillation via an expressive teacher

Chao Tan et al.

Summary: Knowledge distillation is a network-compression technique in which a teacher network guides a student network to mimic its behavior (a minimal loss sketch follows the reference list). This study explores how to train a good teacher, proposing an inter-class correlation regularization. Experiments show that the method performs well on image classification tasks.

KNOWLEDGE-BASED SYSTEMS (2021)

Article (Computer Science, Artificial Intelligence)

ImageNet Large Scale Visual Recognition Challenge

Olga Russakovsky et al.

INTERNATIONAL JOURNAL OF COMPUTER VISION (2015)

Article (Computer Science, Artificial Intelligence)

The PASCAL Visual Object Classes Challenge: A Retrospective

Mark Everingham et al.

INTERNATIONAL JOURNAL OF COMPUTER VISION (2015)
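
For orientation, the sketch below shows the standard soft-target distillation loss (Hinton et al.) that this line of work builds on: the student is trained to match the teacher's temperature-softened output distribution alongside the usual hard-label loss. The temperature `T`, weight `alpha`, and helper name `distillation_loss` are illustrative assumptions; this is not the inter-class correlation regularizer proposed by Tan et al.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target distillation loss (Hinton et al., 2015).

    Combines a KL term between temperature-softened teacher and student
    distributions with cross-entropy on the hard labels. T and alpha are
    typical hyperparameter choices, not values taken from the paper.
    """
    # Soften both distributions with temperature T; scale the KL term by
    # T^2 so its gradient magnitude stays comparable across temperatures.
    soft_kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard_ce = F.cross_entropy(student_logits, labels)
    return alpha * soft_kl + (1.0 - alpha) * hard_ce

# Example: one batch with a frozen teacher (random tensors stand in for
# real network outputs).
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

A higher temperature spreads probability mass over non-target classes, which is what exposes the inter-class similarity structure that teacher-training methods such as the inter-class correlation regularization aim to make more informative.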