Proceedings Paper

ROBUSTNESS AND DIVERSITY SEEKING DATA-FREE KNOWLEDGE DISTILLATION

Publisher

IEEE
DOI: 10.1109/ICASSP39728.2021.9414674

Keywords

knowledge distillation; data-free distillation; generative model; diversity; robustness

Funding

  1. National Natural Science Foundation of China [61775033]
  2. Chongqing Municipal Education Commission [KJQN201900647]

Abstract

The proposed method, RDSKD, enhances accuracy and robustness by generating samples with high authenticity, class diversity, and inter-sample diversity. Its crafted generator loss function mitigates the conflict between sample diversity and authenticity, yielding better performance than other data-free KD methods.
Knowledge distillation (KD) has enabled remarkable progress in model compression and knowledge transfer. However, KD requires a large volume of original data, or their representation statistics, that are usually unavailable in practice. Data-free KD has recently been proposed to resolve this problem, wherein the teacher and student models are fed by a synthetic sample generator trained from the teacher. Nonetheless, existing data-free KD methods rely on fine-tuning of weights to balance multiple losses, and they ignore the diversity of generated samples, resulting in limited accuracy and robustness. To overcome this challenge, we propose robustness and diversity seeking data-free KD (RDSKD) in this paper. The generator loss function is crafted to produce samples with high authenticity, class diversity, and inter-sample diversity. Without real data, the objectives of seeking high sample authenticity and high class diversity often conflict with each other, causing frequent loss fluctuations. We mitigate this by exponentially penalizing loss increments. With the MNIST, CIFAR-10, and SVHN datasets, our experiments show that RDSKD achieves higher accuracy and greater robustness across different hyperparameter settings than other data-free KD methods such as DAFL, MSKD, ZSKD, and DeepInversion.
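To make the three generator objectives and the increment penalty concrete, below is a minimal PyTorch sketch of such a loss. It is an illustration assuming common data-free KD formulations (per-sample prediction entropy for authenticity, entropy of the batch-mean prediction for class diversity, pairwise feature similarity for inter-sample diversity); the function name, the `alpha` weight, and the exact penalty form are hypothetical, not the paper's published equations.

```python
import torch
import torch.nn.functional as F

def rdskd_style_generator_loss(teacher_logits, features, prev_loss=None, alpha=1.0):
    """Illustrative generator loss: authenticity + class diversity
    + inter-sample diversity, with an exponential penalty on loss
    increments (names and weighting are assumptions, not the paper's)."""
    probs = F.softmax(teacher_logits, dim=1)                  # (B, C)

    # Authenticity: the teacher should be confident on generated
    # samples, i.e. each per-sample prediction entropy should be low.
    authenticity = -(probs * torch.log(probs + 1e-8)).sum(dim=1).mean()

    # Class diversity: the batch-averaged prediction should spread
    # over all classes; minimizing sum(p * log p) maximizes the
    # entropy of the mean distribution.
    mean_probs = probs.mean(dim=0)
    class_diversity = (mean_probs * torch.log(mean_probs + 1e-8)).sum()

    # Inter-sample diversity: discourage near-duplicate samples by
    # penalizing pairwise cosine similarity between their features.
    feats = F.normalize(features, dim=1)                      # (B, D)
    sim = feats @ feats.t()
    sim = sim - torch.eye(sim.size(0), device=sim.device)     # drop self-similarity
    inter_sample_penalty = sim.abs().mean()

    loss = authenticity + class_diversity + inter_sample_penalty

    # Robustness: exponentially penalize loss *increments* relative to
    # the previous iteration, damping the fluctuations caused by the
    # conflicting authenticity / diversity objectives.
    if prev_loss is not None:
        increment = torch.clamp(loss - prev_loss, min=0.0)
        loss = loss + alpha * (torch.exp(increment) - 1.0)

    return loss
```

In a training loop, `prev_loss` would be the detached loss value from the previous generator update, so the exponential term punishes only upward jumps in the loss rather than its magnitude.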

Reviews

Primary rating

3.8 (insufficient ratings)

Secondary ratings

Novelty: -
Significance: -
Scientific rigor: -