3.8 Proceedings Paper

Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data

Related References

Note: only a subset of the references is listed; see the original paper for the complete reference information.
Proceedings Paper Acoustics

Robustness and Diversity Seeking Data-Free Knowledge Distillation

Pengchao Han et al.

Summary: The proposed method, RDSKD, aims to enhance accuracy and robustness by generating samples with high authenticity, high class diversity, and high inter-sample diversity. Its generator loss is designed to mitigate the conflict between sample diversity and authenticity, improving performance over other data-free KD methods (an illustrative loss sketch follows this entry).

2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021) (2021)
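A minimal PyTorch sketch in the spirit of the summary above: a generator loss that rewards authenticity (confident teacher predictions), class diversity across the batch, and inter-sample diversity. The specific terms, weights, and regularizers here are illustrative assumptions, not the published RDSKD formulation.

```python
import torch
import torch.nn.functional as F

def generator_loss(teacher_logits, latents, samples,
                   w_auth=1.0, w_class=1.0, w_sample=1.0):
    """Illustrative data-free-KD generator loss; not the exact RDSKD objective."""
    probs = F.softmax(teacher_logits, dim=1)

    # Authenticity: push generated samples toward confident teacher predictions
    # by treating the teacher's argmax as a pseudo-label.
    pseudo_labels = probs.argmax(dim=1)
    auth = F.cross_entropy(teacher_logits, pseudo_labels)

    # Class diversity: maximize the entropy of the batch-averaged prediction
    # so the batch covers many classes (we minimize the negative entropy).
    mean_probs = probs.mean(dim=0)
    class_div = (mean_probs * torch.log(mean_probs + 1e-8)).sum()

    # Inter-sample diversity: penalize batches whose samples stay close
    # together while their latent codes are far apart (a common
    # mode-seeking regularizer; assumed here for illustration).
    z_dist = torch.cdist(latents, latents).mean()
    x_dist = torch.cdist(samples.flatten(1), samples.flatten(1)).mean()
    sample_div = z_dist / (x_dist + 1e-8)

    return w_auth * auth + w_class * class_div + w_sample * sample_div
```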

Proceedings Paper Computer Science, Artificial Intelligence

Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation

Gaurav Kumar Nayak et al.

Summary: Knowledge distillation is an effective method for transferring learning across deep neural networks. The study shows that distilling with arbitrary transfer data can be surprisingly effective, especially when the transfer set is class-balanced, which may inform the design of baselines for data-free knowledge distillation (a minimal distillation-step sketch follows this entry).

2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2021) (2021)
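A minimal PyTorch sketch of the setting studied above: the student is trained on an arbitrary, unlabeled transfer set by matching the teacher's temperature-softened outputs with a KL loss, so no labels from the original training data are needed. The function and parameter names are placeholders, and the temperature-scaled KL objective is the standard Hinton-style distillation loss, assumed here rather than taken from the paper.

```python
import torch
import torch.nn.functional as F

def distill_step(student, teacher, images, optimizer, T=4.0):
    """One distillation step on a batch of arbitrary, unlabeled transfer images."""
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(images)          # soft targets from the fixed teacher
    s_logits = student(images)

    # KL divergence between temperature-softened distributions, scaled by T^2.
    loss = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                    F.softmax(t_logits / T, dim=1),
                    reduction="batchmean") * (T * T)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```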

Article Multidisciplinary Sciences

Overcoming catastrophic forgetting in neural networks

James Kirkpatrick et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2017)

Proceedings Paper Computer Science, Artificial Intelligence

Quo Vadis, Action Recognition? A New Model and the Kinetics Dataset

Joao Carreira et al.

30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017) (2017)