Article

Exploration of block-wise dynamic sparseness

Related references

Note: only a subset of the references is listed here; download the original article for the complete reference information.
Article Computer Science, Artificial Intelligence

Improving knowledge distillation using unified ensembles of specialized teachers

Adamantios Zaras et al.

Summary: This study introduces a method for efficiently transferring knowledge to smaller models: multiple highly specialized teachers are combined into a unified, diversified ensemble, overcoming limitations of the standard distillation process and improving distillation efficiency. The proposed method improves distillation performance on three image datasets, outperforming strong state-of-the-art ensemble-based distillation methods.

PATTERN RECOGNITION LETTERS (2021)
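The summary above describes distilling from an ensemble of specialized teachers into a single student. A minimal sketch of the generic mechanism, not the authors' exact method: the teachers' temperature-softened predictions are averaged into one "unified" distribution, and the student is trained to match it via a temperature-scaled KL divergence (the standard knowledge-distillation loss). All function names here are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def ensemble_distillation_loss(student_logits, teacher_logits_list, temperature=4.0):
    """KL(unified_teacher || student), scaled by T^2 as in standard
    knowledge distillation. The unified teacher distribution is the
    mean of the specialized teachers' softened probabilities."""
    per_teacher = [softmax(t, temperature) for t in teacher_logits_list]
    n = len(per_teacher)
    teacher_probs = [sum(p[i] for p in per_teacher) / n
                     for i in range(len(student_logits))]
    student_probs = softmax(student_logits, temperature)
    kl = sum(p * (math.log(p + 1e-12) - math.log(q + 1e-12))
             for p, q in zip(teacher_probs, student_probs))
    return temperature ** 2 * kl
```

When the student already matches the unified teacher distribution the loss is zero; any disagreement yields a positive penalty that the student's optimizer would minimize alongside the usual cross-entropy on hard labels.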

Article Computer Science, Artificial Intelligence

Principal component analysis with tensor train subspace

Wenqi Wang et al.

PATTERN RECOGNITION LETTERS (2019)

Article Computer Science, Artificial Intelligence

BDNN: Binary convolution neural networks for fast object detection

Hanyu Peng et al.

PATTERN RECOGNITION LETTERS (2019)

Article Computer Science, Artificial Intelligence

Margin-based ordered aggregation for ensemble pruning

Li Guo et al.

PATTERN RECOGNITION LETTERS (2013)