Article

Towards Model Compression for Deep Learning Based Speech Enhancement

Journal

IEEE/ACM Transactions on Audio, Speech, and Language Processing

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TASLP.2021.3082282

Keywords

Speech enhancement; Tensors; Image coding; Quantization (signal); Training; Pipelines; Sensitivity analysis; Model compression; Sparse regularization; Pruning

Funding

  1. National Institute on Deafness and Other Communication Disorders (NIDCD) [R01 DC012048]
  2. Ohio Supercomputer Center

Abstract

The study proposes two compression pipelines that reduce the size of DNN-based speech enhancement models using sparse regularization, iterative pruning, and clustering-based quantization. Experimental results show that the approach substantially reduces model sizes while maintaining enhancement performance, and that it also performs well on speaker separation tasks.
The use of deep neural networks (DNNs) has dramatically elevated the performance of speech enhancement over the last decade. However, achieving strong enhancement performance typically requires a large DNN, which consumes substantial memory and computation, making it difficult to deploy such speech enhancement systems on devices with limited hardware resources or in applications with strict latency requirements. In this study, we propose two compression pipelines to reduce the model size for DNN-based speech enhancement, which incorporate three different techniques: sparse regularization, iterative pruning, and clustering-based quantization. We systematically investigate these techniques and evaluate the proposed compression pipelines. Experimental results demonstrate that our approach reduces the sizes of four different models by large margins without significantly sacrificing their enhancement performance. In addition, we find that the proposed approach performs well on speaker separation, which further demonstrates its effectiveness in compressing speech separation models.
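
The abstract names three compression techniques: sparse regularization, iterative pruning, and clustering-based quantization. As a rough illustration only, the following minimal NumPy sketch shows how magnitude-based iterative pruning and k-means weight clustering might operate on a single weight matrix. The function names, pruning schedule, and cluster count are hypothetical assumptions, not the authors' actual pipeline; sparse regularization (typically an L1-style penalty added to the training loss) acts during training and is only noted in a comment.

import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries until `sparsity` is reached."""
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    # The k-th smallest absolute value becomes the pruning threshold.
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def kmeans_quantize(weights, n_clusters, n_iter=20):
    """Replace each nonzero weight with its nearest k-means centroid.

    Pruned (zero) entries are excluded from clustering so they stay zero.
    Returns the quantized matrix and the centroid codebook.
    """
    nonzero = weights[weights != 0.0]
    centroids = np.linspace(nonzero.min(), nonzero.max(), n_clusters)
    for _ in range(n_iter):
        # Assignment step: nearest centroid for every nonzero weight.
        assign = np.argmin(np.abs(nonzero[:, None] - centroids[None, :]), axis=1)
        # Update step: move each centroid to the mean of its members.
        for c in range(n_clusters):
            members = nonzero[assign == c]
            if members.size:
                centroids[c] = members.mean()
    quantized = weights.copy()
    mask = quantized != 0.0
    idx = np.argmin(np.abs(quantized[mask][:, None] - centroids[None, :]), axis=1)
    quantized[mask] = centroids[idx]
    return quantized, centroids

# Toy walk-through on a random matrix standing in for one DNN layer.
# (Sparse regularization would act before this point, e.g. by adding an
# L1 penalty on the weights to the training loss to encourage sparsity.)
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64)).astype(np.float32)
for sparsity in (0.5, 0.7, 0.9):   # iterative pruning schedule (hypothetical)
    W = magnitude_prune(W, sparsity)
    # ... a real pipeline would retrain the model between pruning steps ...
W_q, codebook = kmeans_quantize(W, n_clusters=16)
print(f"nonzero: {np.count_nonzero(W_q)}/{W_q.size}, "
      f"distinct values: {np.unique(W_q).size}")

In schemes of this kind, the size reduction comes from storing the pruned matrix in a sparse format and storing each surviving weight as a small cluster index into the codebook rather than as a full-precision float.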

Authors

Ke Tan; DeLiang Wang
