Article

Cyclic Transfer Learning for Mandarin-English Code-Switching Speech Recognition

Journal

IEEE SIGNAL PROCESSING LETTERS
Volume 30, Pages 1387-1391

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LSP.2023.3307350

Keywords

Task analysis; Speech recognition; Training; Transfer learning; Speech coding; Decoding; Transformers; code-switching speech recognition; transfer learning; cyclic transfer learning


Transfer learning is a common method for improving a model's performance on a target task by pre-training it on pretext tasks. Unlike methods that use only monolingual corpora for pre-training, in this study we propose a Cyclic Transfer Learning method (CTL) that utilizes both code-switching (CS) and monolingual speech resources as pretext tasks. Moreover, the model in our approach is alternately trained among these tasks, which allows it to maintain CS features while transferring knowledge. Experimental results on the standard SEAME Mandarin-English CS corpus show that our proposed CTL approach achieves the best performance, with a Mixed Error Rate (MER) of 16.3% on test_man and 24.1% on test_sge. Compared to the baseline model pre-trained with monolingual data, our CTL method achieves 11.4% and 8.7% relative MER reduction on the test_man and test_sge sets, respectively. The CTL approach also outperforms other state-of-the-art methods.
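The abstract describes the training scheme only at a high level ("alternately learned among these tasks"). The sketch below is a hypothetical illustration of such a cyclic schedule, not the authors' implementation: the task names (`mandarin`, `english`, `code_switching`), the `train_step` stub, and the cycle count are all assumptions for illustration.

```python
def cyclic_schedule(pretext_tasks, n_cycles):
    """Yield (cycle_index, task) pairs, visiting each pretext task
    once per cycle in a fixed order."""
    for c in range(n_cycles):
        for task in pretext_tasks:
            yield c, task

def run_cyclic_transfer(train_step, pretext_tasks, target_task, n_cycles):
    """Alternately train on the pretext tasks for n_cycles, then
    fine-tune on the target task. train_step(task) stands in for one
    training pass over that task's corpus (assumed interface)."""
    history = []
    for _, task in cyclic_schedule(pretext_tasks, n_cycles):
        train_step(task)
        history.append(task)
    train_step(target_task)  # final fine-tuning pass on the target task
    history.append(target_task)
    return history

# Example: two cycles over monolingual and CS pretext tasks,
# followed by fine-tuning on the CS target task.
order = run_cyclic_transfer(
    lambda task: None,  # placeholder for an actual training pass
    ["mandarin", "english", "code_switching"],
    "code_switching",
    n_cycles=2,
)
```

The key contrast with plain sequential transfer learning is that the model revisits every pretext task each cycle instead of finishing one corpus before moving to the next, which is how the abstract says CS features are preserved during knowledge transfer.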

