4.6 Article

The Impact of Attention Mechanisms on Speech Emotion Recognition

Journal

SENSORS
Volume 21, Issue 22, Pages: -

Publisher

MDPI
DOI: 10.3390/s21227530

Keywords

artificial intelligence; speech emotion recognition; attention mechanism; neural networks

Funding

  1. National Natural Science Foundation of China [51905115, 61803109, 51775122]
  2. Science and Technology Planning Project of Guangzhou City [202102020880]


This paper discusses the applicable rules of Global-Attention and Self-Attention in constructing SER classifiers and proposes a new classifier model that achieves an accuracy of 85.427% on the EMO-DB dataset.
Speech emotion recognition (SER) plays an important role in real-time applications of human-machine interaction. The attention mechanism is widely used to improve the performance of SER; however, the rules governing when each type of attention is applicable have not been discussed in depth. This paper examines the difference between Global-Attention and Self-Attention and explores their applicable rules for constructing SER classifiers. The experimental results show that, for models built from a CNN and an LSTM, Global-Attention improves the accuracy of the sequential model, while Self-Attention improves the accuracy of the parallel model. Based on this finding, a classifier for SER (the CNN-LSTMx2+Global-Attention model) is proposed. The experimental results show that it achieves an accuracy of 85.427% on the EMO-DB dataset.
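As a concrete illustration of the proposed architecture, the sketch below shows one plausible way to assemble a CNN-LSTMx2+Global-Attention classifier in PyTorch. The layer sizes, the log-mel spectrogram input shape, the additive attention formulation, and the seven-class output (matching EMO-DB's emotion categories) are assumptions for illustration only; the abstract does not specify the exact configuration used in the paper.

```python
# Hypothetical sketch of a CNN-LSTMx2 + Global-Attention SER classifier.
# All layer sizes and the attention formulation are assumptions, not the
# paper's reported configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalAttention(nn.Module):
    """Additive global attention: score every time step against a learned
    query vector and return the attention-weighted sum of the sequence."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.query = nn.Parameter(torch.randn(hidden_dim))

    def forward(self, x):                                 # x: (batch, time, hidden)
        scores = torch.tanh(self.proj(x)) @ self.query    # (batch, time)
        weights = F.softmax(scores, dim=1).unsqueeze(-1)  # (batch, time, 1)
        return (weights * x).sum(dim=1)                   # (batch, hidden)


class CnnLstmGlobalAttention(nn.Module):
    """Sequential CNN -> 2x LSTM -> Global-Attention -> linear classifier."""

    def __init__(self, n_mels=40, n_classes=7, hidden_dim=128):
        super().__init__()
        self.cnn = nn.Sequential(            # local spectro-temporal features
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d((2, 1)),            # pool only the frequency axis
        )
        self.lstm = nn.LSTM(32 * (n_mels // 2), hidden_dim,
                            num_layers=2, batch_first=True)
        self.attention = GlobalAttention(hidden_dim)
        self.classifier = nn.Linear(hidden_dim, n_classes)

    def forward(self, spec):                 # spec: (batch, 1, n_mels, time)
        feats = self.cnn(spec)               # (batch, 32, n_mels/2, time)
        b, c, f, t = feats.shape
        feats = feats.permute(0, 3, 1, 2).reshape(b, t, c * f)
        seq, _ = self.lstm(feats)            # (batch, time, hidden)
        pooled = self.attention(seq)         # (batch, hidden)
        return self.classifier(pooled)       # emotion logits


# Example: a batch of 8 log-mel spectrograms, 40 mel bins x 300 frames.
logits = CnnLstmGlobalAttention()(torch.randn(8, 1, 40, 300))
print(logits.shape)                          # torch.Size([8, 7])
```

Swapping the GlobalAttention pooling for a self-attention layer applied across the LSTM outputs would give the Self-Attention variant contrasted in the paper; according to the abstract, that variant benefits the parallel CNN/LSTM topology rather than this sequential one.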
