Article

Examining Attention Mechanisms in Deep Learning Models for Sentiment Analysis

Journal

APPLIED SCIENCES-BASEL
Volume 11, Issue 9, Pages -

Publisher

MDPI
DOI: 10.3390/app11093883

Keywords

attention mechanism; deep neural networks; global-attention; hierarchical-attention; self-attention; sentiment analysis

Abstract

Attention-based methods for deep neural networks have attracted increased interest in recent years. Attention mechanisms can focus on the important parts of a sequence and, as a result, enhance the performance of neural networks in a variety of tasks, including sentiment analysis, emotion recognition, machine translation, and speech recognition. In this work, we study attention-based models built on recurrent neural networks (RNNs) and examine their performance in various contexts of sentiment analysis. Self-attention, global-attention, and hierarchical-attention methods are examined under various deep neural models, training methods, and hyperparameters. Although attention mechanisms are a powerful recent concept in deep learning, their exact effectiveness in sentiment analysis has yet to be thoroughly assessed. A comparative analysis is performed on a text sentiment classification task, where, in every experiment, baseline models are compared with and without attention. The experimental study additionally examines the proposed models' ability to recognize opinions and emotions in movie reviews. The results indicate that attention-based models substantially improve the performance of deep neural models, yielding up to a 3.5% improvement in accuracy.
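As a concrete illustration of the attention-over-RNN setup the abstract describes, the sketch below implements additive global-attention pooling over BiLSTM states for sentiment classification in PyTorch. It is a minimal reconstruction of the general technique, not the paper's actual architecture; the class name `AttentiveBiLSTM` and all layer sizes (`embed_dim=128`, `hidden_dim=64`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class AttentiveBiLSTM(nn.Module):
    """BiLSTM sentiment classifier with additive (global) attention pooling.

    A minimal sketch of the general attention-over-RNN technique; not the
    authors' exact model, and all sizes here are illustrative.
    """
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Scores one attention weight per time step over the RNN outputs.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, mask):
        # token_ids: (batch, seq_len); mask: (batch, seq_len), 1 = real token
        h, _ = self.lstm(self.embed(token_ids))        # (batch, seq_len, 2*hidden)
        scores = self.attn(torch.tanh(h)).squeeze(-1)  # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))  # ignore padding
        weights = torch.softmax(scores, dim=-1)        # attention distribution
        context = (weights.unsqueeze(-1) * h).sum(dim=1)  # weighted sum of states
        return self.classifier(context)

# Toy usage: a batch of 2 padded sequences over a 1000-word vocabulary.
model = AttentiveBiLSTM(vocab_size=1000)
ids = torch.randint(1, 1000, (2, 12))
mask = torch.ones(2, 12, dtype=torch.long)
logits = model(ids, mask)  # shape (2, 2): class logits per review
```

The softmax weights form a distribution over time steps, so the pooled context vector emphasizes the tokens the model finds most sentiment-bearing; comparing this against the same BiLSTM with plain last-state pooling is the kind of with/without-attention baseline comparison the study performs.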
