Article

Attentive convolutional gated recurrent network: a contextual model to sentiment analysis

Publisher

Springer Heidelberg
DOI: 10.1007/s13042-020-01135-1

Keywords

Sentiment analysis; Convolutional neural network; Recurrent neural network; Attention mechanism; Contextual features

Funding

  1. National Key Research and Development Program of China [2016YFB0800402, 2016QY01W0202]
  2. National Natural Science Foundation of China [U1836204, U1936108, 61433006, U1401258, 61502185]

Considering contextual features is a key issue in sentiment analysis. Existing approaches, including convolutional neural networks (CNNs) and recurrent neural networks (RNNs), lack the ability to account for and prioritize the informative contextual features that are necessary for better sentiment interpretation. CNNs offer limited capability because they must be very deep, which can lead to vanishing gradients, whereas RNNs fall short because they process input sequences strictly sequentially. Furthermore, both approaches treat all words equally. In this paper, we propose a novel approach named attentive convolutional gated recurrent network (ACGRN) that alleviates these issues for sentiment analysis. The motivation behind ACGRN is to avoid the vanishing gradients caused by deep CNNs by applying a shallow-and-wide CNN that learns local contextual features. Afterwards, to overcome the sequential structure of RNNs and to prioritize informative contextual information, we use a novel prior-knowledge attention based bidirectional gated recurrent unit (ATBiGRU). Prior-knowledge ATBiGRU captures global contextual features with a strong focus on the previous hidden states that carry the most valuable information for the current time step. Experimental results show that ACGRN significantly outperforms baseline models on six small and large real-world datasets for the sentiment classification task.
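The core idea of the ATBiGRU component, attending over previous hidden states to build a context vector for the current time step, can be illustrated with a minimal NumPy sketch. The dot-product scoring function, dimensions, and function names below are illustrative assumptions for exposition, not the paper's exact formulation:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend_previous(hidden, t):
    """Attention over the hidden states before step t.

    Hypothetical scoring: dot product between each previous hidden
    state and the current one; returns a weighted context vector.
    """
    h_t = hidden[t]                 # current hidden state
    prev = hidden[:t]               # all previous hidden states
    scores = prev @ h_t             # alignment scores, one per past step
    weights = softmax(scores)       # attention distribution (sums to 1)
    return weights @ prev           # context vector, same size as h_t

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))     # 5 time steps, hidden size 4
context = attend_previous(H, 4)     # attend over steps 0..3 at step 4
print(context.shape)                # (4,)
```

In the full model this context vector would be combined with the current GRU state in each direction of the BiGRU, so that informative earlier states are weighted more heavily than in a plain sequential recurrence.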
