Journal
NEUROCOMPUTING
Volume 308, Pages 49-57
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2018.04.045
Keywords
Sentiment classification; LSTM; Neural networks; Sentence vectors
Funding
- National Natural Science Foundation of China (NSFC) [61373165, 61672377, 61373035]
Recently, neural networks, and especially long short-term memory (LSTM) networks, have achieved great success on sentiment classification owing to their ability to handle sequences of varying lengths, and they are now widely used for this task. However, a remaining challenge in document-level sentiment classification is modeling long texts so as to exploit the semantic relations between sentences. Existing neural network models are not powerful enough to capture sufficient sentiment information across relatively long time spans. To address this problem, we propose a new neural network model (SR-LSTM) with two hidden layers. The first layer uses a long short-term memory network to learn sentence vectors that represent the semantics of individual sentences, and in the second layer the relations between sentences are encoded into the document representation. Furthermore, we propose an approach that improves this model by first cleaning the datasets, removing sentences with little emotional polarity so that the model receives a better input. The proposed models outperform state-of-the-art models on three publicly available document-level review datasets. (C) 2018 Elsevier B.V. All rights reserved.
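The two-layer design described in the abstract, a sentence-level LSTM whose outputs feed a document-level LSTM, can be sketched as below. This is a minimal illustrative sketch in NumPy, not the paper's actual SR-LSTM implementation; all function names, dimensions, and the random toy data are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_params(d_in, d_h):
    # Gate weights packed in the order [input, forget, output, candidate].
    return (rng.normal(0, 0.1, (4 * d_h, d_in)),   # W: input-to-hidden
            rng.normal(0, 0.1, (4 * d_h, d_h)),    # U: hidden-to-hidden
            np.zeros(4 * d_h))                     # b: gate biases

def lstm_last_hidden(xs, params):
    # Run an LSTM over the sequence xs of shape (T, d_in);
    # return the final hidden state as the sequence's vector.
    W, U, b = params
    d_h = U.shape[1]
    h, c = np.zeros(d_h), np.zeros(d_h)
    for x in xs:
        i, f, o, g = np.split(W @ x + U @ h + b, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # cell state update
        h = sigmoid(o) * np.tanh(c)                   # hidden state
    return h

def document_vector(document, sent_params, doc_params):
    # Layer 1: encode each sentence (a (T_words, d_word) array) into a vector.
    sent_vecs = np.stack([lstm_last_hidden(s, sent_params) for s in document])
    # Layer 2: encode the sequence of sentence vectors into a document vector,
    # capturing relations between sentences.
    return lstm_last_hidden(sent_vecs, doc_params)

# Toy document: 3 sentences of varying length, 8-dim word vectors.
d_word, d_sent, d_doc = 8, 16, 16
doc = [rng.normal(size=(t, d_word)) for t in (5, 7, 4)]
doc_vec = document_vector(doc,
                          lstm_params(d_word, d_sent),
                          lstm_params(d_sent, d_doc))
print(doc_vec.shape)  # (16,)
```

The resulting document vector would then be fed to a softmax classifier for the sentiment label; the pre-filtering step described in the abstract (dropping sentences with little emotional polarity) would simply shorten the sentence list before encoding.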