Article

Improving text classification with weighted word embeddings via a multi-channel TextCNN model

Journal

NEUROCOMPUTING
Volume 363, Pages 366-374

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2019.07.052

Keywords

Text classification; Term weighting; Word embedding; Convolutional neural network; Term frequency-inverse document frequency (TF-IDF)

Funding

  1. National Natural Science Foundation of China [11671317, 61877049]

Abstract

In recent years, convolutional neural networks (CNNs) have gained considerable attention in text classification because of the remarkably good performance they achieve in various settings. The usual practice is to first perform word embedding (i.e., to map each word to a word vector) and then employ a CNN for classification. Term weighting approaches have proven quite effective for improving classification accuracy, but to the best of our knowledge, almost all of these methods assign only one weight to each term (word). Considering that a term generally has different importance in documents with different class labels, we propose in this paper a novel term weighting scheme that is combined with word embeddings to enhance the classification performance of CNNs. In the proposed method, multiple weights are assigned to each term, and each weight is applied separately to the term's word embedding. The transformed features are then fed into a multi-channel CNN model to predict the label of the sentence. Comparisons with several baseline methods on five benchmark data sets show that the proposed method exceeds the classification accuracy of the other methods by a clear margin. Moreover, the weights assigned by different weighting schemes are analyzed to gain further insight into their working mechanisms. (C) 2019 Elsevier B.V. All rights reserved.
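
The abstract only outlines the architecture; the sketch below is a minimal, hypothetical PyTorch illustration of the general idea as described there: each term receives one weight per class (for example, a class-specific TF-IDF-style score), the word embeddings are scaled by these weights to form one input channel per class, and a multi-channel TextCNN classifies the weighted representation. All names, shapes, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of class-specific term weighting feeding a multi-channel TextCNN.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightedMultiChannelTextCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim, num_classes,
                 kernel_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One convolution per kernel size; input channels = one channel per class,
        # since each class-specific weighting yields its own view of the sentence.
        self.convs = nn.ModuleList(
            nn.Conv2d(num_classes, num_filters, (k, embed_dim)) for k in kernel_sizes
        )
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids, term_weights):
        # token_ids:    (batch, seq_len)             integer word indices
        # term_weights: (batch, seq_len, num_classes) one weight per term per class
        emb = self.embedding(token_ids)                           # (B, L, D)
        # Scale embeddings by each class-specific weight -> one channel per class.
        channels = emb.unsqueeze(1) * term_weights.permute(0, 2, 1).unsqueeze(-1)
        # channels: (B, num_classes, L, D)
        pooled = []
        for conv in self.convs:
            h = F.relu(conv(channels)).squeeze(3)                 # (B, F, L-k+1)
            pooled.append(F.max_pool1d(h, h.size(2)).squeeze(2))  # (B, F)
        return self.fc(torch.cat(pooled, dim=1))                  # (B, num_classes)


# Toy usage with random tensors (shapes only; real term_weights would come from a
# class-specific TF-IDF-style scheme computed on the training corpus).
model = WeightedMultiChannelTextCNN(vocab_size=5000, embed_dim=50, num_classes=3)
tokens = torch.randint(1, 5000, (8, 20))   # batch of 8 sentences, length 20
weights = torch.rand(8, 20, 3)             # per-term, per-class weights
logits = model(tokens, weights)            # (8, 3)
```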
