Article

Words, Tweets, and Reviews: Leveraging Affective Knowledge Between Multiple Domains

Journal

Cognitive Computation
Volume 14, Issue 1, Pages 388-406

Publisher

Springer
DOI: 10.1007/s12559-021-09923-9

Keywords

Transfer learning; Sentiment analysis; Affect in language

Funding

  1. ANID FONDECYT [11200290]
  2. U-Inicia VID Project [UI-004/20]
  3. ANID - Millennium Science Initiative Program [ICN17_002]

This paper discusses three popular application domains of sentiment and emotion analysis: automatic rating of movie reviews, extracting opinions and emotions on Twitter, and inferring sentiment and emotion associations of words. The study proposes a method for transferring affective knowledge between words, tweets, and movie reviews using Word2Vec static embeddings and BERT contextualized embeddings, and finds that affective knowledge can be successfully transferred among the three domains.

Three popular application domains of sentiment and emotion analysis are: 1) the automatic rating of movie reviews, 2) extracting opinions and emotions on Twitter, and 3) inferring sentiment and emotion associations of words. The textual elements of these domains differ in their length, i.e., movie reviews are usually longer than tweets and words are obviously shorter than tweets, but they also share the property that they can be plausibly annotated according to the same affective categories (e.g., positive, negative, anger, joy). Moreover, state-of-the-art models for these domains are all based on the approach of training supervised machine learning models on manually annotated examples. This approach suffers from an important bottleneck: Manually annotated examples are expensive and time-consuming to obtain and not always available. In this paper, we propose a method for transferring affective knowledge between words, tweets, and movie reviews using two representation techniques: Word2Vec static embeddings and BERT contextualized embeddings. We build compatible representations for movie reviews, tweets, and words, using these techniques, and train and evaluate supervised models on all combinations of source and target domains. Our experimental results show that affective knowledge can be successfully transferred between our three domains, that contextualized embeddings tend to outperform their static counterparts, and that better transfer learning results are obtained when the source domain has longer textual units than the target domain.
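
The abstract describes the transfer setup at a high level: embed texts from each domain into a shared vector space, train a supervised affect classifier on one (source) domain, and evaluate it on another (target) domain. The sketch below illustrates that pipeline with averaged static word vectors and a logistic-regression classifier; the `word_vectors` mapping, the function names, and the choice of classifier are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def embed_text(text, word_vectors, dim=300):
    """Average the static vectors of a text's tokens so that single words,
    tweets, and full movie reviews all map to the same fixed-size space."""
    vecs = [word_vectors[tok] for tok in text.lower().split() if tok in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def transfer_score(src_texts, src_labels, tgt_texts, tgt_labels, word_vectors):
    """Train an affect classifier on the source domain and report its
    accuracy on the target domain (e.g., train on reviews, test on tweets)."""
    X_src = np.vstack([embed_text(t, word_vectors) for t in src_texts])
    X_tgt = np.vstack([embed_text(t, word_vectors) for t in tgt_texts])
    clf = LogisticRegression(max_iter=1000).fit(X_src, src_labels)
    return clf.score(X_tgt, tgt_labels)
```

Under the same assumptions, the BERT variant described in the abstract would swap `embed_text` for a pooled contextualized encoding of each text (for example, mean-pooling the final hidden states), leaving the cross-domain train-on-source, test-on-target loop unchanged.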
