Article

Back to common sense: Oxford dictionary descriptive knowledge augmentation for aspect-based sentiment analysis

Journal

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.ipm.2022.103260

Keywords

Natural language understanding; Aspect-based sentiment analysis; Knowledge infusion mechanisms; Pre-trained language models; Model hot-plugging technique


Aspect-based Sentiment Analysis (ABSA) is a crucial natural language understanding (NLU) research field which aims to accurately recognize reviewers' opinions on different aspects of products and services. Despite the prominence of recent ABSA applications, mainstream ABSA approaches inevitably rely on large-scale supervised corpora, and their final performance is susceptible to the quality of the training datasets. However, annotating sufficient data is labour-intensive, which presents a significant barrier to generalizing a high-quality sentiment analysis model. Nonetheless, humans can make more accurate judgements based on their external background knowledge, such as factoid triple knowledge and event causality. Inspired by investigations of external knowledge enhancement strategies in other popular NLP research, we propose a novel knowledge augmentation framework for ABSA, named the Oxford Dictionary descriptive knowledge-infused aspect-based sentiment analysis (DictABSA). Comprehensive experiments with many state-of-the-art approaches on several widely used benchmarks demonstrate that our proposed DictABSA significantly outperforms previous mainstream ABSA methods. For instance, compared with the baselines, our BERT-based knowledge infusion strategy achieves a substantial 6.42% and 5.26% absolute accuracy gain when adopting BERT-SPC on SemEval2014 and ABSA-DeBERTa on ACLShortData, respectively. Furthermore, to make effective use of dictionary knowledge, we devise several alternative knowledge infusion strategies. Extensive experiments using different knowledge infusion strategies further demonstrate that the proposed strategies effectively enhance the sentiment polarity identification capability. The Python implementation of our DictABSA is publicly available at https://github.com/albert-jin/DictionaryFused-E2E-ABSA.
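The abstract does not specify the exact infusion mechanism; one plausible strategy, sketched below, is to append an aspect term's dictionary gloss to the second segment of a BERT-SPC-style sentence-pair input before classification. The function name `build_dict_infused_input` and the `gloss_lookup` dictionary are hypothetical illustrations, not the authors' actual implementation.

```python
def build_dict_infused_input(sentence: str, aspect: str, gloss_lookup: dict) -> str:
    """Construct a BERT-SPC-style sentence-pair string in which the aspect
    term is augmented with its dictionary gloss (hypothetical strategy).

    The first segment is the review sentence; the second segment is the
    aspect term, optionally followed by its descriptive gloss.
    """
    gloss = gloss_lookup.get(aspect.lower(), "")
    second_segment = f"{aspect} : {gloss}" if gloss else aspect
    return f"[CLS] {sentence} [SEP] {second_segment} [SEP]"


# Toy gloss store standing in for Oxford Dictionary descriptive knowledge.
glosses = {
    "battery": "a container of cells that supplies electric power",
}

print(build_dict_infused_input("The battery dies fast.", "battery", glosses))
```

The resulting string would then be tokenized and fed to a pre-trained encoder such as BERT for sentiment polarity classification; aspects missing from the gloss store simply fall back to the plain BERT-SPC input format.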

Authors


