Article

ASK-RoBERTa: A pretraining model for aspect-based sentiment classification via sentiment knowledge mining

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 253, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2022.109511

Keywords

Aspect-based sentiment classification; RoBERTa; Sentiment knowledge; Dependency grammar; Knowledge mining

Funding

  1. Technology Innovation Special Program of Hubei Province [2022BAA044, 2021BAA188]
  2. Key Project of Science and Technology Research Program of Hubei Provincial Education Department [D20201006]
  3. National Natural Science Foundation of China [61977021]

Abstract

The main objective of aspect-based sentiment classification (ABSC) is to predict the sentiment polarities of different aspects mentioned in sentences or documents. Recent research integrates sentiment knowledge into pretraining models, whose accuracy directly affects ABSC performance. This paper introduces a sentiment knowledge-adaptive pretraining model (ASK-RoBERTa). A sentiment word dictionary is first built from general-purpose and domain-specific sentiment words. We then develop a series of aspect-term and sentiment-word mining rules based on part-of-speech tagging and sentence dependency grammar; these rules account for word dependencies, compounding, and conjunctions. The pretraining model optimizes the mining rules to capture the dependencies between aspects and sentiment words. Experimental results on multiple public benchmark datasets demonstrate the satisfactory performance of ASK-RoBERTa.
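
The abstract describes the mining rules only at a high level. As a rough illustration of the general technique (not the authors' actual rules), the sketch below uses spaCy's part-of-speech tags and dependency parse to pair aspect terms with sentiment words via adjectival-modifier, copular-predicate, and conjunction patterns. The seed dictionary SENTIMENT_WORDS and the function mine_pairs are hypothetical stand-ins for the paper's dictionary and rule set.

    # A minimal sketch of dependency-based aspect-sentiment mining, assuming
    # spaCy and its small English model are installed:
    #   pip install spacy && python -m spacy download en_core_web_sm
    # SENTIMENT_WORDS and mine_pairs are illustrative stand-ins, not the
    # paper's dictionary or rules.
    import spacy

    nlp = spacy.load("en_core_web_sm")

    # Toy seed dictionary standing in for the general/domain sentiment lexicon.
    SENTIMENT_WORDS = {"great", "tasty", "slow", "friendly", "terrible"}

    def mine_pairs(sentence):
        """Pair aspect terms with sentiment words via three dependency patterns."""
        doc = nlp(sentence)
        pairs = []
        for tok in doc:
            # Pattern 1: adjectival modifier -- "tasty food".
            if tok.dep_ == "amod" and tok.lemma_.lower() in SENTIMENT_WORDS:
                pairs.append((tok.head.text, tok.text))
            # Pattern 2: copular predicate -- "the service was slow";
            # the adjective (acomp) links to the clause's subject (nsubj).
            if tok.dep_ == "acomp" and tok.lemma_.lower() in SENTIMENT_WORDS:
                for subj in (c for c in tok.head.children if c.dep_ == "nsubj"):
                    pairs.append((subj.text, tok.text))
                    # Pattern 3: conjunction -- "slow and terrible" shares
                    # the same aspect term.
                    for conj in tok.conjuncts:
                        if conj.lemma_.lower() in SENTIMENT_WORDS:
                            pairs.append((subj.text, conj.text))
        return pairs

    print(mine_pairs("The food was tasty but the service was slow and terrible."))
    # Expected output (parses may vary by model version):
    # [('food', 'tasty'), ('service', 'slow'), ('service', 'terrible')]

Per the abstract, the paper goes further than a fixed rule set of this kind: the pretraining stage itself optimizes the mining rules so that the model learns the aspect-sentiment dependencies rather than relying on hand-tuned patterns alone.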
