Article

A comparative analysis of transformer based models for figurative language classification

Journal

COMPUTERS & ELECTRICAL ENGINEERING
Volume 101, Issue -, Pages -

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.compeleceng.2022.108051

Keywords

Figurative language; Sentiment analysis; Sarcasm; Natural language processing; Long Short Term Memory (LSTM); Transformers architecture; Fine tuning; Hyperbole; Rhetorical questions

Abstract

Efficient and effective methods are required to construct a model that rapidly extracts different sentiments from large volumes of text. To improve model performance, researchers have drawn on contemporary developments in Natural Language Processing (NLP), working on several model architectures and pretraining tasks. This work explores several models based on the Transformer architecture and analyzes their performance. The researchers use a dataset to answer the question of whether Transformers work well for figurative language classification, and not just literal language classification. The results of the various models developed over the course of this research are compared. The study explains why it is necessary for computers to understand figurative language, why it remains a challenge under active investigation, and how it differs from literal language classification. This research also covers how well these models train on a specific type of figurative language and generalize to a few other similar types.
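To make the setup the abstract describes more concrete, the following is a minimal sketch, not the authors' code, of fine-tuning a pretrained Transformer for binary figurative-versus-literal classification using the Hugging Face transformers library. The bert-base-uncased checkpoint, the label scheme, and the example sentences are all illustrative assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed checkpoint and label scheme: 0 = literal, 1 = figurative.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Toy examples: one sarcastic (figurative) sentence, one literal sentence.
texts = [
    "Oh great, another Monday. Just what I needed.",
    "The meeting starts at 9 a.m. on Monday.",
]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=labels)  # loss is computed when labels are given

# A single fine-tuning step; a real run would loop over a labeled dataset
# for several epochs and evaluate on held-out figurative-language examples.
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()

predictions = outputs.logits.argmax(dim=-1)
print("loss:", outputs.loss.item(), "predictions:", predictions.tolist())
```

Under this kind of setup, the generalization question the abstract raises could be probed by fine-tuning on one figurative type (e.g., sarcasm) and evaluating on another (e.g., hyperbole or rhetorical questions).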
