Article

Somun: entity-centric summarization incorporating pre-trained language models

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 33, Issue 10, Pages 5301-5311

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-020-05319-2

Keywords

Automatic text summarization; Language models; Harmonic centrality

Abstract

Text summarization addresses the problem of capturing essential information from a large volume of text. Existing methods depend either on end-to-end models or on hand-crafted preprocessing steps. In this study, we propose an entity-centric summarization method that extracts named entities and builds a small graph using a dependency parser. To extract entities, we employ well-known pre-trained language models. After constructing the graph, we perform summarization by ranking entities with the harmonic centrality algorithm. Experiments show that our method outperforms state-of-the-art unsupervised learning baselines, improving ROUGE-1 by more than 10% and ROUGE-2 by more than 50%. Moreover, we achieve results comparable to recent end-to-end models.
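The entity-ranking step described above can be sketched with off-the-shelf tools. The toy graph below is an assumption for illustration: the paper builds its graph from named entities and dependency-parse relations via a pre-trained language model, whereas here entities are given by hand and linked when they co-occur in a sentence. NetworkX's `harmonic_centrality` stands in for the ranking stage.

```python
import networkx as nx

# Hypothetical per-sentence entity lists; in the actual method these
# would come from a pre-trained NER model and a dependency parser.
sentences = [
    ["Somun", "ROUGE"],
    ["Somun", "BERT"],
    ["BERT", "ROUGE"],
    ["Somun", "Springer"],
]

# Link entities that co-occur in the same sentence (a simplifying
# assumption standing in for dependency-based edges).
g = nx.Graph()
for ents in sentences:
    for i, a in enumerate(ents):
        for b in ents[i + 1:]:
            g.add_edge(a, b)

# Harmonic centrality of node v: sum over all other nodes u of 1/d(u, v),
# where d is shortest-path distance. Higher score = more central entity.
scores = nx.harmonic_centrality(g)
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # entities ordered from most to least central
```

A summary would then be assembled from the sentences that mention the top-ranked entities; that selection step is omitted here.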
