Article

Multi-layered attentional peephole convolutional LSTM for abstractive text summarization

Journal

ETRI JOURNAL
Volume 43, Issue 2, Pages 288-298

Publisher

WILEY
DOI: 10.4218/etrij.2019-0016

Keywords

abstractive text summarization; convolutional long short-term memory; deep neural network; long short-term memory; sequence to sequence modeling


Abstract

Abstractive text summarization is the process of producing a summary of a given text by paraphrasing its facts while keeping the meaning intact. Manual summary generation is laborious and time-consuming. We present a summary generation model based on a multilayered attentional peephole convolutional long short-term memory (MAPCoL) network that extracts abstractive summaries of large texts in an automated manner. We add an attention mechanism to a peephole convolutional LSTM to improve the overall quality of a summary by assigning weights to important parts of the source text during training. We evaluated the semantic coherence of MAPCoL on the popular CNN/Daily Mail dataset and found that it outperformed other traditional LSTM-based models. We also observed performance improvements for MAPCoL under different internal settings when compared with state-of-the-art abstractive text summarization models.
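The two ingredients the abstract names, a peephole LSTM cell and attention weights over the source text, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a plain (non-convolutional) peephole LSTM step and simple dot-product attention, and all function names and weight shapes here are our own assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def peephole_lstm_step(x, h_prev, c_prev, W, U, p, b):
    """One step of a peephole LSTM cell (dense sketch, not the paper's
    convolutional variant). W: input weights (4H x D), U: recurrent
    weights (4H x H), p: peephole vectors (3 x H) for the i/f/o gates,
    b: biases (4H,). Gate order in the stacked weights: i, f, c, o."""
    z = W @ x + U @ h_prev + b
    zi, zf, zc, zo = np.split(z, 4)
    i = sigmoid(zi + p[0] * c_prev)   # input gate "peeks" at the old cell state
    f = sigmoid(zf + p[1] * c_prev)   # forget gate peeks at the old cell state
    c = f * c_prev + i * np.tanh(zc)  # new cell state
    o = sigmoid(zo + p[2] * c)        # output gate peeks at the *new* cell state
    h = o * np.tanh(c)
    return h, c

def dot_attention(h_dec, H_enc):
    """Dot-product attention: weight the encoder states H_enc (T x H)
    by their relevance to the current decoder state h_dec (H,)."""
    scores = H_enc @ h_dec
    w = np.exp(scores - scores.max())
    w = w / w.sum()          # softmax: one weight per source position
    context = w @ H_enc      # weighted sum of encoder states
    return context, w
```

The attention weights `w` are exactly the "weights given to important parts of the source text" that the abstract refers to: positions with higher scores contribute more to the context vector that conditions each generated summary word.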

