
An Abstractive Summarizer Based on Improved Pointer-Generator Network

EasyChair Preprint no. 1067

6 pages · Date: May 30, 2019

Abstract

To address the problems of insufficient semantic understanding, weak fluency, and low accuracy of summaries in neural abstractive summarization, an automatic text summarization model is proposed. First, a decoder attention mechanism is introduced into the reference network, which improves the model's ability to understand the source text and to generate words from the vocabulary. Second, a multi-hop attention mechanism improves the extraction of words from the original text, strengthening the model's handling of out-of-vocabulary words. Experimental results on the CNN/Daily Mail dataset show that the model performs well under the standard evaluation metrics and improves summary accuracy and sentence fluency.
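To make the copy-versus-generate behavior mentioned above concrete, here is a minimal sketch (not the authors' code) of the standard pointer-generator mixture: the final word distribution blends a generated vocabulary distribution with a copy distribution taken from the encoder attention weights, which is what lets out-of-vocabulary source words be reproduced. All names, shapes, and the use of PyTorch are illustrative assumptions.

```python
# Sketch of the pointer-generator output layer, under assumed tensor shapes.
import torch
import torch.nn.functional as F

def final_distribution(vocab_logits,      # [batch, vocab_size] decoder scores over the fixed vocab
                       attention,         # [batch, src_len] attention over source tokens
                       src_ids_extended,  # [batch, src_len] source ids in the extended vocab
                       p_gen,             # [batch, 1] generation probability in (0, 1)
                       extended_vocab_size):
    """P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum_i a_i [w_i == w]."""
    p_vocab = F.softmax(vocab_logits, dim=-1)
    batch, vocab_size = p_vocab.shape
    # Pad the vocabulary distribution with zeros for source-only (OOV) words.
    extra = torch.zeros(batch, extended_vocab_size - vocab_size)
    p_extended = torch.cat([p_gen * p_vocab, extra], dim=-1)
    # Scatter the copy probability mass onto the positions of the source tokens.
    p_extended = p_extended.scatter_add(1, src_ids_extended, (1.0 - p_gen) * attention)
    return p_extended

if __name__ == "__main__":
    torch.manual_seed(0)
    batch, vocab, src_len, oov = 2, 10, 5, 3
    logits = torch.randn(batch, vocab)
    attn = F.softmax(torch.randn(batch, src_len), dim=-1)
    src_ids = torch.randint(0, vocab + oov, (batch, src_len))
    p_gen = torch.sigmoid(torch.randn(batch, 1))
    dist = final_distribution(logits, attn, src_ids, p_gen, vocab + oov)
    print(dist.sum(dim=-1))  # each row sums to 1
```

The multi-hop attention described in the abstract would refine the attention weights over several passes before they enter this mixture; the mixing step itself stays the same.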

Keyphrases: Abstractive Summarization, Attention Mechanism, Pointer-Generator Network, Recurrent Neural Network

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following workaround produces a correct reference:
@Booklet{EasyChair:1067,
  author = {Wenbo Nie and Wei Zhang and Xinle Li and Yao Yu},
  title = {An Abstractive Summarizer Based on Improved Pointer-Generator Network},
  howpublished = {EasyChair Preprint no. 1067},
  year = {EasyChair, 2019}}