Query focused abstractive summarization using BERTSUM model

Date
2020
Authors
Abdullah, Deen Mohammad
University of Lethbridge. Faculty of Arts and Science
Publisher
Lethbridge, Alta. : University of Lethbridge, Dept. of Mathematics and Computer Science
Abstract
In Natural Language Processing, researchers face many challenges in Query Focused Abstractive Summarization (QFAS), where Bidirectional Encoder Representations from Transformers for Summarization (BERTSUM) can be used for both extractive and abstractive summarization. As there are few available datasets for QFAS, we have generated queries for two publicly available datasets, CNN/Daily Mail and Newsroom, according to the context of the documents and their summaries. To generate abstractive summaries, we have applied two different approaches: Query focused Abstractive summarization, and Query focused Extractive then Abstractive summarization. In the first approach, we have sorted the sentences of each document from the most query-related to the least query-related, and in the second approach, we have extracted only the query-related sentences to fine-tune the BERTSUM model. Our experimental results show that both approaches achieve good ROUGE scores on the CNN/Daily Mail and Newsroom datasets.
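The abstract does not specify how query relevance is scored, so the following is only a minimal Python sketch of the preprocessing behind the two approaches, assuming TF-IDF cosine similarity as the relevance measure; the metric, the threshold, and the function names are illustrative assumptions, not the thesis's actual implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def _query_relevance(sentences, query):
    """Score each sentence against the query (assumed: TF-IDF cosine similarity)."""
    vectorizer = TfidfVectorizer()
    # Fit on the sentences plus the query so they share one vocabulary.
    matrix = vectorizer.fit_transform(sentences + [query])
    return cosine_similarity(matrix[:-1], matrix[-1]).ravel()

def sort_by_query(sentences, query):
    """Approach 1: keep all sentences, reordered from most to least query-related."""
    scores = _query_relevance(sentences, query)
    ranked = sorted(zip(scores, sentences), key=lambda pair: -pair[0])
    return [sentence for _, sentence in ranked]

def extract_query_related(sentences, query, threshold=0.1):
    """Approach 2: keep only sentences above a relevance threshold (assumed value)."""
    scores = _query_relevance(sentences, query)
    return [s for s, score in zip(sentences, scores) if score > threshold]
```

In either approach, the resulting sentence sequence would then be fed to BERTSUM for fine-tuning and abstractive summary generation.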
Keywords
Natural language processing; Natural language generation (Computer science); Text data mining; Software engineering; Systems engineering; Dissertations, Academic