Query-focused abstractive summarization using sequence-to-sequence and transformer models
Date
2019
Authors
Polash, Md Mainul Hasan
University of Lethbridge. Faculty of Arts and Science
Publisher
Lethbridge, Alta. : University of Lethbridge, Dept. of Mathematics and Computer Science
Abstract
Query-Focused Summarization (QFS) summarizes a long document with respect to a given input query. Generating a query-focused abstractive summary with a neural network model is a difficult task that has yet to be fully solved. In this thesis, we propose two neural network models for query-focused abstractive summarization. The first is based on the sequence-to-sequence architecture with a pointer-generator mechanism; the second uses the transformer architecture. We train both models on the Debatepedia dataset so that they learn to summarize a long document with respect to a query, and we evaluate their output against human-written reference summaries. Our transformer model outperforms our sequence-to-sequence model on all ROUGE scores.
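The pointer-generator mechanism mentioned in the abstract mixes two distributions at each decoding step: a generation distribution over a fixed vocabulary and a copy distribution induced by attention over the source tokens. A minimal pure-Python sketch of that mixture, with toy probabilities; the function name and example values are illustrative, not taken from the thesis:

```python
def pointer_generator_dist(p_gen, vocab_dist, attention, source_tokens):
    """Mix a vocabulary distribution with a copy distribution.

    p_gen: probability of generating from the fixed vocabulary (in [0, 1])
    vocab_dist: dict mapping token -> probability under the generator
    attention: attention weights, one per source position (sums to 1)
    source_tokens: source tokens aligned with the attention weights
    Returns a dict mapping token -> final probability.
    """
    # Generation path: scale the vocabulary distribution by p_gen.
    final = {tok: p_gen * p for tok, p in vocab_dist.items()}
    # Copy path: attention mass flows to the source tokens, scaled by (1 - p_gen).
    # Out-of-vocabulary source tokens enter the distribution here.
    for tok, a in zip(source_tokens, attention):
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * a
    return final

# Toy example: "lethbridge" is out of vocabulary but can still be copied.
vocab_dist = {"the": 0.6, "university": 0.4}
attention = [0.1, 0.2, 0.7]
source = ["the", "university", "lethbridge"]
dist = pointer_generator_dist(0.5, vocab_dist, attention, source)
# "lethbridge" receives 0.5 * 0.7 = 0.35 purely from the copy path,
# and the mixture remains a valid probability distribution (sums to 1).
```

This copy path is what lets the sequence-to-sequence model reproduce query-specific terms (names, rare words) that a fixed vocabulary would otherwise miss.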
Keywords
Natural language processing, Neural computers, Software engineering, Systems engineering, Text data mining