Improving faithfulness in abstractive text summarization with EDUs using BART

Date
2023-04-27
Authors
Delpisheh, Narjes
University of Lethbridge. Faculty of Arts and Science
Publisher
Lethbridge, Alta. : University of Lethbridge, Dept. of Mathematics and Computer Science
Abstract
Abstractive summarization aims to reproduce the essential information of a source document in a summary written in the summarizer's own words. Although this approach is closer to how humans summarize, it is harder to automate because it requires a deep understanding of natural language. The development of deep learning approaches, such as the sequence-to-sequence model with an attention mechanism, and the availability of pre-trained language models have nevertheless led to improved performance in summarization systems. Nonetheless, abstractive summarization still suffers from issues such as hallucination and unfaithfulness. To address these issues, we propose an approach that uses important Elementary Discourse Units (EDUs) as a guidance signal. We compare our work with a previous guided summarization model and two other summarization models designed to enhance the faithfulness of the summary. Our approach was tested on the CNN/Daily Mail dataset, and the results showed improvements in both faithfulness and coverage of the source document.
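The thesis feeds important EDUs to BART as a guidance signal alongside the source document. The sketch below illustrates only the input-construction step, under stated assumptions: it is not the thesis code, sentence splitting stands in for real EDU segmentation, importance is approximated by keyword overlap, and the `<sep>` / `</s>` separator tokens and the function names are placeholders.

```python
import re

def segment_edus(text):
    """Crude EDU proxy: split on sentence-final punctuation.

    Real EDU segmentation uses a discourse parser; this stand-in only
    illustrates the data flow.
    """
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def select_important_edus(edus, keywords, k=2):
    """Score each EDU by keyword overlap and keep the top-k in document order."""
    scored = [(sum(w.lower() in edu.lower() for w in keywords), i, edu)
              for i, edu in enumerate(edus)]
    top = sorted(scored, key=lambda t: (-t[0], t[1]))[:k]
    return [edu for _, i, edu in sorted(top, key=lambda t: t[1])]

def build_guided_input(source, keywords, sep=" <sep> ", guide_tag="</s>"):
    """Prepend selected EDUs as guidance before the source document,
    producing a single string a BART-style encoder could consume."""
    guidance = sep.join(select_important_edus(segment_edus(source), keywords))
    return f"{guidance} {guide_tag} {source}"
```

In a guided setup like the one the abstract describes, the resulting string would be tokenized and passed to the summarizer's encoder so that generation attends to both the guidance EDUs and the full document.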
Keywords
Faithfulness, Abstractive summarization, Elementary discourse units, EDU, Hallucination, Text summarization, Natural language