Improving faithfulness in abstractive text summarization with EDUs using BART
University of Lethbridge. Faculty of Arts and Science
Lethbridge, Alta. : University of Lethbridge, Dept. of Mathematics and Computer Science
Abstractive summarization aims to reproduce the essential information of a source document in a summary using the summarizer's own words. Although this approach is closer to how humans summarize, it is harder to automate because it requires a deep understanding of natural language. The development of deep learning approaches, such as sequence-to-sequence models with attention mechanisms, and the availability of pre-trained language models have improved the performance of summarization systems. Nonetheless, abstractive summarization still suffers from issues such as hallucination and unfaithfulness. To address these issues, we propose an approach that uses important Elementary Discourse Units (EDUs) as a guidance signal. We compare our work with a previous guided summarization model and two other summarization models designed to enhance faithfulness. Our approach was tested on the CNN/Daily Mail dataset, and the results showed improvements in both faithfulness and coverage of the source document.
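The abstract describes conditioning a summarizer on a guidance signal built from important EDUs. As a minimal sketch of one plausible input format, the snippet below prepends the selected EDUs to the source document, separated by a special token, producing the kind of guided input a seq2seq model such as BART could consume; the function name, separator token, and input layout are illustrative assumptions, not the thesis's actual implementation.

```python
def build_guided_input(edus, source, sep="</s>"):
    """Form a guided-summarization input (hypothetical format).

    The selected EDUs are joined into a guidance prefix, followed by a
    separator token and the full source document. A seq2seq model would
    then be fine-tuned on (guided input -> reference summary) pairs.
    """
    guidance = " ".join(edus)
    return f"{guidance} {sep} {source}"


source = ("The city council approved the new budget on Tuesday. "
          "The vote followed weeks of public hearings.")
# Hypothetical: assume an upstream selector ranked this EDU as important.
edus = ["The city council approved the new budget on Tuesday."]
print(build_guided_input(edus, source))
```

Keeping the guidance and the source in a single sequence lets a standard encoder attend to both without architectural changes, which is one common way guided summarization is realized in practice.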
Faithfulness, Abstractive summarization, Elementary discourse units, EDU, Hallucination, Text summarization, Natural language