Abstractive text summarization based on neural fusion
University of Lethbridge. Faculty of Arts and Science
Lethbridge, Alta. : University of Lethbridge, Dept. of Mathematics and Computer Science
Abstractive text summarization, in comparison to extractive text summarization, offers the potential to generate more accurate summaries. In our work, we present a stage-wise abstractive text summarization model that incorporates Elementary Discourse Unit (EDU) segmentation, EDU selection, and EDU fusion. We first segment each article into fine-grained units, EDUs, and build a Rhetorical Structure Theory (RST) tree for the article to represent the dependencies among its EDUs. The EDUs are encoded with Graph Attention Networks (GATs), and those of higher importance are selected as candidates for fusion. A greedy method is leveraged to select the EDUs whose combination maximizes the ROUGE scores. The fusion stage is performed by a Bidirectional and Auto-Regressive Transformers (BART) model, which merges the selected EDUs into a summary. Our model outperforms the BART (large) baseline on the CNN/Daily Mail dataset, showing its effectiveness in abstractive text summarization.
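The greedy EDU-selection step described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: it assumes ROUGE-1 F1 as the scoring metric (the work may use a combination of ROUGE variants), and the function and variable names are hypothetical.

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: unigram-overlap F-score between a candidate and a reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def greedy_select_edus(edus, reference, max_edus=3):
    """Greedily add, one at a time, the EDU that most improves the ROUGE-1 F1
    of the running selection against the reference summary; stop when no
    remaining EDU improves the score or max_edus is reached."""
    selected = []          # list of (index, edu_text)
    best_score = 0.0
    while len(selected) < max_edus:
        chosen = set(i for i, _ in selected)
        candidates = []
        for i, edu in enumerate(edus):
            if i in chosen:
                continue
            text = " ".join([t for _, t in selected] + [edu])
            candidates.append((rouge1_f1(text, reference), i, edu))
        if not candidates:
            break
        score, i, edu = max(candidates)
        if score <= best_score:
            break               # no EDU improves the score further
        selected.append((i, edu))
        best_score = score
    return [edu for _, edu in selected]
```

In a training setup, this oracle selection would label which EDUs the GAT-based selector should learn to pick before BART fuses them.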
Abstractive text summarization, Elementary discourse unit, Rhetorical structure theory, Neural networks, Neural fusion