GATSum: Graph-Based Topic-Aware Abstract Text Summarization

Authors

  • Ming Jiang
  • Yifan Zou, Hangzhou Dianzi University
  • Jian Xu
  • Min Zhang

DOI:

https://doi.org/10.5755/j01.itc.51.2.30796

Keywords:

Text Summarization, Abstract, Neural topic model, BERT, Graph attention network

Abstract

The purpose of text summarization is to compress a text document into a summary containing its key information. Abstractive approaches are challenging: a mechanism is needed to effectively extract salient information from the source text and then generate a summary. However, most existing abstractive approaches struggle to capture global semantics, ignoring the impact of global information on selecting important content. To solve this problem, this paper proposes a Graph-Based Topic-Aware Abstractive Text Summarization (GATSum) framework. Specifically, GATSum seamlessly incorporates a neural topic model to discover latent topic information, which provides document-level features for generating summaries. In addition, the model integrates a graph neural network that effectively captures the relationships between sentences through a graph-structured document representation, updating local and global information simultaneously. Further analysis shows that latent topics help the model capture salient content. We conducted experiments on two datasets, and the results show that GATSum outperforms many extractive and abstractive approaches in terms of ROUGE. An ablation study confirms that the model captures the original topics and correct information, improving the factual accuracy of the generated summaries.
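The sentence-level graph attention described in the abstract can be illustrated with a minimal sketch. This is a generic single-head graph attention layer (in the style of Velickovic et al.'s GAT) over sentence-node features, not the authors' implementation; the function name, shapes, and parameters are all hypothetical, and it is written in plain NumPy for clarity rather than a deep-learning framework.

```python
import numpy as np

def graph_attention(H, A, W, a):
    """Single-head graph attention over sentence nodes (illustrative sketch).

    H: (n, d)  node features (e.g. sentence embeddings)
    A: (n, n)  adjacency matrix of the sentence graph (nonzero = edge)
    W: (d, k)  learned projection matrix
    a: (2k,)   learned attention vector
    Returns (n, k) updated node features.
    """
    Z = H @ W                                   # project node features
    n = Z.shape[0]
    # Pairwise logits e_ij = LeakyReLU(a^T [z_i || z_j])
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            e[i, j] = np.concatenate([Z[i], Z[j]]) @ a
    e = np.where(e > 0, e, 0.2 * e)             # LeakyReLU, slope 0.2
    e = np.where(A > 0, e, -1e9)                # mask non-neighbors
    # Softmax over each node's neighborhood
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ Z                            # aggregate neighbor features
```

With self-loops only in `A`, each node attends solely to itself and the layer reduces to the linear projection `H @ W`; denser adjacency lets each sentence aggregate information from related sentences, which is the mechanism the paper uses to mix local and global context.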

Published

2022-06-23

Issue

Section

Articles