
Salience Allocation as Guidance for Abstractive Summarization (2210.12330v1)

Published 22 Oct 2022 in cs.CL, cs.AI, and cs.LG

Abstract: Abstractive summarization models typically learn to capture salient information from scratch, implicitly. Recent literature adds extractive summaries as guidance for abstractive summarization models to provide hints of salient content, achieving better performance. However, extractive summaries as guidance can be overly strict, leading to information loss or noisy signals. Moreover, they cannot easily adapt to documents with varying degrees of abstractiveness. Because the number and allocation of salient content pieces vary, it is hard to find a fixed threshold for deciding which content should be included in the guidance. In this paper, we propose a novel summarization approach with flexible and reliable salience guidance, namely SEASON (SaliencE Allocation as Guidance for Abstractive SummarizatiON). SEASON uses the allocation of salience expectation to guide abstractive summarization and adapts well to articles with different degrees of abstractiveness. Automatic and human evaluations on two benchmark datasets show that the proposed method is effective and reliable. Empirical results on more than one million news articles reveal a natural fifteen-fifty salience split for news-article sentences, providing a useful insight for composing news articles.
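To make the core idea concrete, here is a minimal, hypothetical sketch of salience-allocation guidance, not the authors' released code. It assumes per-sentence salience scores in [0, 1] (e.g., derived from overlap with the reference summary during training), buckets them into discrete salience levels, and injects those levels into the encoder as learned embeddings added to the token embeddings; the names `salience_levels` and `SalienceGuidedEncoder` are invented for illustration.

```python
import torch.nn as nn


def salience_levels(scores, num_levels=4):
    """Bucket per-sentence salience scores (each in [0, 1]) into discrete levels."""
    return [min(int(s * num_levels), num_levels - 1) for s in scores]


class SalienceGuidedEncoder(nn.Module):
    """Token embedding augmented with a salience-level embedding.

    Each token carries the salience level of the sentence it belongs to,
    so the encoder sees a soft hint of expected salience allocation rather
    than a hard include/exclude extractive signal.
    """

    def __init__(self, vocab_size, d_model, num_levels=4):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.sal = nn.Embedding(num_levels, d_model)

    def forward(self, token_ids, token_salience_levels):
        # Both inputs: (batch, seq_len); output: (batch, seq_len, d_model).
        return self.tok(token_ids) + self.sal(token_salience_levels)
```

In such a setup, the levels would come from reference overlap at training time and from a learned salience predictor at inference, which is what would let the guidance adapt to documents of different abstractiveness instead of relying on a fixed extraction threshold.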

Authors (9)
  1. Fei Wang (574 papers)
  2. Kaiqiang Song (32 papers)
  3. Hongming Zhang (111 papers)
  4. Lifeng Jin (24 papers)
  5. Sangwoo Cho (22 papers)
  6. Wenlin Yao (38 papers)
  7. Xiaoyang Wang (134 papers)
  8. Muhao Chen (159 papers)
  9. Dong Yu (329 papers)
Citations (27)
