CoRank: A clustering cum graph ranking approach for extractive summarization (2106.00619v1)

Published 1 Jun 2021 in cs.SI

Abstract: Online information has grown tremendously in today's Internet age, creating a need to extract relevant content from the plethora of available material. Automatic text summarization techniques are widely used to extract useful and relevant information from voluminous sources, enabling users to obtain valuable knowledge in a limited period of time with minimal effort. However, automatically generated summaries often suffer from poor diversity and information coverage. Promising results have been obtained through new techniques based on graph ranking of sentences, clustering, and optimization. This work proposes CoRank, a two-stage sentence selection model that first clusters sentences and then ranks them. The first stage clusters sentences using a novel clustering algorithm; the second selects salient sentences using the CoRank algorithm. The approach targets two objectives, maximum coverage and diversity, achieved by extracting main topics and sub-topics from the original text. The performance of CoRank is validated on the DUC 2001 and DUC 2002 datasets.
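
The abstract does not describe the novel clustering algorithm or the CoRank ranking procedure in detail, so the Python sketch below only illustrates the general two-stage shape of a clustering-then-graph-ranking extractive summarizer. The TF-IDF representation, k-means clustering, and PageRank-style scoring used here are assumed stand-ins rather than the authors' actual method, and the summarize function and its parameters are hypothetical.

# Illustrative sketch only: a generic clustering-then-graph-ranking extractive
# summarizer. The paper's novel clustering algorithm and CoRank scoring are not
# specified in the abstract, so k-means and PageRank are used as placeholders.
import numpy as np
import networkx as nx
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def summarize(sentences, n_clusters=3, sentences_per_cluster=1):
    """Pick salient sentences by clustering, then ranking within each cluster."""
    # Stage 1: represent sentences as TF-IDF vectors and group them into topic clusters.
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(tfidf)

    selected = []
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        if len(idx) == 0:
            continue
        # Stage 2: rank sentences inside the cluster on a cosine-similarity graph.
        sim = cosine_similarity(tfidf[idx])
        np.fill_diagonal(sim, 0.0)
        graph = nx.from_numpy_array(sim)
        scores = nx.pagerank(graph, weight="weight")
        top = sorted(scores, key=scores.get, reverse=True)[:sentences_per_cluster]
        selected.extend(idx[t] for t in top)

    # Preserve the original sentence order in the final summary for readability.
    return [sentences[i] for i in sorted(selected)]


if __name__ == "__main__":
    docs = [
        "Online information has grown rapidly with the Internet.",
        "Automatic summarization extracts relevant content from large texts.",
        "Graph ranking methods score sentences by their centrality.",
        "Clustering groups sentences into topics before selection.",
        "Selecting one sentence per topic improves coverage and diversity.",
        "DUC datasets are commonly used to evaluate extractive summarizers.",
    ]
    print(summarize(docs, n_clusters=3, sentences_per_cluster=1))

Selecting the top-ranked sentence from each cluster is one simple way to pursue the abstract's stated goals of coverage (every topic cluster contributes a sentence) and diversity (sentences from the same cluster are not repeated), though the paper's own selection criteria may differ.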
