ListReader: Extracting List-form Answers for Opinion Questions (2110.11692v1)

Published 22 Oct 2021 in cs.CL

Abstract: Question answering (QA) is a high-level ability of natural language processing. Most extractive machine reading comprehension models focus on factoid questions (e.g., who, when, where) and restrict the output answer to a short and continuous span in the original passage. However, in real-world scenarios, many questions are non-factoid (e.g., how, why) and their answers are organized in a list format that contains multiple non-contiguous spans. Naturally, existing extractive models are by design unable to answer such questions. To address this issue, this paper proposes ListReader, a neural extractive QA model for list-form answers. In addition to learning the alignment between the question and content, we introduce a heterogeneous graph neural network to explicitly capture the associations among candidate segments. Moreover, our model adopts a co-extraction setting that can extract either span- or sentence-level answers, allowing better applicability. Two large-scale datasets of different languages are constructed to support this study. Experimental results show that our model considerably outperforms various strong baselines. Further discussions provide an intuitive understanding of how our model works and where the performance gain comes from.
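The abstract describes a heterogeneous graph over candidate segments, with span-level and sentence-level nodes scored jointly so that several non-contiguous pieces can be extracted as one list-form answer. The sketch below illustrates only that co-extraction idea; the layer structure, graph construction, dimensions, and class names (HeteroSegmentGraphLayer, CoExtractionScorer) are illustrative assumptions in plain PyTorch, not the authors' implementation.

```python
# Hypothetical sketch of co-extraction over a heterogeneous segment graph.
# All names, dimensions, and the adjacency construction are assumptions made
# for illustration; they are not taken from the ListReader paper.
import torch
import torch.nn as nn


class HeteroSegmentGraphLayer(nn.Module):
    """One message-passing step over span-level and sentence-level nodes."""

    def __init__(self, dim: int):
        super().__init__()
        # Separate transforms per (assumed) edge type.
        self.span_from_span = nn.Linear(dim, dim)
        self.span_from_sent = nn.Linear(dim, dim)
        self.sent_from_span = nn.Linear(dim, dim)
        self.sent_from_sent = nn.Linear(dim, dim)
        self.act = nn.ReLU()

    def forward(self, span_h, sent_h, span_adj, cross_adj, sent_adj):
        # span_h: (num_spans, dim), sent_h: (num_sents, dim)
        # *_adj: row-normalized adjacency matrices of the assumed graph.
        new_span = self.act(
            span_adj @ self.span_from_span(span_h)
            + cross_adj @ self.span_from_sent(sent_h)
        )
        new_sent = self.act(
            cross_adj.t() @ self.sent_from_span(span_h)
            + sent_adj @ self.sent_from_sent(sent_h)
        )
        return new_span, new_sent


class CoExtractionScorer(nn.Module):
    """Scores every candidate node; selected nodes form the list answer."""

    def __init__(self, dim: int, layers: int = 2):
        super().__init__()
        self.gnn = nn.ModuleList(HeteroSegmentGraphLayer(dim) for _ in range(layers))
        self.span_score = nn.Linear(dim, 1)
        self.sent_score = nn.Linear(dim, 1)

    def forward(self, span_h, sent_h, span_adj, cross_adj, sent_adj):
        for layer in self.gnn:
            span_h, sent_h = layer(span_h, sent_h, span_adj, cross_adj, sent_adj)
        # Independent sigmoid scores per node allow multiple selections.
        return (
            torch.sigmoid(self.span_score(span_h)).squeeze(-1),
            torch.sigmoid(self.sent_score(sent_h)).squeeze(-1),
        )


if __name__ == "__main__":
    dim, n_spans, n_sents = 64, 10, 4
    span_h = torch.randn(n_spans, dim)   # question-aware span encodings (assumed input)
    sent_h = torch.randn(n_sents, dim)   # question-aware sentence encodings (assumed input)
    span_adj = torch.softmax(torch.randn(n_spans, n_spans), dim=-1)
    cross_adj = torch.softmax(torch.randn(n_spans, n_sents), dim=-1)
    sent_adj = torch.softmax(torch.randn(n_sents, n_sents), dim=-1)
    span_p, sent_p = CoExtractionScorer(dim)(span_h, sent_h, span_adj, cross_adj, sent_adj)
    print(span_p.shape, sent_p.shape)  # torch.Size([10]) torch.Size([4])
```

The key point the sketch is meant to convey is the scoring scheme: per-node probabilities, rather than a single start/end span prediction, are what make an answer composed of multiple non-contiguous segments possible.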

Citations (2)

Authors (3)