A Multi-Size Neural Network with Attention Mechanism for Answer Selection

Published 24 Apr 2021 in cs.CL and cs.AI (arXiv:2105.03278v1)

Abstract: Semantic matching is of central significance to the answer selection task, which aims to select correct answers for a given question from a pool of candidate answers. A useful approach is to employ neural networks with attention to generate sentence representations in such a way that information from the paired sentences can mutually influence the computation of the representations. In this work, an effective architecture, the multi-size neural network with attention mechanism (AM-MSNN), is introduced for the answer selection task. Thanks to its filters of various sizes, this architecture captures more levels of language granularity in parallel than single-layer and multi-layer CNNs. Meanwhile, it extends the sentence representations via an attention mechanism, so they contain more information for different types of questions. An empirical study on three benchmark answer selection tasks demonstrates the efficacy of the proposed model on all benchmarks and its superiority over competing models. The experimental results show that (1) the multi-size neural network (MSNN) captures abstract features at different levels of granularity more effectively than single- or multi-layer CNNs; (2) the attention mechanism (AM) is a better strategy for deriving more informative representations; and (3) AM-MSNN is currently a strong architecture for the answer selection task.
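
The abstract describes two components: parallel convolutions with several filter sizes ("multi-size") and an attention mechanism that lets the question and answer representations influence each other. Below is a minimal, hypothetical PyTorch sketch of that idea; the class names, dimensions, and the specific attentive-pooling form are assumptions for illustration, not the authors' exact model.

```python
# Hypothetical sketch of a multi-size CNN encoder with attentive pooling
# for question-answer matching, loosely following the abstract's description.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiSizeEncoder(nn.Module):
    """Apply convolutions with several kernel sizes in parallel over word embeddings."""
    def __init__(self, embed_dim=300, num_filters=100, kernel_sizes=(1, 2, 3, 5)):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):                      # x: (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                  # -> (batch, embed_dim, seq_len)
        # Each filter size captures a different granularity; concatenate the feature maps.
        feats = [F.relu(conv(x)) for conv in self.convs]
        min_len = min(f.size(2) for f in feats)
        feats = [f[:, :, :min_len] for f in feats]
        return torch.cat(feats, dim=1)         # (batch, num_filters * len(kernel_sizes), len)

class AttentivePooling(nn.Module):
    """Let question and answer feature maps influence each other's pooling weights."""
    def __init__(self, feat_dim):
        super().__init__()
        self.U = nn.Parameter(torch.randn(feat_dim, feat_dim) * 0.01)

    def forward(self, Q, A):                   # Q: (batch, d, m), A: (batch, d, n)
        G = torch.tanh(Q.transpose(1, 2) @ self.U @ A)   # (batch, m, n) soft alignment
        q_attn = F.softmax(G.max(dim=2).values, dim=1)   # attention over question positions
        a_attn = F.softmax(G.max(dim=1).values, dim=1)   # attention over answer positions
        q_vec = (Q * q_attn.unsqueeze(1)).sum(dim=2)     # attended question vector
        a_vec = (A * a_attn.unsqueeze(1)).sum(dim=2)     # attended answer vector
        return q_vec, a_vec

# Score a candidate answer by the similarity of the attended representations.
encoder = MultiSizeEncoder()
pooler = AttentivePooling(feat_dim=400)        # 100 filters x 4 kernel sizes
q_emb = torch.randn(8, 20, 300)                # batch of 8 questions, 20 tokens each
a_emb = torch.randn(8, 40, 300)                # candidate answers, 40 tokens each
q_vec, a_vec = pooler(encoder(q_emb), encoder(a_emb))
scores = F.cosine_similarity(q_vec, a_vec, dim=1)   # higher score = better candidate
```

In this sketch the question and answer share the same encoder, and ranking candidates amounts to sorting them by the similarity score; the paper's actual training objective and attention formulation may differ.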

Citations (2)

