
Zero-shot Sequence Labeling: Transferring Knowledge from Sentences to Tokens (1805.02214v1)

Published 6 May 2018 in cs.CL, cs.LG, and cs.NE

Abstract: Can attention- or gradient-based visualization techniques be used to infer token-level labels for binary sequence tagging problems, using networks trained only on sentence-level labels? We construct a neural network architecture based on soft attention, train it as a binary sentence classifier and evaluate against token-level annotation on four different datasets. Inferring token labels from a network provides a method for quantitatively evaluating what the model is learning, along with generating useful feedback in assistance systems. Our results indicate that attention-based methods are able to predict token-level labels more accurately, compared to gradient-based methods, sometimes even rivaling the supervised oracle network.
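The abstract describes training a soft-attention network as a binary sentence classifier and then reading token-level labels off the attention weights. A minimal numpy sketch of that idea is below; the token representations `H`, the single-vector attention parameterization `w_att`, the output weights `w_out`, and the threshold value are all simplifying assumptions for illustration, not the paper's exact architecture (which uses a BiLSTM encoder and learned components trained end-to-end).

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_sentence_classifier(H, w_att, w_out, threshold=0.15):
    """Toy sketch of zero-shot sequence labeling via attention.

    H:     (T, d) token representations (assumed to come from an encoder
           such as a BiLSTM; here they are just given)
    w_att: (d,) attention scoring vector (hypothetical parameterization)
    w_out: (d,) sentence-level output weights (hypothetical)

    Returns the sentence probability, the attention distribution, and
    binary token labels inferred by thresholding the attention weights.
    """
    scores = H @ w_att                    # unnormalized score per token
    alpha = softmax(scores)               # soft attention over tokens
    c = alpha @ H                         # attention-weighted sentence vector
    sent_prob = 1.0 / (1.0 + np.exp(-(c @ w_out)))  # binary sentence output
    token_labels = (alpha > threshold).astype(int)  # zero-shot token labels
    return sent_prob, alpha, token_labels
```

In this sketch the network would be trained only against sentence-level labels (via `sent_prob`); the token-level predictions come for free from `alpha` at test time, which is the transfer the paper evaluates.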

Authors (2)
  1. Marek Rei (52 papers)
  2. Anders Søgaard (122 papers)
Citations (53)
