A Simple but Effective Pluggable Entity Lookup Table for Pre-trained Language Models (2202.13392v3)

Published 27 Feb 2022 in cs.CL

Abstract: Pre-trained language models (PLMs) cannot fully recall the rich factual knowledge about entities exhibited in large-scale corpora, especially rare entities. In this paper, we propose building a simple but effective Pluggable Entity Lookup Table (PELT) on demand by aggregating an entity's output representations over its multiple occurrences in the corpora. PELT can be compatibly plugged into the input to infuse supplemental entity knowledge into PLMs. Compared to previous knowledge-enhanced PLMs, PELT requires only 0.2%-5% of their pre-computation and can acquire knowledge from out-of-domain corpora for domain adaptation scenarios. Experiments on knowledge-related tasks demonstrate that our method, PELT, can flexibly and effectively transfer entity knowledge from related corpora into PLMs with different architectures.
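The abstract's core idea, aggregating an entity's output representations over its occurrences in a corpus, can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the paper's actual PELT construction: the model name, the toy sentences, the mention_token_ids helper, and the mean-pooling choices are all placeholders introduced here for clarity.

```python
# Hypothetical sketch: average an encoder's output representations at an entity's
# mention positions across several sentences to form a single entity vector.
# All names below (model, sentences, helper) are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "bert-base-uncased"  # assumption: any masked-LM style encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# Toy "corpus" containing multiple occurrences of the entity string.
entity = "Mount Everest"
sentences = [
    "Mount Everest is the highest mountain above sea level.",
    "Climbers attempt to reach the summit of Mount Everest every spring.",
    "Mount Everest lies on the border between Nepal and China.",
]

def mention_token_ids(sentence: str, mention: str):
    """Return the encoded sentence and the sub-token positions of the mention."""
    enc = tokenizer(sentence, return_offsets_mapping=True, return_tensors="pt")
    start = sentence.index(mention)
    end = start + len(mention)
    positions = [
        i for i, (s, e) in enumerate(enc["offset_mapping"][0].tolist())
        if s >= start and e <= end and e > s
    ]
    return enc, positions

vectors = []
with torch.no_grad():
    for sent in sentences:
        enc, positions = mention_token_ids(sent, entity)
        enc.pop("offset_mapping")                        # model does not take offsets
        hidden = model(**enc).last_hidden_state[0]       # (seq_len, hidden_size)
        vectors.append(hidden[positions].mean(dim=0))    # pool the mention's sub-tokens

# Aggregate over occurrences. A vector like this could then be plugged in as an
# extra input embedding for the entity, in the spirit of a pluggable lookup table.
entity_vector = torch.stack(vectors).mean(dim=0)
print(entity_vector.shape)  # torch.Size([768]) for bert-base-uncased
```

In this reading, the "lookup table" is simply a mapping from each entity to its aggregated vector, precomputed once and consulted at input time; how PELT actually computes and injects these vectors is specified in the paper itself.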

Authors (5)
  1. Deming Ye (10 papers)
  2. Yankai Lin (125 papers)
  3. Peng Li (390 papers)
  4. Maosong Sun (337 papers)
  5. Zhiyuan Liu (433 papers)
Citations (10)
