Abstract

Few-shot relation extraction aims to recognize novel relations from only a few labeled sentences per relation. Previous metric-based few-shot relation extraction algorithms identify relations by comparing prototypes, built from the embeddings of the few labeled support sentences, with the embeddings of query sentences using a trained metric function. However, because target domains often differ considerably from the training data, these approaches generalize poorly to unseen relations in many domains. Since the prototype is central to recognizing relations between entities in the latent space, we propose learning more interpretable and effective prototypes from prior knowledge and the intrinsic semantics of relations, so that new relations can be extracted more reliably across domains. By exploiting the relationships among relations through prior information, we improve the prototype representations of relations. By using contrastive learning to widen the classification margins between sentence embeddings, we enhance the geometric interpretability of the prototypes. In addition, a transfer learning approach to the cross-domain problem lets the prototype generation process account for the gap between domains, making the prototypes more robust and enabling better extraction of relations across multiple domains. Experimental results on the FewRel benchmark demonstrate the advantages of the proposed method over several state-of-the-art approaches.
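To make the prototype-and-metric setup concrete, the sketch below shows, in PyTorch, how class prototypes are typically built by averaging support-sentence embeddings and compared against query embeddings with a distance metric, together with a supervised contrastive loss that widens the margins between same- and different-relation embeddings. This is a minimal illustration of the general techniques named in the abstract, not the paper's implementation; the function names, the squared-Euclidean metric, the temperature, and the loss weighting are all assumptions.

```python
import torch
import torch.nn.functional as F

def compute_prototypes(support_emb, support_labels, n_way):
    """Average each relation's support-sentence embeddings into one prototype."""
    # support_emb: [n_support, d]; support_labels: [n_support], values in [0, n_way)
    return torch.stack([support_emb[support_labels == c].mean(dim=0)
                        for c in range(n_way)])            # [n_way, d]

def prototype_logits(query_emb, prototypes):
    """Score queries by negative squared Euclidean distance to each prototype."""
    return -torch.cdist(query_emb, prototypes).pow(2)      # [n_query, n_way]

def supervised_contrastive_loss(emb, labels, temperature=0.1):
    """Pull same-relation embeddings together, push different relations apart."""
    emb = F.normalize(emb, dim=-1)
    sim = emb @ emb.t() / temperature                      # pairwise cosine similarities
    n = emb.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=emb.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    # Log-softmax over all other samples, stabilized by subtracting the row max.
    sim = sim - sim.max(dim=1, keepdim=True).values.detach()
    exp_sim = torch.exp(sim).masked_fill(self_mask, 0.0)
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True))
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    return -(log_prob * pos_mask).sum(dim=1).div(pos_count).mean()

# Toy 5-way 1-shot episode; random vectors stand in for a sentence encoder's output.
n_way, dim = 5, 64
support_emb, support_labels = torch.randn(n_way, dim), torch.arange(n_way)
query_emb, query_labels = torch.randn(10, dim), torch.randint(0, n_way, (10,))

prototypes = compute_prototypes(support_emb, support_labels, n_way)
ce_loss = F.cross_entropy(prototype_logits(query_emb, prototypes), query_labels)
con_loss = supervised_contrastive_loss(
    torch.cat([support_emb, query_emb]), torch.cat([support_labels, query_labels]))
total_loss = ce_loss + 0.5 * con_loss                      # weighting is illustrative
```

In this kind of setup, the contrastive term acts on the sentence embeddings themselves, so better-separated embeddings translate directly into better-separated prototypes and cleaner decision boundaries for the query sentences.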
