Distantly Supervised Relation Extraction in Federated Settings

(arXiv:2008.05049)
Published Aug 12, 2020 in cs.CL and cs.LG

Abstract

This paper investigates distantly supervised relation extraction in federated settings. Previous studies focus on distant supervision under the assumption of centralized training, which requires collecting texts from different platforms and storing them on one machine. However, centralized training is challenged by two issues, namely data barriers and privacy protection, which make it almost impossible or cost-prohibitive to centralize data from multiple platforms. It is therefore worthwhile to investigate distant supervision in the federated learning paradigm, which decouples model training from the need for direct access to the raw data. Overcoming the label noise of distant supervision, however, becomes more difficult in federated settings, since sentences containing the same entity pair may be scattered across different platforms. In this paper, we propose a federated denoising framework to suppress label noise in federated settings. The core of this framework is a multiple instance learning based denoising method that is able to select reliable instances via cross-platform collaboration. Experimental results on the New York Times dataset and a miRNA gene regulation relation dataset demonstrate the effectiveness of the proposed method.
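The abstract does not spell out the training procedure, but the following minimal Python sketch shows, under explicit assumptions, how distant-supervision denoising might be combined with federated averaging: each platform groups its sentences into bags by entity pair, keeps the instance the current global model scores highest in each bag (a standard multiple-instance-learning heuristic), trains locally on those selected instances, and shares only model parameters. The function names, the logistic-regression stand-in, and the toy data are hypothetical and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_denoise_and_update(global_w, bags, bag_labels, lr=0.1):
    """One local round on a single platform.

    MIL-style denoising (illustrative): within each bag of sentences that
    mention the same entity pair, keep only the instance the current global
    model scores highest, then take one logistic-regression gradient step on
    the selected instances. The encoder and selection rule are stand-ins,
    not the paper's architecture.
    """
    w = global_w.copy()
    selected_x, selected_y = [], []
    for bag, label in zip(bags, bag_labels):
        scores = bag @ w                           # score each sentence encoding
        selected_x.append(bag[np.argmax(scores)])  # "at-least-one" assumption
        selected_y.append(label)
    X, y = np.stack(selected_x), np.asarray(selected_y, dtype=float)
    preds = 1.0 / (1.0 + np.exp(-X @ w))           # sigmoid predictions
    grad = X.T @ (preds - y) / len(y)              # logistic-loss gradient
    return w - lr * grad, len(y)

def federated_round(global_w, platforms):
    """FedAvg-style aggregation: each platform trains on its own bags and only
    model parameters (never raw sentences) leave the platform."""
    updates, sizes = [], []
    for bags, labels in platforms:
        w_local, n = local_denoise_and_update(global_w, bags, labels)
        updates.append(w_local)
        sizes.append(n)
    sizes = np.asarray(sizes, dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes / sizes.sum())

# Toy demo: 2 platforms, 3 entity-pair bags each, 4 sentences per bag,
# sentences represented as random 8-dimensional encodings.
dim = 8
platforms = [
    ([rng.normal(size=(4, dim)) for _ in range(3)],
     rng.integers(0, 2, size=3))
    for _ in range(2)
]

w = np.zeros(dim)
for _ in range(5):
    w = federated_round(w, platforms)
print("global weights after 5 rounds:", np.round(w, 3))
```

The point of the sketch is the division of labor: instance selection happens locally against a shared global model, so denoising can benefit from cross-platform collaboration even though sentences mentioning the same entity pair never leave their respective platforms. The paper's actual selection and aggregation mechanisms may differ from this plain FedAvg sketch.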
