
Knowledge Graph Embedding with Entity Neighbors and Deep Memory Network (1808.03752v1)

Published 11 Aug 2018 in cs.CL and cs.LG

Abstract: Knowledge Graph Embedding (KGE) aims to represent the entities and relations of a knowledge graph in a low-dimensional continuous vector space. Recent works focus on combining structural knowledge with additional information, such as entity descriptions, relation paths and so on. However, commonly used additional information usually contains plenty of noise, which makes it hard to learn valuable representations. In this paper, we propose a new kind of additional information, called entity neighbors, which contain both semantic and topological features about a given entity. We then develop a deep memory network model to encode information from neighbors. Employing a gating mechanism, representations of structure and neighbors are integrated into a joint representation. The experimental results show that our model outperforms existing KGE methods utilizing entity descriptions and achieves state-of-the-art metrics on 4 datasets.

Citations (18)

Summary

  • The paper presents a novel approach using entity neighbors and deep memory networks to improve knowledge graph embeddings.
  • It employs dual extraction of topological and semantic neighbors to enhance representations in sparse data conditions.
  • Experimental evaluations on FB15k, FB15k-237, WN18, and WN18RR show superior link prediction performance over traditional KGE models.

Knowledge Graph Embedding with Entity Neighbors and Deep Memory Network

Introduction to Knowledge Graph Embedding

Knowledge Graph Embedding (KGE) is a pivotal technique in artificial intelligence, translating the rich and complex structure of knowledge graphs (KGs) into low-dimensional continuous vector spaces. This transformation facilitates reasoning and predictions by leveraging algebraic computations. Traditional methods, such as TransE, struggle when handling entities with few relational facts, as these methods rely solely on triplet data. Hence, recent approaches have incorporated additional information sources like entity descriptions and relation paths, which add semantic and topological features necessary for comprehensive entity representation.

Figure 1: Example of entity descriptions and relation paths for a triplet in Freebase.
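TransE's reliance on triplet data alone can be made concrete through its scoring function: a triplet (h, r, t) is considered plausible when the translated head embedding lands near the tail, i.e. h + r ≈ t. A minimal NumPy sketch (the embedding values are illustrative, not from the paper):

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility score: lower ||h + r - t|| means a more
    plausible triplet (h, r, t). The L1 norm is used here; L2 also works."""
    return np.sum(np.abs(h + r - t))

# Toy 4-dimensional embeddings (illustrative values only).
h = np.array([0.1, 0.2, 0.3, 0.4])
r = np.array([0.5, 0.1, -0.2, 0.0])
t = np.array([0.6, 0.3, 0.1, 0.4])

true_score = transe_score(h, r, t)       # near zero: h + r ≈ t
corrupt_score = transe_score(h, r, -t)   # a corrupted tail scores worse
```

With only this signal, an entity that appears in few triplets receives few training updates, which is exactly the sparsity problem the paper targets.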

Entity Neighbors as Additional Information

The paper introduces "entity neighbors" as a novel kind of auxiliary data to improve KGE. Entity neighbors encapsulate both topological and semantic information about an entity. Topological neighbors are derived from the entity's connections within the KG, while semantic neighbors are extracted from text descriptions and include entities that mention the specific entity or are mentioned within its description. This dual approach ensures semantic richness and contextual simplicity, mitigating noise typically encountered in verbose textual descriptions. The proposed method optimizes for common scenarios where data might be sparse or descriptions are missing.
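The two neighbor types described above can be sketched as follows. The triplet format, the token-level description matching, and the frequency-based top-k selection are simplifying assumptions for illustration, not the paper's exact extraction procedure:

```python
from collections import Counter

def topological_neighbors(entity, triplets, k=5):
    """Entities directly linked to `entity` in the KG, most frequent first.
    `triplets` is an iterable of (head, relation, tail) tuples."""
    linked = []
    for h, _, t in triplets:
        if h == entity and t != entity:
            linked.append(t)
        elif t == entity and h != entity:
            linked.append(h)
    return [e for e, _ in Counter(linked).most_common(k)]

def semantic_neighbors(entity, descriptions, k=5):
    """Entities mentioned in `entity`'s description, plus entities whose
    descriptions mention `entity`. `descriptions` maps an entity name to
    a list of tokens (entity mentions assumed already linked)."""
    found = [e for e in descriptions.get(entity, []) if e in descriptions]
    found += [e for e, toks in descriptions.items()
              if e != entity and entity in toks]
    return list(dict.fromkeys(found))[:k]  # dedupe, preserve order

# Toy example (hypothetical KG fragment).
triplets = [("Paris", "capital_of", "France"),
            ("Paris", "located_in", "Europe"),
            ("Lyon", "located_in", "France")]
topo = topological_neighbors("Paris", triplets)

descriptions = {"Paris":  ["capital", "of", "France"],
                "France": ["country", "containing", "Paris"]}
sem = semantic_neighbors("Paris", descriptions)
```

Because neighbors are themselves entities of the KG, this representation stays compact even when full textual descriptions are long or missing entirely.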

NKGE Model and Deep Memory Network Encoder

The Neighborhood Knowledge Graph Embedding (NKGE) model pairs a structure-based embedding component, instantiated with either TransE or ConvE, with a deep memory network (DMN) encoder for entity neighbors. Memory networks are a novel choice in this setting: the DMN encoder stacks multiple attention layers that iteratively refine the neighbor representation, with higher layers capturing increasingly abstract semantic relations among neighbors.

Figure 2: An illustration of DMN encoder with three layers.
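The multi-hop attention idea behind the DMN encoder can be sketched as below. This is a simplified illustration: the paper's layers may include learned linear transforms and layer-specific parameters that this sketch omits:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dmn_encode(query, memories, n_hops=3):
    """Multi-hop attention over neighbor memories, in the spirit of a
    deep memory network. `memories` is an (n_neighbors, d) matrix of
    neighbor embeddings; `query` is the entity's current representation."""
    q = query
    for _ in range(n_hops):
        attn = softmax(memories @ q)   # relevance of each neighbor slot
        summary = attn @ memories      # attention-weighted neighbor mixture
        q = q + summary                # residual update of the query per hop
    return q

# Toy example: three orthogonal neighbor embeddings, query aligned with one.
memories = np.eye(3)
out = dmn_encode(np.array([1.0, 0.0, 0.0]), memories)
```

Each hop lets the query re-weight the neighbors in light of what the previous hop attended to, which is what allows deeper layers to pick up more abstract patterns.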

The NKGE model integrates the structure-based entity representation with the neighbor representation through a gating mechanism, which adaptively weights the two information sources. This retains the strengths of both structural and neighbor-derived features, yielding embeddings that better withstand the sparsity and noise prevalent in KG data.

Figure 3: The general architecture of NKGE model.
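The gating idea can be sketched as an element-wise interpolation between the two representations. Here `gate_param` is a hypothetical learned per-entity vector standing in for the paper's gate parameters:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_join(structure_emb, neighbor_emb, gate_param):
    """Joint entity representation: an element-wise gate decides, per
    dimension, how much to take from the structure-based embedding vs.
    the neighbor-based one. `gate_param` is a learned vector (hypothetical
    stand-in for the model's gate)."""
    g = sigmoid(gate_param)
    return g * structure_emb + (1.0 - g) * neighbor_emb

# With gate_param = 0, the gate is 0.5 everywhere: an even blend.
s = np.array([1.0, 2.0, 3.0])
n = np.array([4.0, 5.0, 6.0])
joint = gated_join(s, n, np.zeros(3))
```

Because the gate is learned, an entity with many reliable triplets can lean on its structural embedding, while a sparse entity can lean on its neighbors.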

Experimental Evaluation and Results

The NKGE model is evaluated on four standard benchmarks: FB15k, FB15k-237, WN18, and WN18RR. Measured by Mean Rank (MR), Mean Reciprocal Rank (MRR), and Hits@10, the results show substantial gains on link prediction. Both the NKGE (TransE) and NKGE (ConvE) variants outperform established baselines, surpassing models such as ComplEx and ANALOGY on several metrics and achieving state-of-the-art results.

Figure 4: The quantitative distribution of different types of neighbors on FB15k-237.
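The three reported metrics all follow from the filtered rank of the true entity in each test query; a minimal sketch:

```python
def ranking_metrics(ranks, k=10):
    """Link-prediction metrics from the ranks of the true entities:
    Mean Rank (lower is better), Mean Reciprocal Rank (higher is better),
    and Hits@k (fraction of queries with rank <= k)."""
    n = len(ranks)
    mr = sum(ranks) / n
    mrr = sum(1.0 / r for r in ranks) / n
    hits = sum(1 for r in ranks if r <= k) / n
    return mr, mrr, hits

# Example: true entities ranked 1, 2, 5, and 20 across four test triplets.
mr, mrr, hits10 = ranking_metrics([1, 2, 5, 20])
# mr = 7.0, mrr = 0.4375, hits10 = 0.75
```

MRR and Hits@10 are dominated by the easy queries ranked near the top, while MR is sensitive to the hardest queries, which is why papers typically report all three.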

An additional experiment contrasting entity neighbors with entity descriptions supports the design choice: NKGE, using entity neighbors and the DMN encoder, consistently achieves better ranking performance than description-based methods.

Implications and Future Directions

This research underscores the viability of employing entity neighbors and memory networks to enhance KGE, with particular efficacy in scenarios plagued by sparsity and noise. The approach is particularly valuable in domains where acquiring extensive relational data is challenging. Future exploration includes refining neighbor selection mechanisms and enhancing gating processes by incorporating relation-specific data. Moreover, NKGE's potential for sparse KG completion is promising, suggesting a range of applications across various real-world KGs.

In conclusion, the experimental results highlight both the conceptual rigor and practical efficacy of the NKGE model, offering a substantial advance in the scalability and accuracy of knowledge graph embeddings. Future developments will likely integrate further enriched semantic data, thereby strengthening AI applications that rely on robust reasoning capacities.
