
Concept2Box: Joint Geometric Embeddings for Learning Two-View Knowledge Graphs (2307.01933v1)

Published 4 Jul 2023 in cs.AI, cs.CG, cs.CL, and cs.SC

Abstract: Knowledge graph embeddings (KGE) have been extensively studied to embed large-scale relational data for many real-world applications. Existing methods have long ignored the fact that many KGs contain two fundamentally different views: high-level ontology-view concepts and fine-grained instance-view entities. They usually embed all nodes as vectors in one latent space. However, a single geometric representation fails to capture the structural differences between the two views and lacks probabilistic semantics for concepts' granularity. We propose Concept2Box, a novel approach that jointly embeds the two views of a KG using dual geometric representations. We model concepts with box embeddings, which learn the hierarchy structure and complex relations such as overlap and disjointness among them. Box volumes can be interpreted as concepts' granularity. Unlike concepts, we model entities as vectors. To bridge the gap between concept box embeddings and entity vector embeddings, we propose a novel vector-to-box distance metric and learn both embeddings jointly. Experiments on both the public DBpedia KG and a newly created industrial KG show the effectiveness of Concept2Box.
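The abstract's core idea, concepts as boxes whose volume reflects granularity, entities as points, and a vector-to-box distance bridging the two, can be sketched as follows. This is a minimal illustration, not the paper's actual formulation: Concept2Box's parameterization and its specific distance metric may differ, and the class and function names here (`Box`, `vector_to_box_distance`) are assumptions for the example.

```python
import numpy as np

class Box:
    """A concept box: an axis-aligned region given by min/max corners.
    (Illustrative; the paper's box parameterization may differ.)"""
    def __init__(self, low, high):
        self.low = np.asarray(low, dtype=float)
        self.high = np.asarray(high, dtype=float)

    def volume(self):
        # Box volume can be read as the concept's granularity:
        # broader concepts occupy larger regions.
        return float(np.prod(self.high - self.low))

    def contains(self, point):
        p = np.asarray(point, dtype=float)
        return bool(np.all(p >= self.low) and np.all(p <= self.high))


def vector_to_box_distance(vec, box):
    """A simple vector-to-box distance (an assumption, not the paper's metric):
    zero when the entity vector lies inside the concept box, otherwise the
    Euclidean distance to the nearest point on the box."""
    v = np.asarray(vec, dtype=float)
    nearest = np.clip(v, box.low, box.high)  # closest point inside the box
    return float(np.linalg.norm(v - nearest))


# A broad concept box containing a narrower one (hierarchy via containment).
animal = Box([0.0, 0.0], [4.0, 4.0])
dog = Box([1.0, 1.0], [2.0, 2.0])

print(animal.volume())                            # 16.0 (coarse concept)
print(dog.volume())                               # 1.0  (fine-grained concept)
print(vector_to_box_distance([1.5, 1.5], dog))    # 0.0  (entity inside concept)
print(vector_to_box_distance([5.0, 2.0], animal)) # 1.0  (outside along one axis)
```

During joint training, such a distance would let entity vectors be pulled inside the boxes of the concepts they instantiate, while box-to-box containment and overlap capture the ontology-view structure.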

Authors (11)
  1. Zijie Huang (29 papers)
  2. Daheng Wang (5 papers)
  3. Binxuan Huang (21 papers)
  4. Chenwei Zhang (60 papers)
  5. Jingbo Shang (141 papers)
  6. Yan Liang (62 papers)
  7. Zhengyang Wang (48 papers)
  8. Xian Li (116 papers)
  9. Christos Faloutsos (88 papers)
  10. Yizhou Sun (149 papers)
  11. Wei Wang (1793 papers)
Citations (7)
