
Tensor Decompositions for temporal knowledge base completion (2004.04926v1)

Published 10 Apr 2020 in stat.ML and cs.LG

Abstract: Most algorithms for representation learning and link prediction in relational data have been designed for static data. However, the data they are applied to usually evolves with time, such as friend graphs in social networks or user interactions with items in recommender systems. This is also the case for knowledge bases, which contain facts such as (US, has president, B. Obama, [2009-2017]) that are valid only at certain points in time. For the problem of link prediction under temporal constraints, i.e., answering queries such as (US, has president, ?, 2012), we propose a solution inspired by the canonical decomposition of tensors of order 4. We introduce new regularization schemes and present an extension of ComplEx (Trouillon et al., 2016) that achieves state-of-the-art performance. Additionally, we propose a new dataset for knowledge base completion constructed from Wikidata, larger than previous benchmarks by an order of magnitude, as a new reference for evaluating temporal and non-temporal link prediction methods.

Citations (228)

Summary

  • The paper extends ComplEx by incorporating a timestamp factor into order-4 tensor decomposition, enhancing temporal link prediction.
  • It introduces novel temporal regularization techniques that enforce smooth embedding evolution to capture dynamic data.
  • Evaluated on datasets like ICEWS and Yago, the model significantly improves Mean Reciprocal Rank over state-of-the-art methods.

Tensor Decompositions for Temporal Knowledge Base Completion

This paper discusses advancements in temporal knowledge base completion through tensor decompositions, focusing on order-4 tensor methods. Traditional algorithms for representation learning and link prediction are typically designed for static relational data; the authors instead address the dynamic nature of real-world data, such as friendship graphs in social networks and knowledge base facts that hold only over certain time intervals. The paper proposes a novel approach based on the canonical decomposition of order-4 tensors to improve temporal link prediction performance.

Key Contributions

  1. Extension of ComplEx for Temporal Data: The paper extends the ComplEx model, which has been effective in static knowledge base completion, to temporal data by introducing a timestamp factor into the tensor decomposition. Treating timestamps as an additional tensor mode turns the order-3 ComplEx factorization into an order-4 canonical decomposition, allowing relation embeddings to vary over time.
  2. Regularization Schemes: Novel regularization approaches are introduced to cope with the inherent complexities of temporal data. In particular, the authors impose a temporal smoothness prior that penalizes large differences between the embeddings of consecutive timestamps, so that embeddings evolve slowly over time. This is crucial for accurately capturing the data's temporal dynamics.
  3. Handling Non-Temporal Components: The proposed TNTComplEx model accounts for both temporal and non-temporal predicates within heterogeneous datasets. This dual decomposition distinguishes facts that change over time from those that remain constant, thereby enhancing prediction accuracy.
  4. Development of a Comprehensive Dataset: A large-scale dataset with temporal annotations is constructed from Wikidata to serve as a benchmark for temporal and non-temporal link prediction methods. It is an order of magnitude larger than previous benchmarks, facilitating comprehensive evaluations.
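To make the decomposition concrete, here is a minimal NumPy sketch of scoring functions in the spirit of TComplEx and TNTComplEx as described above. The embedding tables, rank, and sizes are illustrative toy values, not the paper's actual hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                          # embedding rank (toy size)
n_ent, n_rel, n_time = 5, 3, 6

# Complex-valued embedding tables: one row per entity / relation / timestamp.
E = rng.normal(size=(n_ent, d)) + 1j * rng.normal(size=(n_ent, d))
R = rng.normal(size=(n_rel, d)) + 1j * rng.normal(size=(n_rel, d))
R_static = rng.normal(size=(n_rel, d)) + 1j * rng.normal(size=(n_rel, d))
T = rng.normal(size=(n_time, d)) + 1j * rng.normal(size=(n_time, d))

def tcomplex_score(s, r, o, t):
    """TComplEx-style score Re(<e_s, v_r * w_t, conj(e_o)>):
    the timestamp embedding modulates the relation elementwise."""
    return float(np.real(np.sum(E[s] * (R[r] * T[t]) * np.conj(E[o]))))

def tntcomplex_score(s, r, o, t):
    """TNTComplEx-style score: adds a non-temporal relation component
    so static predicates need not depend on the timestamp."""
    rel = R[r] * T[t] + R_static[r]
    return float(np.real(np.sum(E[s] * rel * np.conj(E[o]))))
```

Because the non-temporal component enters additively, a TNTComplEx score is the corresponding TComplEx score plus a purely static ComplEx term, which is what lets the model fall back to a time-independent factorization for predicates that never change.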

Experimental Results and Analysis

Experiments on established datasets such as ICEWS and Yago demonstrate that the proposed models outperform state-of-the-art methods in temporal knowledge base completion. The models achieve higher Mean Reciprocal Rank (MRR) for temporal link prediction, with substantial gains attributable to the new temporal regularization techniques.
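The temporal smoothness prior behind these gains can be sketched as a simple penalty on differences between consecutive timestamp embeddings; the exponent `p` and the omission of the paper's normalization are simplifications for illustration:

```python
import numpy as np

def temporal_smoothness(T, p=2):
    """Penalty sum_t ||w_{t+1} - w_t||_p^p over the timestamp embedding
    table T (one row per timestamp). It encourages embeddings of
    adjacent timestamps to stay close, so they evolve smoothly."""
    diffs = T[1:] - T[:-1]               # consecutive-timestamp differences
    return float(np.sum(np.abs(diffs) ** p))
```

During training, a term of this form is added to the ranking loss with a tunable weight; note that a constant timestamp table incurs zero penalty.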

The paper provides a thorough comparison of the proposed models against existing approaches, such as DE-SimplE and traditional ComplEx, highlighting the benefits of integrating temporal factors and tailored regularization strategies in tensor-based knowledge completion tasks.
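For reference, MRR on a temporal query such as (US, has president, ?, 2012) is computed by scoring every candidate object and ranking the true answer. A minimal unfiltered sketch, with `score_fn` standing in for any scoring model:

```python
import numpy as np

def mrr(score_fn, test_queries, n_entities):
    """Average reciprocal rank of the true object over (s, r, o, t)
    queries, ranking all candidate entities by score (unfiltered)."""
    recip_ranks = []
    for s, r, o, t in test_queries:
        scores = np.array([score_fn(s, r, cand, t) for cand in range(n_entities)])
        # rank of the true object (1 = best); ties broken pessimistically
        rank = 1 + int(np.sum(scores > scores[o]))
        recip_ranks.append(1.0 / rank)
    return float(np.mean(recip_ranks))
```

The paper reports filtered MRR, which additionally removes other known true answers from the candidate ranking; that bookkeeping is omitted here for brevity.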

Implications and Future Directions

The methodology presented has significant implications for real-world applications where knowledge evolves over time, such as recommender systems, dynamic social networks, and other systems that rely heavily on temporal data. By efficiently embedding temporal information within tensor decompositions, these models offer robust and scalable solutions for temporal knowledge base completion.

The authors suggest several future directions for this line of research. One notable path is the exploration of additional temporal regularizers and the examination of their impacts on accuracy. Moreover, further scaling of these models to handle increasingly larger datasets while maintaining prediction quality presents an ongoing challenge that invites continued exploration.

In conclusion, the paper presents a significant advancement in dynamic knowledge base completion, illustrating that incorporating temporal factors through tensor decompositions can substantially enhance predictive performance in time-sensitive applications. Future work should focus on scaling these methods further and exploring the integration of dynamic relational patterns beyond simple temporal structures.
