Recurrent Event Network: Autoregressive Structure Inference over Temporal Knowledge Graphs (1904.05530v4)

Published 11 Apr 2019 in cs.LG, cs.AI, cs.CL, and stat.ML

Abstract: Knowledge graph reasoning is a critical task in natural language processing. The task becomes more challenging on temporal knowledge graphs, where each fact is associated with a timestamp. Most existing methods focus on reasoning at past timestamps and they are not able to predict facts happening in the future. This paper proposes Recurrent Event Network (RE-NET), a novel autoregressive architecture for predicting future interactions. The occurrence of a fact (event) is modeled as a probability distribution conditioned on temporal sequences of past knowledge graphs. Specifically, our RE-NET employs a recurrent event encoder to encode past facts and uses a neighborhood aggregator to model the connection of facts at the same timestamp. Future facts can then be inferred in a sequential manner based on the two modules. We evaluate our proposed method via link prediction at future times on five public datasets. Through extensive experiments, we demonstrate the strength of RE-NET, especially on multi-step inference over future timestamps, and achieve state-of-the-art performance on all five datasets. Code and data can be found at https://github.com/INK-USC/RE-Net.
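
In plain terms, the autoregressive formulation described above treats the sequence of graph snapshots as a generative process. A minimal sketch of this idea, with notation assumed here for illustration (the paper's exact parameterization may differ), is

\[
P(G) \;=\; \prod_{t} P\!\left(G_t \mid G_{t-m}, \ldots, G_{t-1}\right),
\]

where \(G_t\) is the set of facts observed at timestamp \(t\) and \(m\) is the length of the conditioning history; each factor is parameterized by the recurrent event encoder and the neighborhood aggregator.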

Citations (57)

Summary

  • The paper introduces RE-Net, an autoregressive model that predicts future events in temporal knowledge graphs by encoding past event sequences.
  • It features a recurrent event encoder and neighborhood aggregator that effectively capture temporal and structural dependencies.
  • Experiments on five datasets demonstrate state-of-the-art performance in multi-step temporal link prediction, measured by MRR and Hits@10.

Recurrent Event Network: Autoregressive Structure Inference over Temporal Knowledge Graphs

The paper "Recurrent Event Network: Autoregressive Structure Inference over Temporal Knowledge Graphs" introduces a novel method for reasoning over temporal knowledge graphs (TKGs). Unlike traditional knowledge graphs, TKGs associate each fact with a timestamp, adding complexity to the task of knowledge graph reasoning. This paper makes significant strides in predicting future events in TKGs, a capability that has been underexplored.

Key Contributions

  1. Introduction of RE-Net: The proposed Recurrent Event Network (RE-Net) utilizes an autoregressive framework to predict future facts in TKGs. The model encodes past events and interactions within the graph and formulates future predictions based on the learned patterns.
  2. Recurrent Event Encoder and Neighborhood Aggregator: RE-Net’s architecture includes a recurrent event encoder complemented by a neighborhood aggregator. These components allow for the encoding of past sequences and the modeling of concurrent events, capturing temporal and structural dependencies effectively.
  3. Sequential Inference: The model supports multi-step inference over future timestamps. It can predict events without requiring ground-truth facts at the preceding timestamps, a feature that distinguishes it from existing models such as Know-Evolve and DyRep (a minimal sketch of these components follows this list).
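
The sketch below illustrates how such an encoder could be wired together: a neighborhood aggregator pools events that co-occur with a subject entity at each past timestamp, a GRU consumes the pooled summaries, and a decoder scores candidate objects for a query (subject, relation, ?). This is a simplified illustration under stated assumptions, not the authors' implementation; the class name, the mean-pooling aggregator, embedding sizes, and the linear decoder are choices made here for brevity (the released code at https://github.com/INK-USC/RE-Net is the reference).

```python
# Minimal sketch of an RE-Net-style encoder: a neighborhood aggregator pools
# events that share a timestamp, a GRU consumes the pooled summaries, and a
# decoder scores candidate objects for a query (s, r, ?).
# All names, sizes, and the mean-pooling choice are illustrative assumptions.
import torch
import torch.nn as nn


class RecurrentEventEncoder(nn.Module):
    def __init__(self, num_entities, num_relations, dim=200):
        super().__init__()
        self.ent_emb = nn.Embedding(num_entities, dim)
        self.rel_emb = nn.Embedding(num_relations, dim)
        # GRU over per-timestamp neighborhood summaries for a subject entity.
        self.rnn = nn.GRU(input_size=2 * dim, hidden_size=dim, batch_first=True)
        # Scores every candidate object given [subject, relation, history state].
        self.decoder = nn.Linear(3 * dim, num_entities)

    def aggregate(self, subject_id, events):
        """Mean-pool (relation, object) embeddings of the events involving
        `subject_id` at one timestamp; `events` is a list of (s, r, o) ids."""
        pairs = [(r, o) for s, r, o in events if s == subject_id]
        if not pairs:
            return torch.zeros(2 * self.ent_emb.embedding_dim)
        rel = self.rel_emb(torch.tensor([r for r, _ in pairs]))
        obj = self.ent_emb(torch.tensor([o for _, o in pairs]))
        return torch.cat([rel, obj], dim=-1).mean(dim=0)

    def forward(self, subject_id, relation_id, history):
        """`history` is a list of event lists, one per past timestamp."""
        summaries = torch.stack([self.aggregate(subject_id, g) for g in history])
        _, h = self.rnn(summaries.unsqueeze(0))           # hidden state (1, 1, dim)
        s = self.ent_emb(torch.tensor(subject_id))
        r = self.rel_emb(torch.tensor(relation_id))
        logits = self.decoder(torch.cat([s, r, h.squeeze()], dim=-1))
        return logits                                     # scores over all objects
```

For instance, `model = RecurrentEventEncoder(num_entities=100, num_relations=20)` can be queried with `model(3, 7, history)`, where `history` is a list of per-timestamp triple lists; multi-step inference would repeatedly append the model's own predicted facts to `history` instead of ground-truth ones.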

Experimental Evaluation

The authors evaluated RE-Net on five datasets, focusing on temporal link prediction at future timestamps. The datasets are drawn from varied sources, including ICEWS and GDELT. The model consistently achieved state-of-the-art performance, even in multi-step inference scenarios, and significantly outperformed existing methods on Mean Reciprocal Rank (MRR) and Hits@10.
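
As a reminder of how these metrics are computed, the snippet below evaluates a list of ranks (the position each ground-truth entity receives among all scored candidates, with 1 being best). It is a generic sketch of the standard link-prediction protocol, not the paper's evaluation script.

```python
# Generic MRR and Hits@k from per-query ranks of the ground-truth entity.
def mrr(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k=10):
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 4, 12, 2, 57]  # toy example
print(f"MRR = {mrr(ranks):.3f}, Hits@10 = {hits_at_k(ranks, 10):.3f}")
```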

Implications and Future Directions

The introduction of RE-Net offers substantial implications both practically and theoretically. Practically, its ability to model temporal dynamics in knowledge graphs supports applications across numerous domains, from event prediction to complex temporal reasoning tasks. Theoretically, RE-Net’s autoregressive approach provides a framework for exploring more sophisticated temporal and relational dependencies.

Looking forward, further exploration could involve improving computational efficiency and extending the model to handle events that persist over time intervals rather than only discrete occurrences. The modularity of the RE-Net architecture also provides opportunities for integrating additional data modalities or hybrid approaches, potentially expanding its applicability across diverse AI contexts.

Conclusion

This paper presents a significant advancement in the field of temporal knowledge graph reasoning by introducing RE-Net. Through leveraging autoregressive principles and a robust architectural framework, RE-Net sets a new benchmark for the prediction of future interactions in temporal knowledge graphs.
