Enhancing Asynchronous Time Series Forecasting with Contrastive Relational Inference (2309.02868v2)

Published 6 Sep 2023 in cs.LG

Abstract: Asynchronous time series, also known as temporal event sequences, are the basis of many applications across different industries. Temporal point processes (TPPs) are the standard method for modeling such data. Existing TPP models have focused on parameterizing the conditional distribution of future events rather than explicitly modeling event interactions, which poses challenges for event prediction. In this paper, we propose a novel approach that leverages Neural Relational Inference (NRI) to learn a relation graph that infers interactions while simultaneously learning the dynamics patterns from observational data. Our approach, the Contrastive Relational Inference-based Hawkes Process (CRIHP), reasons about event interactions under a variational inference framework. It uses intensity-based learning to search for prototype paths that contrast relationship constraints. Extensive experiments on three real-world datasets demonstrate the effectiveness of our model in capturing event interactions for event sequence modeling tasks. Code will be integrated into the EasyTPP framework.
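
The abstract describes CRIHP at a high level: a latent relation graph over event types is inferred (in the spirit of NRI) and used to modulate a Hawkes-style conditional intensity. As an illustration of the graph-gated intensity idea only, below is a minimal PyTorch sketch; it is not the authors' implementation, and the class and parameter names (`RelationalHawkesSketch`, `edge_logits`, etc.) are hypothetical placeholders.

```python
# Minimal sketch (not the paper's code): sample a latent relation graph over
# event types with Gumbel-softmax, then let that graph gate a Hawkes-style
# excitation term in the conditional intensity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalHawkesSketch(nn.Module):
    def __init__(self, num_types: int, tau: float = 0.5):
        super().__init__()
        self.num_types = num_types
        self.tau = tau
        # Logits over "edge absent / edge present" for every ordered type pair.
        self.edge_logits = nn.Parameter(torch.zeros(num_types, num_types, 2))
        # Per-type base rate and pairwise excitation/decay parameters
        # (mapped to positive values via softplus when used).
        self.base = nn.Parameter(torch.zeros(num_types))
        self.alpha = nn.Parameter(torch.zeros(num_types, num_types))
        self.beta = nn.Parameter(torch.zeros(num_types, num_types))

    def sample_graph(self) -> torch.Tensor:
        # Differentiable hard sample of a binary relation graph (K x K).
        edges = F.gumbel_softmax(self.edge_logits, tau=self.tau, hard=True)
        return edges[..., 1]  # slot 1 = "edge exists"

    def intensity(self, event_times, event_types, query_time):
        # event_times: (N,) timestamps of past events (<= query_time)
        # event_types: (N,) long tensor of past event type indices
        # returns: (K,) conditional intensity for every event type at query_time
        graph = self.sample_graph()                 # (K, K) gating matrix
        dt = query_time - event_times               # (N,) elapsed times
        alpha = F.softplus(self.alpha)              # (K, K) excitation strengths
        beta = F.softplus(self.beta)                # (K, K) decay rates
        # Excitation from each past event (of type s_j) onto every target type k,
        # switched on or off by the sampled relation graph.
        excite = (graph[event_types] * alpha[event_types]
                  * torch.exp(-beta[event_types] * dt[:, None]))  # (N, K)
        return F.softplus(self.base) + excite.sum(dim=0)


if __name__ == "__main__":
    model = RelationalHawkesSketch(num_types=3)
    times = torch.tensor([0.2, 0.9, 1.5])
    types = torch.tensor([0, 2, 1])
    print(model.intensity(times, types, query_time=torch.tensor(2.0)))
```

The hard Gumbel-softmax sample keeps the discrete edge choices differentiable end to end, which is how NRI-style models typically handle latent graph inference; the paper's variational and contrastive objectives are not reproduced here.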

