
DualDynamics: Synergizing Implicit and Explicit Methods for Robust Irregular Time Series Analysis (2401.04979v6)

Published 10 Jan 2024 in cs.LG and cs.AI

Abstract: Real-world time series analysis faces significant challenges when dealing with irregular and incomplete data. While Neural Differential Equation (NDE) based methods have shown promise, they struggle with limited expressiveness, scalability issues, and stability concerns. Conversely, Neural Flows offer stability but falter with irregular data. We introduce 'DualDynamics', a novel framework that synergistically combines an NDE-based method with a Neural Flow-based method. This approach enhances expressive power while balancing computational demands, addressing critical limitations of existing techniques. We demonstrate DualDynamics' effectiveness across diverse tasks: classification under dataset shift, irregularly-sampled series analysis, interpolation of missing data, and forecasting with partial observations. Our results show consistent outperformance over state-of-the-art methods, indicating DualDynamics' potential to advance irregular time series analysis significantly.
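The abstract contrasts two ways of advancing a hidden state between irregular observation times: an NDE, which integrates a learned vector field, and a Neural Flow, which maps the state directly across the time gap. The toy sketch below illustrates that distinction with fixed random weights standing in for learned parameters; the way the two updates are combined here (a simple average) is a hypothetical placeholder, not the paper's actual DualDynamics architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden dimension and random fixed weights standing in for learned parameters.
d = 4
W_ode = rng.normal(scale=0.1, size=(d, d))   # vector field of the NDE component
W_flow = rng.normal(scale=0.1, size=(d, d))  # residual map of the Neural Flow component

def ode_step(h, dt, n_substeps=10):
    """NDE view: Euler-integrate dh/dt = tanh(W_ode @ h) over an interval of length dt."""
    sub = dt / n_substeps
    for _ in range(n_substeps):
        h = h + sub * np.tanh(W_ode @ h)
    return h

def flow_step(h, dt):
    """Neural-Flow view: a single direct map h(t) -> h(t + dt), no ODE solver needed."""
    return h + dt * np.tanh(W_flow @ h)

def dual_step(h, dt):
    """Combine the two views of the dynamics (here: a simple average, purely illustrative)."""
    return 0.5 * (ode_step(h, dt) + flow_step(h, dt))

# Irregular sampling: the state is advanced by whatever gap separates the observations.
times = np.array([0.0, 0.3, 1.1, 1.5])
h = rng.normal(size=d)
for dt in np.diff(times):
    h = dual_step(h, dt)
print(h.shape)  # (4,)
```

The key point the sketch captures is that both components consume the inter-observation gap `dt` explicitly, which is what makes either formulation applicable to irregularly-sampled series; the flow map avoids the per-step solver cost of the ODE branch.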
