DynaConF: Dynamic Forecasting of Non-Stationary Time Series (2209.08411v3)

Published 17 Sep 2022 in cs.LG and stat.ML

Abstract: Deep learning has shown impressive results in a variety of time series forecasting tasks, where modeling the conditional distribution of the future given the past is the essence. However, when this conditional distribution is non-stationary, it poses challenges for these models to learn consistently and to predict accurately. In this work, we propose a new method to model non-stationary conditional distributions over time by clearly decoupling stationary conditional distribution modeling from non-stationary dynamics modeling. Our method is based on a Bayesian dynamic model that can adapt to conditional distribution changes and a deep conditional distribution model that handles multivariate time series using a factorized output space. Our experimental results on synthetic and real-world datasets show that our model can adapt to non-stationary time series better than state-of-the-art deep learning solutions.
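The core idea in the abstract, separating a fixed (stationary) conditional model from a Bayesian dynamic model that adapts to distribution changes, can be illustrated with a toy sketch. This is not the paper's method: it uses a scalar AR(1) process whose coefficient drifts over time, compares a single least-squares fit (stationary baseline) against a Kalman filter that treats the coefficient itself as a random-walk latent state, and all noise variances (`q`, `r`) are assumed values chosen for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a non-stationary AR(1): y_t = a_t * y_{t-1} + noise,
# where the coefficient a_t drifts slowly (conditional distribution shift).
T = 2000
a = 0.9 * np.sin(np.linspace(0, 3 * np.pi, T))  # drifting true coefficient
y = np.zeros(T)
for t in range(1, T):
    y[t] = a[t] * y[t - 1] + 0.1 * rng.standard_normal()

# Stationary baseline: one least-squares AR(1) coefficient fit to all data.
a_static = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
err_static = y[1:] - a_static * y[:-1]

# Bayesian dynamic model: the coefficient follows a random walk and is
# tracked online with a scalar Kalman filter (predict, then update).
q, r = 1e-4, 0.01       # assumed random-walk and observation noise variances
a_hat, p = 0.0, 1.0     # posterior mean and variance of the coefficient
err_dyn = np.zeros(T - 1)
for t in range(1, T):
    p += q                                      # predict: coefficient may drift
    err_dyn[t - 1] = y[t] - a_hat * y[t - 1]    # one-step prediction error
    s = p * y[t - 1] ** 2 + r                   # innovation variance
    k = p * y[t - 1] / s                        # Kalman gain
    a_hat += k * err_dyn[t - 1]                 # update coefficient posterior
    p *= 1 - k * y[t - 1]

mse_static = np.mean(err_static ** 2)
mse_dyn = np.mean(err_dyn ** 2)
```

Because the dynamic model re-estimates the coefficient at every step, its one-step errors stay close to the irreducible noise level while the stationary fit's errors grow whenever the true coefficient wanders away from the global average; the paper applies the same decoupling principle with a deep conditional model in place of the fixed AR structure.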

