DynaConF: Dynamic Forecasting of Non-Stationary Time Series (2209.08411v3)
Abstract: Deep learning has shown impressive results on a variety of time series forecasting tasks, where the core problem is modeling the conditional distribution of the future given the past. When this conditional distribution is non-stationary, however, these models struggle to learn consistently and to predict accurately. In this work, we propose a new method that models non-stationary conditional distributions over time by cleanly decoupling stationary conditional distribution modeling from non-stationary dynamics modeling. Our method combines a Bayesian dynamic model, which adapts to changes in the conditional distribution, with a deep conditional distribution model, which handles multivariate time series through a factorized output space. Experiments on synthetic and real-world datasets show that our model adapts to non-stationary time series better than state-of-the-art deep learning solutions.
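The decoupling idea can be illustrated with a toy sketch: a fixed observation model (the "stationary" part) whose parameter drifts over time and is tracked online by a Kalman filter (the "non-stationary dynamics" part). This is not the paper's model — DynaConF uses a deep conditional distribution network — but a minimal dynamic linear model that shows how a Bayesian filter adapts to a changing conditional distribution; all names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def kalman_step(m, P, x_prev, x_t, q=0.001, r=0.09):
    """One filtering step for a drifting AR(1) coefficient w_t.

    State model (random walk):  w_t = w_{t-1} + N(0, q)
    Observation model:          x_t = w_t * x_{t-1} + N(0, r)
    """
    m_pred, P_pred = m, P + q              # predict the drifting weight
    S = x_prev * P_pred * x_prev + r       # innovation variance
    K = P_pred * x_prev / S                # Kalman gain
    m_new = m_pred + K * (x_t - x_prev * m_pred)
    P_new = (1.0 - K * x_prev) * P_pred
    return m_new, P_new

# Simulate a non-stationary AR(1): the coefficient switches regime halfway.
rng = np.random.default_rng(0)
T = 500
w_true = np.where(np.arange(T) < T // 2, 0.9, -0.5)
x = np.zeros(T)
for t in range(1, T):
    x[t] = w_true[t] * x[t - 1] + 0.3 * rng.standard_normal()

# Online filtering: the posterior mean tracks the drifting coefficient,
# so the conditional distribution p(x_t | x_{t-1}) adapts over time.
m, P = 0.0, 1.0
est = []
for t in range(1, T):
    m, P = kalman_step(m, P, x[t - 1], x[t])
    est.append(m)
```

Because the filter maintains a full posterior over the drifting parameter rather than a point estimate, it can react to the regime change without retraining, which is the spirit of separating a stable conditional model from an adaptive dynamics model.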