
FTMixer: Frequency and Time Domain Representations Fusion for Time Series Modeling (2405.15256v2)

Published 24 May 2024 in cs.LG

Abstract: Time series data can be represented in both the time and frequency domains, with the time domain emphasizing local dependencies and the frequency domain highlighting global dependencies. To harness the strengths of both domains in capturing local and global dependencies, we propose the Frequency and Time Domain Mixer (FTMixer). To exploit the global characteristics of the frequency domain, we introduce the Frequency Channel Convolution (FCC) module, designed to capture global inter-series dependencies. Inspired by the windowing concept in frequency domain transformations, we present the Windowing Frequency Convolution (WFC) module to capture local dependencies. The WFC module first applies a frequency transformation within each window, followed by convolution across windows. Furthermore, to better capture these local dependencies, we employ a channel-independent scheme to mix the time domain and frequency domain patches. Notably, FTMixer employs the real-valued Discrete Cosine Transform (DCT) instead of the complex-valued Discrete Fourier Transform (DFT), enabling direct use of modern deep learning operators in the frequency domain. Extensive experimental results across seven real-world long-term time series datasets demonstrate the superiority of FTMixer in terms of both forecasting performance and computational efficiency.
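The windowed DCT idea in the abstract can be illustrated with a minimal NumPy sketch. Note this is an assumption-laden illustration, not the paper's implementation: the function names `dct_ii` and `wfc_sketch`, the window size, and the smoothing kernel are all hypothetical choices made here; the actual WFC module uses learned convolutions inside a deep network.

```python
import numpy as np

def dct_ii(x):
    """Type-II DCT along the last axis: a real-valued frequency transform,
    so the output feeds directly into standard (real-number) deep learning ops."""
    n = x.shape[-1]
    k = np.arange(n)
    # basis[m, k] = cos(pi * (m + 0.5) * k / n)
    basis = np.cos(np.pi * (np.arange(n)[:, None] + 0.5) * k[None, :] / n)
    return x @ basis

def wfc_sketch(series, window=8, kernel=np.array([0.25, 0.5, 0.25])):
    """Hypothetical sketch of the WFC idea: transform each window to the
    frequency domain, then convolve each frequency bin across windows."""
    n_win = len(series) // window
    windows = series[: n_win * window].reshape(n_win, window)
    freq = dct_ii(windows)  # (n_win, window), all real coefficients
    # Fixed convolution along the window axis, one pass per frequency bin;
    # in the actual model this would be a learned convolution.
    out = np.stack(
        [np.convolve(freq[:, j], kernel, mode="same") for j in range(window)],
        axis=1,
    )
    return out  # (n_win, window)
```

Because the DCT is real-valued, no complex arithmetic appears anywhere above; with a DFT, the convolution step would need separate handling of real and imaginary parts or complex-valued layers.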
