CLeaRForecast: Contrastive Learning of High-Purity Representations for Time Series Forecasting (2312.05758v1)

Published 10 Dec 2023 in cs.LG and stat.AP

Abstract: Time series forecasting (TSF) is important across many domains of modern society. Previous representation-learning-based TSF algorithms typically adopt a contrastive learning paradigm with segregated trend and periodicity representations. However, these methods disregard the high-impact noise inherent in time series data, which leads to inaccurate representations and seriously degrades forecasting performance. To address this issue, we propose CLeaRForecast, a novel contrastive learning framework that learns high-purity time series representations through proposed sample, feature, and architecture purifying methods. Specifically, to avoid adding further noise when transforming the original samples (series), transformations are applied separately to the trend and periodic parts, yielding positive samples with markedly less noise. Moreover, we introduce a channel-independent training manner to mitigate noise originating from unrelated variables in multivariate series. By employing a streamlined deep-learning backbone and a comprehensive global contrastive loss function, we prevent noise introduced by redundant or uneven learning of periodicity and trend. Experimental results show the superior performance of CLeaRForecast on various downstream TSF tasks.
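
The abstract describes three mechanisms at a high level: decomposing a series into trend and periodic parts before augmentation, channel-independent training for multivariate inputs, and a global contrastive loss. The sketch below illustrates how these pieces could fit together; it is not the authors' implementation. The decomposition method (a moving-average filter), the specific augmentations (trend jitter and a periodic phase shift), the function names (`decompose`, `purified_positive`, `info_nce`), and the linear stand-in encoder are all assumptions made for illustration.

```python
# Hypothetical sketch (assumed names and parameters, not the paper's code).
import torch
import torch.nn.functional as F

def decompose(x: torch.Tensor, kernel: int = 25):
    """Split a batch of series (B, T) into trend and periodic parts with a
    moving-average filter -- one common decomposition choice, assumed here."""
    pad = kernel // 2
    trend = F.avg_pool1d(
        F.pad(x.unsqueeze(1), (pad, pad), mode="replicate"),
        kernel_size=kernel, stride=1,
    ).squeeze(1)
    return trend, x - trend  # (trend, periodic residual)

def purified_positive(x: torch.Tensor) -> torch.Tensor:
    """Build a positive view by augmenting trend and periodic parts separately,
    then recombining, so the transformation adds little cross-component noise."""
    trend, periodic = decompose(x)
    trend_aug = trend + 0.05 * torch.randn_like(trend)         # mild jitter on the trend
    shift = int(torch.randint(1, periodic.size(1), (1,)))
    periodic_aug = torch.roll(periodic, shifts=shift, dims=1)  # phase shift on periodicity
    return trend_aug + periodic_aug

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """Global contrastive loss over whole-series embeddings (B, D): the two
    views of each series are positives; other series in the batch are negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau
    return F.cross_entropy(logits, torch.arange(z1.size(0), device=z1.device))

# Channel-independent handling of a multivariate batch (B, C, T): fold the
# variables into the batch so each channel is encoded on its own.
x_multi = torch.randn(8, 7, 96)       # 8 series, 7 variables, 96 time steps
x_flat = x_multi.reshape(-1, 96)      # (B*C, T): each channel becomes a sample
encoder = torch.nn.Linear(96, 64)     # stand-in for the streamlined backbone
loss = info_nce(encoder(x_flat), encoder(purified_positive(x_flat)))
```

Applying separate, component-appropriate transformations is what keeps the positive pair close to the original signal; a single transformation of the raw series would perturb trend and periodicity jointly and reintroduce noise.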

