
Abstract

The past few years have witnessed rapid progress in multivariate time series forecasting. Accurate forecasting hinges on capturing long-term dependencies across time steps (cross-time dependency) and modeling the complex dependencies among variables (cross-variable dependency). However, recent methods focus mainly on cross-time dependency and seldom consider cross-variable dependency. To fill this gap, we find that convolution, a traditional technique that has recently lost ground in time series forecasting, is well suited to capturing cross-time and cross-variable dependencies separately. Based on this finding, we propose a modern, purely convolutional architecture, Cross-LKTCN, to better exploit both cross-time and cross-variable dependency for time series forecasting. Specifically, in each Cross-LKTCN block, a depth-wise convolution with a large kernel (and hence a large receptive field) captures cross-time dependency, and two successive point-wise group-convolution feed-forward networks then capture cross-variable dependency. Experimental results on real-world benchmarks show that Cross-LKTCN achieves state-of-the-art forecasting performance and significantly improves accuracy over existing convolution-based models and cross-variable methods.
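
To make the described block structure concrete, below is a minimal PyTorch sketch of such a block. The class name CrossLKTCNBlock, the kernel size, the group count, the normalization, and the residual placement are illustrative assumptions rather than the authors' exact implementation; the sketch only mirrors the split stated in the abstract: a depth-wise large-kernel convolution for cross-time mixing, followed by two point-wise group convolutions for cross-variable mixing.

```python
import torch
import torch.nn as nn


class CrossLKTCNBlock(nn.Module):
    """Illustrative sketch of one Cross-LKTCN-style block (assumed details).

    A depth-wise convolution with a large kernel mixes information along the
    time axis only (cross-time dependency); two successive point-wise
    (kernel size 1) group convolutions form a feed-forward network that mixes
    channels belonging to different variables (cross-variable dependency).
    """

    def __init__(self, n_vars: int, d_model: int,
                 kernel_size: int = 51, groups: int = 4, ffn_ratio: int = 2):
        super().__init__()
        channels = n_vars * d_model  # each variable keeps its own d_model channels
        # Depth-wise large-kernel convolution: groups == channels, so every
        # channel is filtered independently along time -> pure cross-time mixing.
        self.dw_conv = nn.Conv1d(channels, channels, kernel_size,
                                 padding=kernel_size // 2, groups=channels)
        self.norm = nn.BatchNorm1d(channels)
        # Point-wise group-convolution FFN: kernel size 1 means no temporal
        # mixing; channels within each group (spanning several variables)
        # are combined -> cross-variable mixing.
        hidden = ffn_ratio * channels
        self.ffn = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=1, groups=groups),
            nn.GELU(),
            nn.Conv1d(hidden, channels, kernel_size=1, groups=groups),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_vars * d_model, sequence_length)
        x = x + self.dw_conv(x)           # cross-time mixing, residual connection
        x = x + self.ffn(self.norm(x))    # cross-variable mixing, residual connection
        return x


if __name__ == "__main__":
    block = CrossLKTCNBlock(n_vars=7, d_model=16)
    out = block(torch.randn(8, 7 * 16, 96))  # e.g. 7 variables, look-back of 96
    print(out.shape)                         # torch.Size([8, 112, 96])
```

Keeping the temporal and variable mixing in separate, purely convolutional stages is what the abstract credits for the model's ability to use both kinds of dependency; the exact kernel size, group layout, and normalization used by the authors may differ from this sketch.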
