Improving Time Series Classification Algorithms Using Octave-Convolutional Layers (2109.13696v1)

Published 28 Sep 2021 in cs.LG

Abstract: Deep learning models utilizing convolution layers have achieved state-of-the-art performance on univariate time series classification tasks. In this work, we propose improving CNN-based time series classifiers by replacing standard convolutions with Octave Convolutions (OctConv). The network architectures considered include Fully Convolutional Networks (FCN), Residual Neural Networks (ResNets), LSTM-Fully Convolutional Networks (LSTM-FCN), and Attention LSTM-Fully Convolutional Networks (ALSTM-FCN). The proposed layers significantly improve each of these models with a minimal increase in network parameters. We experimentally show that substituting convolutions with OctConv significantly improves accuracy on most of the benchmark time series classification datasets. In addition, the updated ALSTM-OctFCN performs statistically the same as the top two time series classifiers, TS-CHIEF and HIVE-COTE (both ensemble models). To further explore the impact of the OctConv layers, we perform ablation tests comparing each augmented model to its base model.
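
To make the substitution concrete, below is a minimal PyTorch sketch of a 1-D Octave Convolution layer of the kind the paper swaps in for ordinary convolutions. The class name OctConv1d, the default alpha = 0.5, the "same" padding, and the nearest-neighbor upsampling are illustrative assumptions, not the authors' code; in a full network, the first and last OctConv layers would also need special handling (alpha_in = 0 or alpha_out = 0) that this sketch omits.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OctConv1d(nn.Module):
    """Minimal 1-D Octave Convolution sketch (after Chen et al., 2019).

    Channels are split into a high-frequency branch at full temporal
    resolution and a low-frequency branch at half resolution; `alpha`
    is the fraction of channels assigned to the low-frequency branch.
    """
    def __init__(self, in_channels, out_channels, kernel_size, alpha=0.5):
        super().__init__()
        in_lo, out_lo = int(alpha * in_channels), int(alpha * out_channels)
        in_hi, out_hi = in_channels - in_lo, out_channels - out_lo
        # Four information paths: high->high, high->low, low->high, low->low.
        self.hh = nn.Conv1d(in_hi, out_hi, kernel_size, padding="same")
        self.hl = nn.Conv1d(in_hi, out_lo, kernel_size, padding="same")
        self.lh = nn.Conv1d(in_lo, out_hi, kernel_size, padding="same")
        self.ll = nn.Conv1d(in_lo, out_lo, kernel_size, padding="same")

    def forward(self, x_hi, x_lo):
        # x_hi: (N, in_hi, T); x_lo: (N, in_lo, T // 2); T assumed even.
        # High-frequency output: full-resolution path plus the
        # low-frequency path upsampled back to full resolution.
        y_hi = self.hh(x_hi) + F.interpolate(
            self.lh(x_lo), scale_factor=2, mode="nearest")
        # Low-frequency output: downsampled high-frequency path plus
        # the half-resolution path.
        y_lo = self.hl(F.avg_pool1d(x_hi, 2)) + self.ll(x_lo)
        return y_hi, y_lo

# Usage on a batch of feature maps from an earlier layer, with the
# channels split 50/50 between the two frequency branches:
x = torch.randn(8, 64, 128)                        # (batch, channels, time)
x_hi, x_lo = x[:, :32], F.avg_pool1d(x[:, 32:], 2)
y_hi, y_lo = OctConv1d(64, 128, kernel_size=3)(x_hi, x_lo)
print(y_hi.shape, y_lo.shape)                      # (8, 64, 128), (8, 64, 64)
```

Note that the four paths together use roughly the same number of weights as a single standard convolution over all channels, which is consistent with the abstract's claim of a minimal parameter increase; the low-frequency branch additionally cuts compute because it operates at half temporal resolution.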

Citations (1)
