Multivariate Time Series Classification with Hierarchical Variational Graph Pooling (2010.05649v2)

Published 12 Oct 2020 in cs.LG and cs.AI

Abstract: With the advancement of sensing technology, multivariate time series classification (MTSC) has recently received considerable attention. Existing deep learning-based MTSC techniques, which mostly rely on convolutional or recurrent neural networks, are primarily concerned with the temporal dependency of a single time series. As a result, they struggle to express pairwise dependencies among multivariate variables directly. Furthermore, current spatial-temporal modeling (e.g., graph classification) methodologies based on Graph Neural Networks (GNNs) are inherently flat and cannot aggregate hub data in a hierarchical manner. To address these limitations, we propose MTPool, a novel graph pooling-based framework, to obtain an expressive global representation of MTS. We first convert MTS slices to graphs by exploiting interactions of variables via a graph structure learning module, and obtain spatial-temporal graph node features via a temporal convolutional module. To obtain a global graph-level representation, we design an "encoder-decoder"-based variational graph pooling module that creates adaptive centroids for cluster assignments. We then combine GNNs and our proposed variational graph pooling layers for joint graph representation learning and graph coarsening, progressively coarsening the graph to a single node. Finally, a differentiable classifier takes this coarsened representation to produce the final predicted class. Experiments on ten benchmark datasets show that MTPool outperforms state-of-the-art methods on the MTSC task.
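The abstract describes a pipeline of graph structure learning, temporal convolution, and alternating GNN/pooling layers that coarsen the variable graph down to one node. The sketch below is a minimal, hedged illustration of that flow, not the authors' implementation: the class name `MTPoolSketch`, the layer sizes, the similarity-based adjacency, and the soft-assignment pooling head are all assumptions made for readability (in particular, the paper's variational "encoder-decoder" pooling with adaptive centroids is only approximated by a simple cluster-assignment layer here).

```python
# Minimal sketch of an MTPool-style pipeline (illustrative only, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MTPoolSketch(nn.Module):
    def __init__(self, seq_len, hidden=64, clusters=(8, 1), num_classes=5):
        super().__init__()
        # Temporal convolutional module: turns each univariate series into a node feature vector.
        self.temporal = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Graph structure learning: embeddings whose pairwise similarity defines edge weights.
        self.node_emb = nn.Linear(seq_len, hidden)
        # One GNN weight matrix and one pooling (cluster-assignment) head per coarsening level.
        self.gnn = nn.ModuleList(nn.Linear(hidden, hidden) for _ in clusters)
        self.assign = nn.ModuleList(nn.Linear(hidden, c) for c in clusters)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, x):                      # x: (batch, num_vars, seq_len)
        b, n, t = x.shape
        # Node features from the temporal convolutional module.
        h = self.temporal(x.reshape(b * n, 1, t)).reshape(b, n, -1)
        # Learned adjacency from variable interactions (row-normalized similarity).
        e = self.node_emb(x)                   # (b, n, hidden)
        adj = F.softmax(torch.bmm(e, e.transpose(1, 2)), dim=-1)
        # Alternate GNN message passing and graph pooling until one node remains.
        for gnn, assign in zip(self.gnn, self.assign):
            h = F.relu(gnn(torch.bmm(adj, h)))                      # graph convolution
            s = F.softmax(assign(h), dim=-1)                        # soft cluster assignment
            h = torch.bmm(s.transpose(1, 2), h)                     # coarsened node features
            adj = torch.bmm(torch.bmm(s.transpose(1, 2), adj), s)   # coarsened adjacency
        return self.classifier(h.squeeze(1))   # (batch, num_classes)


# Example usage: 10 variables, series length 128, 5 classes.
model = MTPoolSketch(seq_len=128, num_classes=5)
logits = model(torch.randn(4, 10, 128))
print(logits.shape)  # torch.Size([4, 5])
```

The key design point mirrored here is that pooling coarsens both the node features and the adjacency matrix at each level, so graph representation learning and graph coarsening are performed jointly until a single graph-level embedding feeds the classifier.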

Citations (48)
