Self-supervised representation learning from electroencephalography signals (1911.05419v1)

Published 13 Nov 2019 in cs.LG, eess.SP, and stat.ML

Abstract: The supervised learning paradigm is limited by the cost - and sometimes the impracticality - of data collection and labeling in multiple domains. Self-supervised learning, a paradigm which exploits the structure of unlabeled data to create learning problems that can be solved with standard supervised approaches, has shown great promise as a pretraining or feature learning approach in fields like computer vision and time series processing. In this work, we present self-supervision strategies that can be used to learn informative representations from multivariate time series. One successful approach relies on predicting whether time windows are sampled from the same temporal context or not. As demonstrated on a clinically relevant task (sleep scoring) and with two electroencephalography datasets, our approach outperforms a purely supervised approach in low data regimes, while capturing important physiological information without any access to labels.
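The pretext task the abstract describes, judging whether two windows come from the same temporal context, can be sketched in a few dozen lines. The snippet below is an illustrative reconstruction, not the paper's exact method: the window length, the positive/negative distance thresholds (`tau_pos`, `tau_neg`), the encoder architecture, and the contrastive head on the absolute difference of embeddings are all assumed hyperparameters and design choices.

```python
# Hedged sketch of a "same temporal context" pretext task on multivariate
# time series. All hyperparameters below are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn


def sample_window_pairs(x, win=300, tau_pos=600, tau_neg=1800,
                        n_pairs=256, rng=None):
    """Sample pairs of windows from a (channels, time) array.

    Label 1 if the two windows start within tau_pos samples of each other
    (same temporal context), 0 if they are at least tau_neg apart.
    """
    if rng is None:
        rng = np.random.default_rng()
    n_time = x.shape[1]
    firsts, seconds, labels = [], [], []
    while len(labels) < n_pairs:
        i = rng.integers(0, n_time - win)
        j = rng.integers(0, n_time - win)
        dist = abs(i - j)
        if dist <= tau_pos:
            y = 1
        elif dist >= tau_neg:
            y = 0
        else:  # ambiguous gap between the two thresholds: resample
            continue
        firsts.append(x[:, i:i + win])
        seconds.append(x[:, j:j + win])
        labels.append(y)
    return (torch.tensor(np.stack(firsts), dtype=torch.float32),
            torch.tensor(np.stack(seconds), dtype=torch.float32),
            torch.tensor(labels, dtype=torch.float32))


class Encoder(nn.Module):
    """Small 1D conv net mapping an EEG window to an embedding."""

    def __init__(self, n_channels=2, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, emb_dim),
        )

    def forward(self, x):
        return self.net(x)


# Binary classifier on the absolute difference of the two embeddings,
# trained with a standard supervised loss, as the abstract suggests.
encoder = Encoder()
head = nn.Linear(64, 1)
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()),
                       lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

eeg = np.random.randn(2, 100_000)  # stand-in for a real EEG recording
for step in range(10):
    xa, xb, y = sample_window_pairs(eeg)
    logits = head(torch.abs(encoder(xa) - encoder(xb))).squeeze(1)
    loss = loss_fn(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After pretraining, the encoder's embeddings would be reused as features for the downstream task (here, sleep scoring), for example by training a linear classifier on top of them.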

Authors (6)
  1. Hubert Banville (9 papers)
  2. Isabela Albuquerque (17 papers)
  3. Graeme Moffat (1 paper)
  4. Denis-Alexander Engemann (3 papers)
  5. Alexandre Gramfort (105 papers)
  6. Aapo Hyvärinen (28 papers)
Citations (55)