
Semi-Unsupervised Learning: Clustering and Classifying using Ultra-Sparse Labels (1901.08560v3)

Published 24 Jan 2019 in stat.ML and cs.LG

Abstract: In semi-supervised learning for classification, it is assumed that every ground truth class of data is present in the small labelled dataset. Many real-world sparsely-labelled datasets are plausibly not of this type. It could easily be the case that some classes of data are found only in the unlabelled dataset -- perhaps the labelling process was biased -- so we do not have any labelled examples to train on for some classes. We call this learning regime $\textit{semi-unsupervised learning}$, an extreme case of semi-supervised learning, where some classes have no labelled exemplars in the training set. First, we outline the pitfalls associated with trying to apply deep generative model (DGM)-based semi-supervised learning algorithms to datasets of this type. We then show how a combination of clustering and semi-supervised learning, using DGMs, can be brought to bear on this problem. We study several different datasets, showing how one can still learn effectively when half of the ground truth classes are entirely unlabelled and the other half are sparsely labelled.

Citations (4)
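The abstract defines the semi-unsupervised regime: some ground-truth classes have no labelled examples at all, while the remaining classes are only sparsely labelled. As a minimal sketch (not the paper's code), the following shows how such a split can be simulated from a fully labelled dataset; the function name, the `-1` sentinel for "unlabelled", and the 10% labelling rate are illustrative assumptions.

```python
import numpy as np

def make_semi_unsupervised_split(labels, hidden_classes, label_frac=0.1, seed=0):
    """Simulate the semi-unsupervised regime described in the paper:
    classes in `hidden_classes` lose all their labels, while the other
    classes keep only a `label_frac` fraction of theirs.
    Returns a masked label array where -1 marks an unlabelled example."""
    rng = np.random.default_rng(seed)
    masked = np.full_like(labels, -1)
    for c in np.unique(labels):
        if c in hidden_classes:
            continue  # this class is entirely unlabelled
        idx = np.where(labels == c)[0]
        n_keep = max(1, int(label_frac * len(idx)))
        keep = rng.choice(idx, size=n_keep, replace=False)
        masked[keep] = c  # sparsely labelled class
    return masked

# Example: 10 classes of 100 examples each; classes 5-9 are entirely
# unlabelled, classes 0-4 keep labels for 10% of their examples.
labels = np.repeat(np.arange(10), 100)
masked = make_semi_unsupervised_split(labels, hidden_classes=set(range(5, 10)))
```

A model trained on `masked` must cluster the hidden classes while classifying the sparsely labelled ones, which is the setting the paper's DGM-based approach addresses.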
