
Evidential uncertainty sampling for active learning (2309.12494v2)

Published 21 Sep 2023 in cs.LG

Abstract: Recent studies in active learning, particularly in uncertainty sampling, have focused on the decomposition of model uncertainty into reducible and irreducible uncertainties. In this paper, the aim is to simplify the computational process while eliminating the dependence on observations. Crucially, the inherent uncertainty in the labels, i.e., the uncertainty of the oracles, is taken into account. Two strategies are proposed, both based on the theory of belief functions: sampling by Klir uncertainty, which tackles the exploration-exploitation dilemma, and sampling by evidential epistemic uncertainty, which extends the concept of reducible uncertainty within the evidential framework. Experimental results in active learning demonstrate that the proposed methods can outperform uncertainty sampling.
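For context, the classical uncertainty-sampling baseline that the paper's evidential strategies build on queries the unlabeled point whose predicted class distribution is most uncertain, typically measured by Shannon entropy. The sketch below is that baseline only, not the paper's Klir or evidential epistemic criteria; the function names and the toy probabilities are illustrative.

```python
import numpy as np

def predictive_entropy(probs):
    # Shannon entropy (in nats) of each row of class probabilities.
    p = np.clip(np.asarray(probs, dtype=float), 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=1)

def select_query(probs, k=1):
    # Uncertainty sampling: pick the k unlabeled points whose predicted
    # class distribution has the highest entropy (most uncertain).
    scores = predictive_entropy(probs)
    return np.argsort(scores)[::-1][:k]

probs = [
    [0.98, 0.01, 0.01],  # confident prediction -> low entropy
    [0.34, 0.33, 0.33],  # near-uniform -> high entropy, queried first
    [0.70, 0.20, 0.10],
]
print(select_query(probs, k=1))  # -> [1]
```

The paper's contribution replaces this entropy score with measures derived from belief functions, which separate the reducible (epistemic) part of the uncertainty from the irreducible part, including the oracle's own label uncertainty.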

Authors (5)
  1. Arthur Hoarau (4 papers)
  2. Vincent Lemaire (46 papers)
  3. Arnaud Martin (64 papers)
  4. Jean-Christophe Dubois (7 papers)
  5. Yolande Le Gall (7 papers)
Citations (1)
