
A Simple yet Effective Baseline for Robust Deep Learning with Noisy Labels (1909.09338v2)

Published 20 Sep 2019 in cs.LG and stat.ML

Abstract: Recently, deep neural networks have shown their capacity to memorize training data, even with noisy labels, which hurts generalization performance. To mitigate this issue, we provide a simple but effective baseline method that is robust to noisy labels, even under severe noise. Our objective involves a variance regularization term that implicitly penalizes the Jacobian norm of the neural network on the whole training set (including the noisy-labeled data), which encourages generalization and prevents overfitting to the corrupted labels. Experiments on both synthetically generated incorrect labels and realistic large-scale noisy datasets demonstrate that our approach achieves state-of-the-art performance with a high tolerance to severe noise.
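The abstract's core idea is that penalizing output variance under small perturbations implicitly penalizes the network's Jacobian norm, which limits how sharply the model can fit corrupted labels. Below is a minimal NumPy sketch of that connection; `stability_penalty` and `model_fn` are illustrative names, not from the paper, and the paper's actual objective differs in detail.

```python
import numpy as np

def stability_penalty(model_fn, x, n_samples=8, sigma=0.01, seed=0):
    """Monte Carlo estimate of the variance of model_fn's output around x
    under Gaussian input noise of scale sigma. For small sigma, this
    variance scales with the squared Jacobian norm of model_fn at x,
    so minimizing it discourages sharp, label-noise-fitting functions.
    Illustrative sketch only; not the paper's exact regularizer."""
    rng = np.random.default_rng(seed)
    outs = np.stack([
        model_fn(x + sigma * rng.standard_normal(x.shape))
        for _ in range(n_samples)
    ])
    # Sum of per-output-dimension variances across the perturbed copies.
    return float(outs.var(axis=0).sum())
```

As a sanity check of the Jacobian intuition: a constant function (zero Jacobian) incurs exactly zero penalty, while any non-constant map incurs a positive one that grows with its sensitivity to the input.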

Authors (3)
  1. Yucen Luo (12 papers)
  2. Jun Zhu (424 papers)
  3. Tomas Pfister (89 papers)
Citations (6)
