
MetaBalance: High-Performance Neural Networks for Class-Imbalanced Data (2106.09643v1)

Published 17 Jun 2021 in cs.AI

Abstract: Class-imbalanced data, in which some classes contain far more samples than others, is ubiquitous in real-world applications. Standard techniques for handling class-imbalance usually work by training on a re-weighted loss or on re-balanced data. Unfortunately, training overparameterized neural networks on such objectives causes rapid memorization of minority class data. To avoid this trap, we harness meta-learning, which uses both an "outer-loop" and an "inner-loop" loss, each of which may be balanced using different strategies. We evaluate our method, MetaBalance, on image classification, credit-card fraud detection, loan default prediction, and facial recognition tasks with severely imbalanced data, and we find that MetaBalance outperforms a wide array of popular re-sampling strategies.
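The two-loop idea from the abstract can be sketched as follows. This is a minimal, first-order illustration with NumPy logistic regression, not the paper's actual algorithm: the inner-loop step uses the plain (imbalanced) loss, while the outer-loop update uses a class-balanced re-weighting, one possible choice of balancing strategy, evaluated at the adapted parameters. All variable names and the toy dataset are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced binary dataset: 190 majority vs. 10 minority samples.
n_maj, n_min = 190, 10
X = np.vstack([rng.normal(-1.0, 1.0, (n_maj, 2)),
               rng.normal(+1.0, 1.0, (n_min, 2))])
y = np.concatenate([np.zeros(n_maj), np.ones(n_min)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_logloss(w, X, y, weights=None):
    """Gradient of the (optionally re-weighted) logistic loss."""
    err = sigmoid(X @ w) - y
    if weights is not None:
        err = err * weights
    return X.T @ err / len(y)

# Outer-loop loss uses inverse-frequency class weights (one balancing
# strategy); the inner-loop loss is left unweighted.
class_w = np.where(y == 1, len(y) / (2 * n_min), len(y) / (2 * n_maj))

w = np.zeros(2)
inner_lr, outer_lr = 0.1, 0.1
for _ in range(200):
    # Inner-loop step: adapt parameters on the standard loss.
    w_adapted = w - inner_lr * grad_logloss(w, X, y)
    # Outer-loop step (first-order approximation): update the original
    # parameters using the balanced loss at the adapted parameters.
    w = w - outer_lr * grad_logloss(w_adapted, X, y, weights=class_w)

minority_recall = np.mean(sigmoid(X[y == 1] @ w) > 0.5)
```

The balanced outer objective keeps the update direction from being dominated by the majority class, which is the role the paper assigns to the outer-loop loss; the full method differentiates through the inner loop on a neural network rather than using this first-order shortcut.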

Authors (6)
  1. Arpit Bansal (17 papers)
  2. Micah Goldblum (96 papers)
  3. Valeriia Cherepanova (16 papers)
  4. Avi Schwarzschild (35 papers)
  5. C. Bayan Bruss (22 papers)
  6. Tom Goldstein (226 papers)
Citations (5)
