
Efficient Discrepancy Testing for Learning with Distribution Shift (2406.09373v1)

Published 13 Jun 2024 in cs.DS and cs.LG

Abstract: A fundamental notion of distance between train and test distributions from the field of domain adaptation is discrepancy distance. While in general hard to compute, here we provide the first set of provably efficient algorithms for testing localized discrepancy distance, where discrepancy is computed with respect to a fixed output classifier. These results imply a broad set of new, efficient learning algorithms in the recently introduced model of Testable Learning with Distribution Shift (TDS learning) due to Klivans et al. (2023). Our approach generalizes and improves on all prior work on TDS learning: (1) we obtain universal learners that succeed simultaneously for large classes of test distributions, (2) achieve near-optimal error rates, and (3) give exponential improvements for constant depth circuits. Our methods further extend to semi-parametric settings and imply the first positive results for low-dimensional convex sets. Additionally, we separate learning and testing phases and obtain algorithms that run in fully polynomial time at test time.
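To make the abstract's central quantity concrete: localized discrepancy compares how often a fixed output classifier disagrees with hypotheses from a class, under the train versus the test distribution. Below is a minimal empirical sketch of that quantity; the function names, the finite one-dimensional threshold class, and the sample values are illustrative assumptions, not taken from the paper (which concerns much richer classes and proves efficiency guarantees this sketch does not capture).

```python
# Empirical localized discrepancy between a train sample and a test sample,
# computed with respect to a fixed output classifier f_hat.
# Hypothetical setup: a small finite class of threshold classifiers on R.

def disagreement(f, g, xs):
    """Fraction of sample points where classifiers f and g disagree."""
    return sum(f(x) != g(x) for x in xs) / len(xs)

def localized_discrepancy(f_hat, hypotheses, train_xs, test_xs):
    """max over h of |Pr_train[f_hat != h] - Pr_test[f_hat != h]|."""
    return max(
        abs(disagreement(f_hat, h, train_xs) - disagreement(f_hat, h, test_xs))
        for h in hypotheses
    )

def threshold(t):
    """Classifier that labels x as 1 iff x >= t."""
    return lambda x: 1 if x >= t else 0

# Fixed output classifier and an illustrative finite hypothesis class.
f_hat = threshold(0.5)
hypotheses = [threshold(t / 10) for t in range(11)]

# Toy train/test samples exhibiting a shift toward larger values.
train_xs = [0.1, 0.2, 0.4, 0.6, 0.8]
test_xs = [0.6, 0.7, 0.8, 0.9, 1.0]

print(localized_discrepancy(f_hat, hypotheses, train_xs, test_xs))  # 0.6
```

A small localized discrepancy certifies that no hypothesis in the class can distinguish the two samples much, relative to the fixed classifier, which is what makes the quantity useful as a testable condition in TDS learning.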

Authors (5)
  1. Gautam Chandrasekaran (7 papers)
  2. Adam R. Klivans (21 papers)
  3. Vasilis Kontonis (27 papers)
  4. Konstantinos Stavropoulos (23 papers)
  5. Arsen Vasilyan (17 papers)
