Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss (2105.01778v1)
Abstract: We characterize the complexity of minimizing $\max_{i\in[N]} f_i(x)$ for convex, Lipschitz functions $f_1,\ldots, f_N$. For non-smooth functions, existing methods require $O(N\epsilon^{-2})$ queries to a first-order oracle to compute an $\epsilon$-suboptimal point and $\tilde{O}(N\epsilon^{-1})$ queries if the $f_i$ are $O(1/\epsilon)$-smooth. We develop methods with improved complexity bounds of $\tilde{O}(N\epsilon^{-2/3} + \epsilon^{-8/3})$ in the non-smooth case and $\tilde{O}(N\epsilon^{-2/3} + \sqrt{N}\epsilon^{-1})$ in the $O(1/\epsilon)$-smooth case. Our methods consist of a recently proposed ball optimization oracle acceleration algorithm (which we refine) and a careful implementation of said oracle for the softmax function. We also prove an oracle complexity lower bound scaling as $\Omega(N\epsilon^{-2/3})$, showing that our dependence on $N$ is optimal up to polylogarithmic factors.
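For context, the "softmax" referenced above is the standard log-sum-exp smoothing of the maximum. A brief illustrative sketch (not taken from the abstract): with temperature $\epsilon' > 0$, the smoothed objective uniformly sandwiches the max loss within an additive $\epsilon' \log N$, so choosing $\epsilon' \approx \epsilon / \log N$ lets one minimize the smooth surrogate in place of $\max_{i\in[N]} f_i(x)$ at the cost of only an $O(\epsilon)$ approximation error:

$$
f_{\epsilon'}(x) \;=\; \epsilon' \log\!\Big(\sum_{i=1}^{N} \exp\big(f_i(x)/\epsilon'\big)\Big),
\qquad
\max_{i\in[N]} f_i(x) \;\le\; f_{\epsilon'}(x) \;\le\; \max_{i\in[N]} f_i(x) + \epsilon' \log N .
$$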