Acceleration with a Ball Optimization Oracle

(2003.08078)
Published Mar 18, 2020 in math.OC and cs.DS

Abstract

Consider an oracle which takes a point $x$ and returns the minimizer of a convex function $f$ in an $\ell_2$ ball of radius $r$ around $x$. It is straightforward to show that roughly $r^{-1}\log\frac{1}{\epsilon}$ calls to the oracle suffice to find an $\epsilon$-approximate minimizer of $f$ in an $\ell_2$ unit ball. Perhaps surprisingly, this is not optimal: we design an accelerated algorithm which attains an $\epsilon$-approximate minimizer with roughly $r^{-2/3} \log \frac{1}{\epsilon}$ oracle queries, and give a matching lower bound. Further, we implement ball optimization oracles for functions with locally stable Hessians using a variant of Newton's method. The resulting algorithm applies to a number of problems of practical and theoretical import, improving upon previous results for logistic and $\ell_\infty$ regression and achieving guarantees comparable to the state-of-the-art for $\ell_p$ regression.
