Provable Approximations for Constrained $\ell_p$ Regression

(1902.10407)
Published Feb 27, 2019 in cs.LG and stat.ML

Abstract

The $\ell_p$ linear regression problem is to minimize $f(x)=||Ax-b||_p$ over $x\in\mathbb{R}^d$, where $A\in\mathbb{R}^{n\times d}$, $b\in \mathbb{R}^n$, and $p>0$. To avoid overfitting and bound $||x||_2$, the constrained $\ell_p$ regression minimizes $f(x)$ over every unit vector $x\in\mathbb{R}^d$. This makes the problem non-convex even for the simplest case $d=p=2$. Instead, ridge regression is used to minimize the Lagrange form $f(x)+\lambda ||x||_2$ over $x\in\mathbb{R}^d$, which yields a convex problem at the price of calibrating the regularization parameter $\lambda>0$. We provide the first provable constant-factor approximation algorithm that solves the constrained $\ell_p$ regression directly, for every constant $p,d\geq 1$. Using core-sets, its running time is $O(n \log n)$, including extensions for streaming and distributed (big) data. In polynomial time, it can handle outliers and $p\in (0,1)$, and can minimize $f(x)$ over every $x$ and every permutation of the rows of $A$. Experimental results are also provided, including open-source code and a comparison to existing software.
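To make the setup concrete, here is a minimal sketch of the constrained problem for $d=2$: since $x$ ranges over unit vectors, we can parametrize the unit circle as $x=(\cos t,\sin t)$ and brute-force the angle. This illustrates the objective $f(x)=||Ax-b||_p$ on the sphere only; it is not the paper's core-set algorithm, and the function names (`constrained_lp_loss`, `grid_search_unit_circle`) are invented for this example.

```python
import numpy as np

def constrained_lp_loss(A, b, x, p=2.0):
    """f(x) = ||Ax - b||_p, the l_p regression objective."""
    return np.linalg.norm(A @ x - b, ord=p)

def grid_search_unit_circle(A, b, p=2.0, num_angles=10_000):
    """Brute-force the constrained l_p regression for d = 2 by
    sweeping x = (cos t, sin t) over the unit circle.

    This is an O(num_angles * n) illustration of the non-convex
    constrained problem, not the O(n log n) core-set algorithm
    described in the paper.
    """
    thetas = np.linspace(0.0, 2.0 * np.pi, num_angles, endpoint=False)
    best_x, best_f = None, np.inf
    for t in thetas:
        x = np.array([np.cos(t), np.sin(t)])  # unit vector by construction
        f = constrained_lp_loss(A, b, x, p)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f
```

For example, if `b = A @ x_star` for some unit vector `x_star`, the sweep recovers a vector close to `x_star` with near-zero loss; the grid resolution (here $2\pi/10{,}000$) bounds how close.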

