Abstract

We consider the problem of linear regression where the $\ell_2^n$ norm loss (i.e., the usual least squares loss) is replaced by the $\ell_p^n$ norm. We show how to solve such problems up to machine precision in $O^*(n^{|1/2 - 1/p|})$ (dense) matrix-vector products and $O^*(1)$ matrix inversions, or alternatively in $O^*(n^{|1/2 - 1/p|})$ calls to a (sparse) linear system solver. This improves the state of the art for any $p \not\in \{1, 2, +\infty\}$. Furthermore we also propose a randomized algorithm solving such problems in {\em input sparsity time}, i.e., $O^*(Z + \mathrm{poly}(d))$ where $Z$ is the size of the input and $d$ is the number of variables. Such a result was only known for $p=2$. Finally we prove that these results lie outside the scope of the Nesterov-Nemirovski theory of interior point methods by showing that any symmetric self-concordant barrier on the $\ell_p^n$ unit ball has self-concordance parameter $\tilde{\Omega}(n)$.
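The abstract states only oracle complexities, so as a point of reference, here is a minimal sketch of the classical iteratively reweighted least squares (IRLS) heuristic for $\ell_p$ regression. This is not the paper's algorithm; it is included only to make the problem $\min_x \|Ax - b\|_p$ and the cost model concrete, since each IRLS iteration performs exactly one call to a linear system solver. The function name and parameters are illustrative.

```python
import numpy as np

def lp_regression_irls(A, b, p=1.5, iters=50, eps=1e-10):
    """Approximately minimize ||Ax - b||_p via iteratively reweighted
    least squares (IRLS). A classical baseline, NOT the paper's method;
    each iteration costs one weighted linear system solve, matching the
    oracle model in the abstract."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # start from the l2 solution
    for _ in range(iters):
        r = A @ x - b
        # Weights |r_i|^{p-2}; eps guards against division by ~0 when p < 2.
        w = np.maximum(np.abs(r), eps) ** (p - 2)
        # Solve the weighted normal equations (A^T W A) x = A^T W b.
        AW = A * w[:, None]
        x = np.linalg.solve(A.T @ AW, AW.T @ b)
    return x

# Example usage on a small random instance:
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = rng.standard_normal(100)
x = lp_regression_irls(A, b, p=1.5)
print(np.sum(np.abs(A @ x - b) ** 1.5) ** (1 / 1.5))  # l_1.5 residual norm
```

Note that plain IRLS carries no iteration bound of the $O^*(n^{|1/2 - 1/p|})$ form proved in the paper, and it can stall or oscillate when $p$ is far from 2; it serves only to fix ideas about the problem and the solver-call cost model.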
