
An homotopy method for $\ell_p$ regression provably beyond self-concordance and in input-sparsity time (1711.01328v2)

Published 3 Nov 2017 in math.OC and cs.DS

Abstract: We consider the problem of linear regression where the $\ell_2^n$ norm loss (i.e., the usual least squares loss) is replaced by the $\ell_p^n$ norm. We show how to solve such problems up to machine precision in $O^*(n^{|1/2 - 1/p|})$ (dense) matrix-vector products and $O^*(1)$ matrix inversions, or alternatively in $O^*(n^{|1/2 - 1/p|})$ calls to a (sparse) linear system solver. This improves the state of the art for any $p \not\in \{1, 2, +\infty\}$. Furthermore we also propose a randomized algorithm solving such problems in input sparsity time, i.e., $O^*(Z + \mathrm{poly}(d))$ where $Z$ is the size of the input and $d$ is the number of variables. Such a result was only known for $p = 2$. Finally we prove that these results lie outside the scope of the Nesterov-Nemirovski theory of interior point methods by showing that any symmetric self-concordant barrier on the $\ell_p^n$ unit ball has self-concordance parameter $\tilde{\Omega}(n)$.
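
For reference, the $\ell_p$ regression problem described in the abstract can be written as the following optimization problem (a standard formulation rather than notation taken from the paper; here $A \in \mathbb{R}^{n \times d}$ denotes the design matrix and $b \in \mathbb{R}^n$ the target vector):

$$\min_{x \in \mathbb{R}^d} \; \|Ax - b\|_p, \qquad \text{where } \|v\|_p = \Big(\sum_{i=1}^{n} |v_i|^p\Big)^{1/p}.$$

Setting $p = 2$ recovers ordinary least squares; the exponent $|1/2 - 1/p|$ in the stated iteration bound equals $0$ at $p = 2$ and approaches $1/2$ as $p \to 1$ or $p \to +\infty$.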

Citations (53)

