
Faster $p$-Norm Regression Using Sparsity (2109.11537v2)

Published 23 Sep 2021 in cs.DS

Abstract: For a matrix $A\in \mathbb{R}^{n\times d}$ with $n\geq d$, we consider the dual problems of $\min \|Ax-b\|_p^p, \, b\in \mathbb{R}^n$ and $\min_{A^\top x=b} \|x\|_p^p,\, b\in \mathbb{R}^d$. We improve the runtimes for solving these problems to high accuracy for every $p>1$ for sufficiently sparse matrices. We show that recent progress on fast sparse linear solvers can be leveraged to obtain faster than matrix-multiplication algorithms for any $p > 1$, i.e., in time $\tilde{O}_p(n^\theta)$ for some $\theta < \omega$, the matrix multiplication constant. We give the first high-accuracy input sparsity $p$-norm regression algorithm for solving $\min \|Ax-b\|_p^p$ with $1 < p \leq 2$, via a new row sampling theorem for the smoothed $p$-norm function. This algorithm runs in time $\tilde{O}(\text{nnz}(A) + d^4)$ for any $1<p\leq 2$, and in time $\tilde{O}(\text{nnz}(A) + d^\theta)$ for $p$ close to $2$, improving on the previous best bound where the exponent of $d$ grows with $\max\{p, p/(p-1)\}$.
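To make the problem concrete: $\min_x \|Ax-b\|_p^p$ is the overdetermined $p$-norm regression problem the abstract refers to. The sketch below solves it with classical iteratively reweighted least squares (IRLS) — a simple baseline, not the paper's sparsity-based high-accuracy algorithm. The function name and parameters are illustrative, and the `eps` floor guarding small residuals when $p < 2$ is an assumption of this sketch.

```python
import numpy as np

def irls_p_norm(A, b, p=1.5, iters=50, eps=1e-8):
    """Approximately solve min_x ||Ax - b||_p^p for 1 < p <= 2 via
    iteratively reweighted least squares (a classical baseline,
    not the paper's sparse-solver-based algorithm)."""
    # Warm start from the ordinary least-squares (p = 2) solution.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        r = A @ x - b
        # IRLS weights |r_i|^{p-2}; eps floors tiny residuals,
        # since p - 2 < 0 would otherwise blow the weights up.
        w = np.power(np.maximum(np.abs(r), eps), p - 2)
        # Weighted normal equations: (A^T W A) x = A^T W b.
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x
```

Each IRLS step majorizes the $p$-norm objective by a weighted least-squares problem, so the objective is non-increasing; the point of the paper is that the dominant per-iteration cost (solving structured linear systems in $A$) can be driven below matrix-multiplication time when $A$ is sparse.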

Citations (6)



