Hierarchical Orthogonal Factorization: Sparse Square Matrices (2010.06807v1)

Published 14 Oct 2020 in math.NA and cs.NA

Abstract: In this work, we develop a new fast algorithm, spaQR -- sparsified QR, for solving large, sparse linear systems. The key to our approach is using low-rank approximations to sparsify the separators in a Nested Dissection based Householder QR factorization. First, a modified version of Nested Dissection is used to identify interiors/separators and reorder the matrix. Then, classical Householder QR is used to factorize the interiors, going from the leaves to the root of the elimination tree. After every level of interior factorization, we sparsify the remaining separators by using low-rank approximations. This operation reduces the size of the separators without introducing any fill-in in the matrix. However, it introduces a small approximation error which can be controlled by the user. The resulting approximate factorization is stored as a sequence of sparse orthogonal and sparse upper-triangular factors. Hence, it can be applied efficiently to solve linear systems. Additionally, we further improve the algorithm by using a block diagonal scaling. Then, we present a systematic analysis of the approximation error and of the algorithm's effectiveness in solving linear systems. Finally, we perform numerical tests on benchmark unsymmetric problems to evaluate the performance of the algorithm. The factorization time scales as $\mathcal{O}(N \log N)$ and the solve time scales as $\mathcal{O}(N)$.
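
The key step the abstract describes, shrinking a separator via a low-rank approximation so that most of its unknowns decouple and can be eliminated without fill-in, can be illustrated in a few lines. Below is a minimal NumPy sketch under stated assumptions: a dense SVD stands in for the rank-revealing orthogonal transformation that the paper applies to sparse blocks inside the nested-dissection hierarchy, and the function name, tolerance, and random test block are illustrative, not the authors' implementation.

import numpy as np

def sparsify_rows(W, tol=1e-10):
    """Orthogonal Q such that Q.T @ W has only ~k significant rows.

    W models the block coupling a separator to the rest of the matrix.
    If W has numerical rank k, the last (m - k) rows of Q.T @ W are
    negligible: those separator unknowns decouple and can be
    eliminated early, reducing the separator size.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=True)
    k = int(np.sum(s > tol * s[0]))  # numerical rank at tolerance tol
    return U, k                      # use Q = U

rng = np.random.default_rng(0)
m, n, true_rank = 40, 200, 6
# Hypothetical separator-to-rest coupling block with low numerical rank.
W = rng.standard_normal((m, true_rank)) @ rng.standard_normal((true_rank, n))
Q, k = sparsify_rows(W)
R = Q.T @ W
print("detected rank:", k)                                  # expect 6
print("norm of decoupled rows:", np.linalg.norm(R[k:, :]))  # ~1e-13

Applying Q.T mixes only the separator's rows with one another and leaves all other rows untouched; the paper organizes this so that the m - k decoupled unknowns can be eliminated without introducing fill-in, which is what drives the $\mathcal{O}(N \log N)$ factorization cost.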

Citations (5)
