Abstract

Linear systems in applications are typically well-posed, and yet the coefficient matrices may be nearly singular in that the condition number $\kappa(\boldsymbol{A})$ may be close to $1/\varepsilon_{w}$, where $\varepsilon_{w}$ denotes the unit roundoff of the working precision. It is well known that iterative refinement (IR) can make the forward error independent of $\kappa(\boldsymbol{A})$ if $\kappa(\boldsymbol{A})$ is sufficiently smaller than $1/\varepsilon_{w}$ and the residual is computed in higher precision. We propose a new iterative method, called Forward-and-Backward Stabilized Minimal Residual or FBSMR, by conceptually hybridizing right-preconditioned GMRES (RP-GMRES) with quasi-minimization. We develop FBSMR based on a new theoretical framework of essential-forward-and-backward stability (EFBS), which extends the backward error analysis to consider the intrinsic condition number of a well-posed problem. We stabilize the forward and backward errors in RP-GMRES to achieve EFBS by evaluating a small portion of the algorithm in higher precision while evaluating the preconditioner in lower precision. FBSMR can achieve optimal accuracy in terms of both forward and backward errors for well-posed problems with unpolluted matrices, independently of $\kappa(\boldsymbol{A})$. With low-precision preconditioning, FBSMR can reduce the computational, memory, and energy requirements over direct methods with or without IR. FBSMR can also leverage parallelization-friendly classical Gram-Schmidt in Arnoldi iterations without compromising EFBS. We demonstrate the effectiveness of FBSMR using both random and realistic linear systems.
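As a point of reference for the IR baseline mentioned in the abstract, the sketch below illustrates textbook mixed-precision iterative refinement: the matrix is factorized and the correction equations are solved in a lower precision (emulated here with float32), while the residual is accumulated in a higher precision (float64). This is not FBSMR itself; the function name `mixed_precision_ir` and the tolerance choice are hypothetical, and the snippet is only a minimal sketch assuming a dense system stored in NumPy arrays.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def mixed_precision_ir(A, b, n_iter=20):
    """Illustrative mixed-precision iterative refinement (not FBSMR).

    Low precision (float32) is used for the LU factorization and the
    correction solves; the residual is computed in higher precision
    (float64), which is what lets IR deliver a forward error
    independent of kappa(A) when kappa(A) is sufficiently smaller
    than 1/eps of the working precision.
    """
    # Factor once in low precision and reuse the factors below.
    lu, piv = lu_factor(A.astype(np.float32))

    # Initial solve in low precision, promoted to the working precision.
    x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)

    for _ in range(n_iter):
        # Residual computed in higher precision.
        r = b - A @ x
        # Correction solved with the reused low-precision factorization.
        d = lu_solve((lu, piv), r.astype(np.float32)).astype(np.float64)
        x = x + d
        # Stop once the correction is negligible relative to the iterate.
        if np.linalg.norm(d) <= np.finfo(np.float64).eps * np.linalg.norm(x):
            break
    return x
```

Factoring once and reusing the low-precision LU for every correction solve is what makes such schemes cheaper than repeated high-precision solves; the abstract's FBSMR pursues an analogous trade-off by applying a low-precision preconditioner inside right-preconditioned GMRES while evaluating only a small portion of the algorithm in higher precision.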
