
Iterated Gauss-Seidel GMRES

(2205.07805)
Published May 16, 2022 in math.NA and cs.NA

Abstract

The GMRES algorithm of Saad and Schultz (1986) is an iterative method for approximately solving linear systems $A{\bf x}={\bf b}$, with initial guess ${\bf x}_0$ and residual ${\bf r}_0 = {\bf b} - A{\bf x}_0$. The algorithm employs the Arnoldi process to generate the Krylov basis vectors (the columns of $V_k$). It is well known that this process can be viewed as a $QR$ factorization of the matrix $B_k = [\,{\bf r}_0, AV_k\,]$ at each iteration. Despite an $O(\epsilon)\kappa(B_k)$ loss of orthogonality, for unit roundoff $\epsilon$ and condition number $\kappa$, the modified Gram-Schmidt formulation was shown to be backward stable in the seminal paper by Paige et al. (2006). We present an iterated Gauss-Seidel formulation of the GMRES algorithm (IGS-GMRES) based on the ideas of Ruhe (1983) and Świrydowicz et al. (2020). IGS-GMRES maintains orthogonality to the level $O(\epsilon)\kappa(B_k)$ or $O(\epsilon)$, depending on the choice of one or two iterations; for two Gauss-Seidel iterations, the computed Krylov basis vectors remain orthogonal to working precision and the smallest singular value of $V_k$ remains close to one. The resulting GMRES method is thus backward stable. We show that IGS-GMRES can be implemented with only a single synchronization point per iteration, making it relevant to large-scale parallel computing environments. We also demonstrate that, unlike MGS-GMRES, in IGS-GMRES the relative Arnoldi residual corresponding to the computed approximate solution no longer stagnates above machine precision even for highly non-normal systems.
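As a rough illustration of the orthogonalization step described in the abstract, the NumPy sketch below applies Gauss-Seidel sweeps to the normal equations $(V_k^T V_k)\,{\bf y} = V_k^T {\bf w}$ when projecting a new Krylov vector ${\bf w}$ against the current basis $V_k$. This is only a minimal sketch of the iterated Gauss-Seidel idea, not the paper's algorithm: the function name and the explicit, dense formation of the Gram matrix are illustrative assumptions, and the paper's low-synchronization implementation organizes the inner products so that a single global reduction suffices per iteration.

```python
import numpy as np

def igs_project(V, w, sweeps=2):
    """Orthogonalize w against the (nearly orthonormal) columns of V by
    applying Gauss-Seidel sweeps to the normal equations (V^T V) y = V^T w.

    Writing V^T V = I + L + L^T with L strictly lower triangular, the
    splitting matrix I + L is used, assuming the columns of V have unit
    norm to working precision. One sweep mimics an MGS-style projection;
    two sweeps reorthogonalize (illustrative sketch, not the paper's code).
    """
    k = V.shape[1]
    T = V.T @ V                      # k-by-k Gram matrix
    rhs = V.T @ w                    # right-hand side of the normal equations
    M = np.eye(k) + np.tril(T, -1)   # Gauss-Seidel splitting matrix I + L
    y = np.zeros(k)
    for _ in range(sweeps):
        # y <- y + (I + L)^{-1} (rhs - (V^T V) y)
        y += np.linalg.solve(M, rhs - T @ y)
    return w - V @ y, y              # orthogonalized vector and coefficients

# Toy usage: orthogonalize a random vector against an orthonormal basis.
rng = np.random.default_rng(0)
V, _ = np.linalg.qr(rng.standard_normal((100, 10)))
w = rng.standard_normal(100)
w_orth, _ = igs_project(V, w, sweeps=2)
print(np.linalg.norm(V.T @ w_orth))  # near machine precision
```

If $V_k$ were exactly orthonormal, the Gram matrix would reduce to the identity and a single sweep would recover the exact projection; the second sweep is what keeps the computed basis orthogonal to working precision when rounding errors accumulate, which is the property the abstract attributes to the two-iteration variant.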
