Emergent Mind

Flexible Enlarged Conjugate Gradient Methods

(2305.19013)
Published May 30, 2023 in math.NA and cs.NA

Abstract

Enlarged Krylov subspace methods and their s-step versions were introduced [7] with the aim of reducing communication when solving systems of linear equations Ax = b. These enlarged CG methods enlarge the Krylov subspace by up to t vectors per iteration, based on a domain decomposition of the graph of A. In the s-step versions, s iterations of the enlarged Conjugate Gradient method are merged into one iteration. The enlarged CG methods and their s-step versions converge in fewer iterations than classical CG, but at the expense of requiring more memory than CG. Thus, in this paper we explore different options for reducing the memory requirements of these enlarged CG methods without significantly affecting their convergence.
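To make the idea concrete, here is a minimal sketch of the enlargement step described in the abstract, under the common formulation in which the enlarged CG iterates are obtained by running a block CG recurrence on the initial residual split into t columns according to the domain decomposition. This is an illustrative NumPy implementation, not the paper's method: the function name `enlarged_cg`, the dense-matrix setup, and the lack of any rank-deficiency (breakdown) handling are all simplifications for exposition.

```python
import numpy as np

def enlarged_cg(A, b, part, x0=None, tol=1e-8, maxiter=200):
    """Illustrative enlarged CG sketch (not the paper's implementation).

    The initial residual r0 is split into t columns T(r0) according to
    `part` (an array assigning each unknown to one of t subdomains), and
    a block CG recurrence is applied to A X = T(r0). Since the columns
    of T(r0) sum to r0, the column sum of X corrects x0 toward A^{-1} b.
    Assumes A is symmetric positive definite; no deflation of nearly
    dependent search directions is performed, unlike a robust solver.
    """
    n = A.shape[0]
    t = int(part.max()) + 1
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x
    R = np.zeros((n, t))
    R[np.arange(n), part] = r          # T(r0): residual split by subdomain
    X = np.zeros((n, t))               # block iterate, starts at zero
    P = R.copy()                       # block search directions
    for k in range(maxiter):
        Q = A @ P
        G = P.T @ Q                    # t x t Gram matrix P^T A P
        alpha = np.linalg.solve(G, P.T @ R)
        X += P @ alpha
        R -= Q @ alpha
        if np.linalg.norm(R.sum(axis=1)) <= tol * np.linalg.norm(b):
            break                      # true residual is the column sum
        beta = -np.linalg.solve(G, Q.T @ R)
        P = R + P @ beta               # keep block directions A-conjugate
    return x + X.sum(axis=1), k + 1
```

Because each iteration adds up to t new directions to the search space, the iteration count drops relative to classical CG, but the n-by-t blocks X, R, and P are exactly the extra memory cost that the paper's flexible variants aim to reduce.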
