Emergent Mind

Rank-$1$ matrix differential equations for structured eigenvalue optimization

(2206.09338)
Published Jun 19, 2022 in math.NA , cs.NA , math.DS , and math.OC

Abstract

A new approach to solving eigenvalue optimization problems for large structured matrices is proposed and studied. The class of optimization problems considered is related to computing structured pseudospectra and their extremal points, and to structured matrix nearness problems such as computing the structured distance to instability or to singularity. The structure can be any linear structure and includes, for example, large matrices with a given sparsity pattern, matrices with given range and co-range, and Hamiltonian matrices. Remarkably, the eigenvalue optimization can be performed on the manifold of complex (or real) rank-$1$ matrices, which yields a significant reduction in storage and, in some cases, in computational cost. The method relies on a constrained gradient system and the projection of the gradient onto the tangent space of the manifold of complex rank-$1$ matrices. It is shown that near a local minimizer this projection is very close to the identity map, so the computationally favorable rank-$1$ projected system behaves locally like the computationally expensive full gradient system.
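The abstract does not spell out the tangent-space projection it uses; a minimal sketch, assuming the standard orthogonal projector onto the tangent space of the rank-$1$ manifold at $E = \sigma\,u v^*$ (as used in the dynamical low-rank approximation literature), could look like this in NumPy. The function name `tangent_projection` and the choice of unit vectors $u$, $v$ are illustrative assumptions, not the paper's code:

```python
import numpy as np

def tangent_projection(Z, u, v):
    """Project Z onto the tangent space of the rank-1 manifold at u v^*.

    Assumes u (length m) and v (length n) are unit vectors. The standard
    projector from dynamical low-rank approximation is
        P(Z) = Z v v^* + u u^* Z - (u^* Z v) u v^*,
    which is idempotent and leaves u v^* itself fixed.
    """
    Zv = Z @ v                  # Z v, a length-m column
    uZ = u.conj() @ Z           # u^* Z, a length-n row
    uZv = u.conj() @ Zv         # scalar u^* Z v
    return (np.outer(Zv, v.conj())
            + np.outer(u, uZ)
            - uZv * np.outer(u, v.conj()))
```

Projecting the (structured) gradient through such a map is what keeps the flow on the rank-$1$ manifold while storing only the two vectors $u$ and $v$ instead of a full dense matrix.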
