
Optimization methods for solving matrix equations

(2404.06030)
Published Apr 9, 2024 in math.NA, cs.NA, and math.OC

Abstract

In this paper, we focus on using optimization methods to solve matrix equations by transforming the problem of solving the Sylvester matrix equation or the continuous algebraic Riccati equation into an optimization problem. Initially, we use a constrained convex optimization method (CCOM) to solve the Sylvester matrix equation with the $\ell_{2,1}$-norm, and we provide a convergence analysis and numerical examples for CCOM; however, the results show that the algorithm is not efficient. To address this issue, we employ classical quasi-Newton methods such as the DFP and BFGS algorithms to solve the Sylvester matrix equation and present convergence and numerical results for these algorithms. Additionally, we compare them with the CG algorithm and the AR algorithm, and our results demonstrate that the presented algorithms are effective. Furthermore, we propose a unified framework based on the alternating direction method of multipliers (ADMM) for directly solving the continuous algebraic Riccati equation (CARE), and we provide convergence and numerical results for ADMM. Our experiments indicate that ADMM is an effective optimization algorithm for solving CARE. Finally, to further improve the effectiveness of optimization methods for solving the Riccati equation, we propose the Newton-ADMM algorithm framework, in which the outer iteration is the classical Newton method and the inner iteration uses ADMM to solve Lyapunov matrix equations inexactly. We also provide convergence and numerical results for this algorithm, which show that it is more efficient than ADMM for solving CARE.
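
To make the first transformation concrete: the Sylvester equation $AX + XB = C$ can be recast as the unconstrained problem $\min_X \tfrac{1}{2}\|AX + XB - C\|_F^2$ and handed to a quasi-Newton solver. The sketch below is not the paper's implementation; it uses SciPy's generic BFGS routine, and the test matrices and tolerances are arbitrary choices for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative sketch: solve the Sylvester equation A X + X B = C by
# minimizing f(X) = 0.5 * ||A X + X B - C||_F^2 with BFGS.
rng = np.random.default_rng(0)
n, m = 20, 15
A = rng.standard_normal((n, n))
B = rng.standard_normal((m, m))
X_true = rng.standard_normal((n, m))
C = A @ X_true + X_true @ B          # build C so a solution exists

def fun_and_grad(x):
    X = x.reshape(n, m)
    R = A @ X + X @ B - C            # residual of the Sylvester equation
    f = 0.5 * np.sum(R * R)          # 0.5 * ||R||_F^2
    g = A.T @ R + R @ B.T            # gradient with respect to X
    return f, g.ravel()

res = minimize(fun_and_grad, np.zeros(n * m), jac=True, method="BFGS",
               options={"maxiter": 5000, "gtol": 1e-8})
X_hat = res.x.reshape(n, m)
print("residual norm:", np.linalg.norm(A @ X_hat + X_hat @ B - C))
```

Similarly, the Newton-ADMM idea can be sketched for the CARE $A^\top X + XA - XGX + Q = 0$ with $G = BR^{-1}B^\top$: each outer Newton step requires solving a Lyapunov equation, which the paper handles inexactly with ADMM. In the hedged sketch below, SciPy's direct Lyapunov solver stands in for that inner ADMM solver, and the stable test matrix $A$ is an assumption made so that the initial guess $X_0 = 0$ is stabilizing.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, solve_continuous_are

# Sketch of the Newton outer loop for A^T X + X A - X G X + Q = 0.
rng = np.random.default_rng(1)
n, m = 6, 2
A = rng.standard_normal((n, n))
A = A - (np.linalg.norm(A, 2) + 1.0) * np.eye(n)   # shift so A is stable
B = rng.standard_normal((n, m))
Q, R = np.eye(n), np.eye(m)
G = B @ np.linalg.solve(R, B.T)

X = np.zeros((n, n))                 # stabilizing initial guess since A is stable
for k in range(50):
    A_cl = A - G @ X                 # closed-loop matrix of the current iterate
    # Newton step: solve A_cl^T X_new + X_new A_cl = -(Q + X G X);
    # the paper solves this Lyapunov equation inexactly with ADMM.
    X_new = solve_continuous_lyapunov(A_cl.T, -(Q + X @ G @ X))
    if np.linalg.norm(X_new - X) < 1e-10:
        X = X_new
        break
    X = X_new

X_ref = solve_continuous_are(A, B, Q, R)
print("CARE residual:", np.linalg.norm(A.T @ X + X @ A - X @ G @ X + Q))
print("difference from SciPy ARE solver:", np.linalg.norm(X - X_ref))
```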
