Convergence of gradient-based block coordinate descent algorithms for non-orthogonal joint approximate diagonalization of matrices (2009.13377v2)
Abstract: In this paper, we propose a gradient-based block coordinate descent (BCD-G) framework to solve the joint approximate diagonalization of matrices defined on the product of the complex Stiefel manifold and the special linear group. Instead of updating the blocks in a cyclic fashion, we choose the block to optimize based on the Riemannian gradient. To update the first block variable, in the complex Stiefel manifold, we use the well-known line search descent method. To update the second block variable, in the special linear group, we construct three classes of elementary transformations, GLU, GQU and GU, based on four different kinds of elementary transformations, and thereby obtain three BCD-G algorithms: BCD-GLU, BCD-GQU and BCD-GU. We establish the global and weak convergence of these three algorithms using the Łojasiewicz gradient inequality, under the assumption that the iterates are bounded. We also propose a gradient-based Jacobi-type framework to solve the joint approximate diagonalization of matrices defined on the special linear group. As in the BCD-G case, using the GLU and GQU classes of elementary transformations, we focus on the Jacobi-GLU and Jacobi-GQU algorithms and establish their global and weak convergence. All the algorithms and convergence results described in this paper also apply to the real case.
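To make the optimization setting concrete, the following is a minimal sketch of gradient-based joint approximate diagonalization in a simplified real orthogonal setting: a single orthogonal block updated by Riemannian gradient descent with a QR retraction and a fixed step size. This is not the paper's BCD-G framework on the product of the complex Stiefel manifold and the special linear group; all function names and the cost function below are illustrative assumptions.

```python
import numpy as np

def off_norm_sq(M):
    """Squared Frobenius norm of the off-diagonal part of M."""
    return np.linalg.norm(M - np.diag(np.diag(M))) ** 2

def jad_cost(Q, mats):
    """Joint-diagonalization cost: total off-diagonal energy of Q^T A_k Q
    (an assumed stand-in for the paper's objective)."""
    return sum(off_norm_sq(Q.T @ A @ Q) for A in mats)

def riemannian_grad(Q, mats):
    """Riemannian gradient on the orthogonal group: project the Euclidean
    gradient of the cost onto the tangent space at Q."""
    G = np.zeros_like(Q)
    for A in mats:
        M = Q.T @ A @ Q
        off = M - np.diag(np.diag(M))
        # Euclidean gradient of ||off(Q^T A Q)||_F^2 with respect to Q
        G += 2.0 * (A @ Q @ off.T + A.T @ Q @ off)
    S = Q.T @ G
    return G - Q @ (S + S.T) / 2.0  # subtract the symmetric part

def jad_gradient_descent(mats, n_iter=500, step=1e-2):
    """Fixed-step Riemannian gradient descent with a QR retraction,
    started from the identity (hypothetical simplified routine)."""
    n = mats[0].shape[0]
    Q = np.eye(n)
    for _ in range(n_iter):
        grad = riemannian_grad(Q, mats)
        # QR retraction keeps the iterate on the orthogonal group
        Qf, R = np.linalg.qr(Q - step * grad)
        Q = Qf @ np.diag(np.sign(np.diag(R)))  # fix the sign ambiguity of QR
    return Q

if __name__ == "__main__":
    # Synthetic test: matrices that share an exact orthogonal diagonalizer
    rng = np.random.default_rng(0)
    n = 5
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))
    mats = [V @ np.diag(rng.standard_normal(n)) @ V.T for _ in range(4)]
    Q = jad_gradient_descent(mats)
    print("initial cost:", jad_cost(np.eye(n), mats))
    print("final cost:  ", jad_cost(Q, mats))
```

In this toy run the cost should decrease toward zero, since the test matrices are exactly jointly diagonalizable; the paper's algorithms instead handle the approximate case with two blocks, non-orthogonal (special linear) transformations, and line search rather than a fixed step.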