Emergent Mind

Abstract

In this paper, we propose a general algorithmic framework, based on the matrix polar decomposition, for solving a class of optimization problems on products of complex Stiefel manifolds. We establish the weak convergence, global convergence, and linear convergence rate of this general algorithmic approach using the Łojasiewicz gradient inequality and the Morse-Bott property. This general algorithm and its convergence results are applied to simultaneous approximate tensor diagonalization and simultaneous approximate tensor compression, which include as special cases the low rank orthogonal approximation, best rank-1 approximation, and low multilinear rank approximation of higher order complex tensors. We also present a symmetric variant of this general algorithm for a symmetric variant of this class of optimization models, which essentially optimizes over a single Stiefel manifold, and we establish its weak convergence, global convergence, and linear convergence rate in a similar way. This symmetric variant and its convergence results are applied to simultaneous approximate symmetric tensor diagonalization, which includes as special cases the low rank symmetric orthogonal approximation and best symmetric rank-1 approximation of higher order complex symmetric tensors. It turns out that well-known algorithms such as LROAT, S-LROAT, HOPM, and S-HOPM are all special cases of this general algorithmic framework and its symmetric variant, and our convergence results subsume those found in the literature for these special cases. All the algorithms and convergence results in this paper also apply to the real case.
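The abstract's core computational primitive is the matrix polar decomposition, whose unitary factor is the closest matrix with orthonormal columns (i.e., the nearest point on the complex Stiefel manifold). The sketch below is not the paper's algorithm; it is a minimal illustration, using the standard SVD-based construction, of how a polar factor projects an arbitrary complex matrix onto a Stiefel manifold — the kind of update step such frameworks typically iterate.

```python
import numpy as np

def polar_factor(A):
    """Unitary polar factor of A.

    If A = U @ diag(S) @ Vh is a thin SVD, the polar factor U @ Vh is the
    closest matrix with orthonormal columns to A in the Frobenius norm,
    i.e., the projection of A onto the complex Stiefel manifold St(m, n).
    """
    U, _, Vh = np.linalg.svd(A, full_matrices=False)
    return U @ Vh

# Example: project a random 5x3 complex matrix onto St(5, 3).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))
Q = polar_factor(A)

# Q has orthonormal columns: Q^H Q = I_3 (a point on St(5, 3)).
print(np.allclose(Q.conj().T @ Q, np.eye(3)))  # True
```

In a polar-decomposition-based iteration, an update of this form would be applied to each factor matrix in turn (one per Stiefel manifold in the product), with the matrix `A` built from the current tensor contraction; those details are specific to the paper and not reproduced here.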

