
Abstract

Tensor robust principal component analysis (TRPCA) is a fundamental model in machine learning and computer vision. Recently, the tensor train (TT) decomposition has been shown to be effective at capturing the global low-rank correlations needed for tensor recovery tasks. However, because real-world applications involve large-scale tensor data, previous TRPCA models often suffer from high computational complexity. In this letter, we propose an efficient TRPCA model under a hybrid Tucker and TT format. Specifically, we show theoretically that the TT nuclear norm (TTNN) of the original large tensor can be equivalently converted to that of a much smaller tensor via a Tucker compression format, thereby significantly reducing the computational cost of the singular value decomposition (SVD). Numerical experiments on both synthetic and real-world tensor data verify the superiority of the proposed model.
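The central idea can be illustrated with a short NumPy sketch. This is not the authors' implementation: the helper names `tt_nuclear_norm` and `tucker_compress`, the uniform TTNN weights, and the plain HOSVD-style compression are assumptions made for illustration only. The sketch defines the TT nuclear norm as a weighted sum of nuclear norms of the sequential unfoldings, and shows how an orthonormal Tucker compression produces a much smaller core on whose unfoldings the SVDs can be run.

```python
import numpy as np

def tt_nuclear_norm(X, weights=None):
    # Illustrative TT nuclear norm: weighted sum of the nuclear norms of the
    # canonical TT unfoldings X_[k] (modes 1..k versus modes k+1..d).
    d = X.ndim
    dims = X.shape
    if weights is None:
        weights = [1.0 / (d - 1)] * (d - 1)  # assumed uniform weights
    ttnn = 0.0
    for k in range(1, d):
        Xk = X.reshape(int(np.prod(dims[:k])), int(np.prod(dims[k:])))
        ttnn += weights[k - 1] * np.linalg.norm(Xk, ord='nuc')
    return ttnn

def tucker_compress(X, ranks):
    # HOSVD-style Tucker compression: each factor holds the leading left
    # singular vectors of the corresponding mode unfolding; the core is the
    # projection of X onto these column-orthonormal factors.
    d = X.ndim
    factors = []
    for mode in range(d):
        Xm = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(Xm, full_matrices=False)
        factors.append(U[:, :ranks[mode]])
    core = X.copy()
    for mode in range(d):
        core = np.moveaxis(
            np.tensordot(factors[mode].T, np.moveaxis(core, mode, 0), axes=1),
            0, mode)
    return core, factors

# Hypothetical example: the largest TT unfolding shrinks from 400 x 400
# (original tensor) to 25 x 25 (Tucker core), so the SVDs are far cheaper.
X = np.random.randn(20, 20, 20, 20)
core, factors = tucker_compress(X, ranks=[5, 5, 5, 5])
print(tt_nuclear_norm(core))
```

When the Tucker factors have orthonormal columns and the compression is exact (the chosen ranks are no smaller than the tensor's multilinear ranks), each TT unfolding of the original tensor shares its singular values with the corresponding unfolding of the core, so the two TTNN values coincide; this is the kind of equivalence the letter exploits to reduce the SVD cost.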
