Emergent Mind

Robust PCA via Regularized REAPER with a Matrix-Free Proximal Algorithm

(2005.05449)
Published May 11, 2020 in math.NA and cs.NA

Abstract

Principal component analysis (PCA) is known to be sensitive to outliers, so various robust PCA variants have been proposed in the literature. A recent model, called REAPER, aims to find the principal components by solving a convex optimization problem. Usually the number of principal components must be determined in advance, and the minimization is performed over symmetric positive semi-definite matrices having the size of the data, although the number of principal components is substantially smaller. This prohibits its use when the dimension of the data is large, which is often the case in image processing. In this paper, we propose a regularized version of REAPER which enforces sparsity in the number of principal components by penalizing the nuclear norm of the corresponding orthogonal projector. This has the advantage that only an upper bound on the number of principal components is required. Our second contribution is a matrix-free algorithm to find a minimizer of the regularized REAPER which is also suited to high-dimensional data. The algorithm couples a primal-dual minimization approach with a thick-restarted Lanczos process. As a side result, we discuss the topic of bias in robust PCA. Numerical examples demonstrate the performance of our algorithm.
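The paper's matrix-free algorithm is not reproduced here, but the nuclear-norm penalty the abstract describes is typically handled in proximal algorithms by singular value soft-thresholding. The following is a minimal NumPy sketch of that standard proximal operator, for illustration only; the function name `prox_nuclear` and the example matrix are our own, not taken from the paper:

```python
import numpy as np

def prox_nuclear(M, tau):
    """Proximal operator of tau * ||.||_* (nuclear norm).

    Computed by soft-thresholding the singular values of M.
    Note: this forms a full SVD, which is exactly what a
    matrix-free method such as the paper's avoids for large M.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_thr = np.maximum(s - tau, 0.0)  # shrink each singular value by tau
    return U @ np.diag(s_thr) @ Vt

# Illustrative use on a random matrix (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
P = prox_nuclear(A, 0.5)
```

Soft-thresholding drives small singular values to exactly zero, which is how a nuclear-norm penalty promotes low rank; in the regularized REAPER model this effect is what lets an upper bound on the number of principal components suffice.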
