Block Coordinate Descent for Sparse NMF

(1301.3527)
Published Jan 15, 2013 in cs.LG and cs.NA

Abstract

Nonnegative matrix factorization (NMF) has become a ubiquitous tool for data analysis. An important variant is the sparse NMF problem, which arises when we explicitly require the learnt features to be sparse. A natural measure of sparsity is the L0 norm; however, its optimization is NP-hard. Mixed norms, such as the L1/L2 measure, have been shown to model sparsity robustly, based on intuitive attributes that such measures need to satisfy. This is in contrast to computationally cheaper alternatives such as the plain L1 norm. However, present algorithms designed for optimizing the mixed L1/L2 norm are slow, and other formulations for sparse NMF have been proposed, such as those based on the L1 and L0 norms. Our proposed algorithm solves the mixed-norm sparsity constraints without sacrificing computation time. We present experimental evidence on real-world datasets showing that our new algorithm runs an order of magnitude faster than the current state-of-the-art solvers optimizing the mixed norm, and is suitable for large-scale datasets.
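The mixed L1/L2 sparsity measure referred to above is commonly defined (following Hoyer's sparseness measure) as the ratio of a vector's L1 and L2 norms, rescaled to lie in [0, 1]. As a minimal illustrative sketch (not the paper's solver), it can be computed as:

```python
import numpy as np

def l1_l2_sparseness(x):
    """Mixed L1/L2 sparseness of a vector (Hoyer-style measure):
    0 for a uniform vector, 1 for a vector with a single nonzero entry."""
    x = np.abs(np.asarray(x, dtype=float))
    n = x.size
    l1 = x.sum()
    l2 = np.sqrt((x ** 2).sum())
    # Rescale the L1/L2 ratio so the measure lies in [0, 1].
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)

print(l1_l2_sparseness([1, 0, 0, 0]))  # -> 1.0 (maximally sparse)
print(l1_l2_sparseness([1, 1, 1, 1]))  # -> 0.0 (maximally dense)
```

Unlike the plain L1 norm, this ratio is scale-invariant (multiplying the vector by a constant does not change it), which is one of the intuitive attributes a robust sparsity measure should satisfy.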
