
Homotopy based algorithms for $\ell_0$-regularized least-squares

(1406.4802)
Published Jan 31, 2014 in cs.NA and cs.LG

Abstract

Sparse signal restoration is usually formulated as the minimization of a quadratic cost function $\|y-Ax\|_2^2$, where $A$ is a dictionary and $x$ is an unknown sparse vector. It is well-known that imposing an $\ell_0$ constraint leads to an NP-hard minimization problem. The convex relaxation approach has received considerable attention, where the $\ell_0$-norm is replaced by the $\ell_1$-norm. Among the many efficient $\ell_1$ solvers, the homotopy algorithm minimizes $\|y-Ax\|_2^2+\lambda\|x\|_1$ with respect to $x$ for a continuum of $\lambda$'s. It is inspired by the piecewise regularity of the $\ell_1$-regularization path, also referred to as the homotopy path. In this paper, we address the minimization problem $\|y-Ax\|_2^2+\lambda\|x\|_0$ for a continuum of $\lambda$'s and propose two heuristic search algorithms for $\ell_0$-homotopy. Continuation Single Best Replacement is a forward-backward greedy strategy extending the Single Best Replacement algorithm, previously proposed for $\ell_0$-minimization at a given $\lambda$. The adaptive search of the $\lambda$-values is inspired by $\ell_1$-homotopy. $\ell_0$ Regularization Path Descent is a more complex algorithm exploiting the structural properties of the $\ell_0$-regularization path, which is piecewise constant with respect to $\lambda$. Both algorithms are empirically evaluated for difficult inverse problems involving ill-conditioned dictionaries. Finally, we show that they can be easily coupled with usual methods of model order selection.
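
To make the $\ell_0$-penalized objective and the forward-backward move concrete, the sketch below shows one Single-Best-Replacement-style iteration at a fixed $\lambda$. It is an illustrative assumption, not the paper's implementation: the least-squares refit on the candidate support and the names `A`, `y`, `lam`, `support` are ours, and neither the continuation over $\lambda$ (CSBR) nor the regularization-path bookkeeping ($\ell_0$-RPD) is shown.

```python
import numpy as np

def l0_cost(A, y, support, lam):
    """Evaluate ||y - A x||_2^2 + lam * ||x||_0, with x supported on `support`
    and fitted by least squares on that support (illustrative choice)."""
    cols = sorted(support)
    if not cols:
        return float(y @ y)
    As = A[:, cols]
    x_s, *_ = np.linalg.lstsq(As, y, rcond=None)
    r = y - As @ x_s
    return float(r @ r) + lam * len(cols)

def sbr_step(A, y, support, lam):
    """One SBR-style move: try every single insertion or removal of a column
    index and keep the candidate support with the lowest penalized cost.
    Returns (new_support, new_cost); the support is unchanged if no move helps."""
    n = A.shape[1]
    best_support, best_cost = set(support), l0_cost(A, y, support, lam)
    for j in range(n):
        candidate = set(support) ^ {j}   # toggle index j (insert or remove)
        c = l0_cost(A, y, candidate, lam)
        if c < best_cost:
            best_support, best_cost = candidate, c
    return best_support, best_cost
```

Iterating `sbr_step` until the support stops changing gives an SBR-type descent at a fixed $\lambda$; the paper's CSBR additionally adapts the $\lambda$-values along the path, and $\ell_0$-RPD exploits the piecewise-constant structure of the $\ell_0$-regularization path, neither of which appears in this sketch.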
