
Optimized Projections for Compressed Sensing via Direct Mutual Coherence Minimization (1508.03117v3)

Published 13 Aug 2015 in cs.IT, cs.LG, and math.IT

Abstract: Compressed Sensing (CS) is a novel technique for simultaneous signal sampling and compression, based on the existence of a sparse representation of the signal and a projected dictionary $PD$, where $P\in\mathbb{R}^{m\times d}$ is the projection matrix and $D\in\mathbb{R}^{d\times n}$ is the dictionary. To exactly recover the signal from a small number of measurements $m$, the projected dictionary $PD$ is expected to have low mutual coherence. Several previous methods attempt to find a projection $P$ such that the mutual coherence of $PD$ is as low as possible. However, they do not minimize the mutual coherence directly, and thus their methods are far from optimal. Moreover, the solvers they use lack a convergence guarantee, so there is no guarantee on the quality of the solutions they obtain. This work aims to address these issues. We propose to find an optimal projection by minimizing the mutual coherence of $PD$ directly. This leads to a nonconvex, nonsmooth minimization problem. We then approximate it by smoothing and solve it by alternating minimization. We further prove the convergence of our algorithm. To the best of our knowledge, this is the first work that directly minimizes the mutual coherence of the projected dictionary with a convergence guarantee. Numerical experiments demonstrate that the proposed method recovers sparse signals better than existing methods.
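
The quantity being minimized, the mutual coherence of $PD$, is the largest absolute normalized inner product between any two distinct columns of the projected dictionary. Below is a minimal sketch (not the paper's optimization algorithm) that computes this quantity for a random projection and dictionary; the matrix sizes and variable names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def mutual_coherence(A):
    """Mutual coherence of A: the largest absolute normalized inner
    product between any two distinct columns."""
    # Normalize each column to unit l2 norm.
    G = A / np.linalg.norm(A, axis=0, keepdims=True)
    gram = np.abs(G.T @ G)
    np.fill_diagonal(gram, 0.0)  # ignore each column's self-correlation
    return gram.max()

# Illustrative sizes (assumed, not from the paper):
# d-dimensional signals, an n-atom dictionary, and m measurements.
rng = np.random.default_rng(0)
m, d, n = 20, 60, 100
D = rng.standard_normal((d, n))   # dictionary
P = rng.standard_normal((m, d))   # projection matrix

print("mu(D)  =", mutual_coherence(D))
print("mu(PD) =", mutual_coherence(P @ D))
```

The paper's contribution is to choose $P$ so that this value is minimized directly, rather than via surrogate objectives; since the max over column pairs is nonsmooth, the authors smooth it and apply alternating minimization with a proven convergence guarantee.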

Citations (61)
