Bayesian compressed sensing with new sparsity-inducing prior (1208.6464v1)

Published 31 Aug 2012 in cs.IT and math.IT

Abstract: Sparse Bayesian learning (SBL) is a popular approach to sparse signal recovery in compressed sensing (CS). In SBL, signal sparsity is exploited by assuming a sparsity-inducing prior for the signal, which is then estimated using Bayesian inference. In this paper, a new sparsity-inducing prior is introduced and efficient algorithms are developed for signal recovery. The main algorithm is shown to produce a sparser solution than existing SBL methods while preserving their desirable properties. Numerical simulations with one-dimensional synthetic signals and two-dimensional images verify our analysis and show that for sparse signals the proposed algorithm outperforms its SBL peers in both signal recovery accuracy and computational speed. Its improved performance is also demonstrated in comparison with other state-of-the-art methods in CS.
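
The abstract does not specify the new prior or the proposed algorithm, so the following is only a minimal sketch of the standard SBL recovery loop the paper builds on: a Gaussian scale-mixture prior x_i ~ N(0, gamma_i) with EM updates of the hyperparameters gamma, as in classic SBL. The measurement matrix Phi, noise variance sigma2, and pruning threshold below are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def sbl_recover(Phi, y, sigma2=1e-4, n_iters=200, tol=1e-6, prune=1e-8):
    """Classic SBL sketch for y = Phi @ x + noise (not the paper's new prior).

    Prior: x_i ~ N(0, gamma_i); gamma_i updated by EM (gamma_i <- mu_i^2 + Sigma_ii).
    Coefficients whose gamma falls below `prune` are fixed at zero.
    """
    M, N = Phi.shape
    gamma = np.ones(N)              # per-coefficient prior variances
    x_hat = np.zeros(N)
    for _ in range(n_iters):
        gamma_old = gamma.copy()
        active = gamma > prune      # coefficients still in the model
        Phi_a = Phi[:, active]
        # Posterior of the active coefficients given y and current gamma:
        #   Sigma = (Phi'Phi/sigma2 + diag(1/gamma))^{-1},  mu = Sigma Phi' y / sigma2
        Sigma = np.linalg.inv(Phi_a.T @ Phi_a / sigma2 + np.diag(1.0 / gamma[active]))
        mu = Sigma @ Phi_a.T @ y / sigma2
        # EM hyperparameter update: gamma_i <- E[x_i^2] = mu_i^2 + Sigma_ii
        gamma = np.zeros(N)
        gamma[active] = mu**2 + np.diag(Sigma)
        x_hat = np.zeros(N)
        x_hat[active] = mu
        if np.max(np.abs(gamma - gamma_old)) < tol:
            break
    return x_hat, gamma

# Toy usage: recover a 10-sparse length-256 signal from 80 noisy measurements
rng = np.random.default_rng(0)
N, M, K = 256, 80, 10
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = Phi @ x_true + 0.01 * rng.standard_normal(M)
x_rec, _ = sbl_recover(Phi, y)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```

This sketch only illustrates the SBL baseline the abstract compares against; the paper's contribution is a different sparsity-inducing prior and algorithms reported to be both sparser and faster, whose details are not given in the abstract.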

Citations (9)
