
EM Based p-norm-like Constraint RLS Algorithm for Sparse System Identification (2312.05829v1)

Published 10 Dec 2023 in cs.IT, eess.SP, and math.IT

Abstract: In this paper, the recursive least squares (RLS) algorithm is considered in the sparse system identification setting. The cost function of the RLS algorithm is regularized by a $p$-norm-like ($0 \leq p \leq 1$) constraint on the estimated system parameters. In order to minimize the regularized cost function, we transform it into a penalized maximum likelihood (ML) problem, which is solved by the expectation-maximization (EM) algorithm. With the introduction of a thresholding operator, the update equation of the tap-weight vector is derived. We also exploit the underlying sparsity to implement the proposed algorithm with low computational complexity. Numerical simulations demonstrate the superiority of the new algorithm over conventional sparse RLS algorithms, as well as the regular RLS algorithm.
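
The abstract describes a two-step structure: a conventional RLS recursion whose tap-weight vector is then passed through a thresholding operator obtained from the EM treatment of the penalized ML problem. The Python sketch below only illustrates that structure under stated assumptions: it uses the standard exponentially weighted RLS recursion and a generic soft-thresholding operator as a stand-in for the paper's p-norm-like, EM-derived operator. The function names (`thresholded_rls`, `soft_threshold`) and all hyperparameter values (forgetting factor, regularization of the inverse correlation matrix, threshold) are illustrative choices, not taken from the paper, and the low-complexity implementation mentioned in the abstract is not reproduced here.

```python
import numpy as np

def soft_threshold(w, tau):
    """Element-wise soft-thresholding; a generic stand-in for the paper's
    p-norm-like thresholding operator, whose exact form follows from the
    EM derivation in the paper."""
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

def thresholded_rls(x_stream, d_stream, num_taps, lam=0.99, delta=1e-2, tau=1e-3):
    """Standard exponentially weighted RLS recursion followed by a shrinkage
    step on the tap-weight vector. lam (forgetting factor), delta (initial
    inverse-correlation scaling) and tau (threshold) are illustrative values."""
    w = np.zeros(num_taps)            # tap-weight estimate
    P = np.eye(num_taps) / delta      # inverse correlation matrix
    x_buf = np.zeros(num_taps)        # regressor (tapped delay line)
    for x_n, d_n in zip(x_stream, d_stream):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x_n
        Px = P @ x_buf
        k = Px / (lam + x_buf @ Px)   # gain vector
        e = d_n - w @ x_buf           # a priori error
        w = w + k * e                 # conventional RLS update
        P = (P - np.outer(k, Px)) / lam
        w = soft_threshold(w, tau)    # sparsity-promoting shrinkage step
    return w

# Usage: identify a sparse FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.zeros(16)
h[[2, 7, 11]] = [1.0, -0.5, 0.3]                      # sparse "true" system
x = rng.standard_normal(2000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = thresholded_rls(x, d, num_taps=16)
print(np.round(w_hat, 3))
```

In the paper, the thresholding operator and its threshold come out of the EM step applied to the $p$-norm-like penalized cost; soft-thresholding is used above only to make the "RLS update, then shrinkage" structure concrete.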

