Emergent Mind

Performance Analysis of l_0 Norm Constraint Least Mean Square Algorithm

(1203.1535)
Published Mar 7, 2012 in cs.IT , cs.PF , and math.IT

Abstract

As one of the recently proposed algorithms for sparse system identification, the $l_0$ norm constraint Least Mean Square ($l_0$-LMS) algorithm modifies the cost function of the traditional method with a penalty on tap-weight sparsity. The performance of $l_0$-LMS is quite attractive compared with its various precursors; however, there has been no detailed study of its behavior. This paper presents a thorough theoretical performance analysis of $l_0$-LMS for white Gaussian input data based on some reasonable assumptions. Expressions for the steady-state mean square deviation (MSD) are derived and discussed with respect to the algorithm parameters and system sparsity. A parameter selection rule is established for achieving the best performance. Using a Taylor series approximation, the instantaneous behavior is also derived. In addition, the relationship between $l_0$-LMS and some prior art, as well as sufficient conditions for $l_0$-LMS to accelerate convergence, are established. Finally, all of the theoretical results are compared with simulations and are shown to agree well over a wide range of parameter settings.
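The abstract describes augmenting the standard LMS update with a sparsity penalty on the tap weights. The sketch below illustrates one common realization of this idea: the $l_0$ norm is approximated by $\sum_i (1 - e^{-\beta|w_i|})$, and its gradient is linearized near the origin to give a piecewise "zero attractor" added to the ordinary LMS correction. Function and parameter names (`l0_lms_identify`, `mu`, `kappa`, `beta`) are illustrative choices, not identifiers from the paper.

```python
import numpy as np

def l0_lms_identify(x, d, n_taps, mu=0.01, kappa=1e-4, beta=10.0):
    """Sketch of l_0-LMS for sparse system identification.

    Standard LMS gradient step plus a zero-attraction term derived from
    the exponential approximation ||w||_0 ~ sum_i (1 - exp(-beta*|w_i|)),
    with the gradient linearized (first-order Taylor) near the origin.
    """
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # tap-input vector, u[0] = x[n]
        e = d[n] - w @ u                    # a priori output error
        # Piecewise zero attractor: pulls small taps toward zero,
        # leaves taps with |w_i| > 1/beta untouched.
        g = np.where(np.abs(w) <= 1.0 / beta,
                     beta * np.sign(w) - beta**2 * w,
                     0.0)
        w = w + mu * e * u - kappa * g      # LMS step minus sparsity gradient
    return w
```

For a sparse unknown system driven by white Gaussian input (the setting analyzed in the paper), the attractor shrinks the near-zero taps and can reduce steady-state MSD relative to plain LMS, at the cost of two extra tuning parameters.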

