p Norm Constraint Leaky LMS Algorithm for Sparse System Identification (1503.01337v1)

Published 3 Mar 2015 in cs.SY

Abstract: This paper proposes a new leaky least mean square (leaky LMS, LLMS) algorithm in which a norm penalty is introduced to force the solution to be sparse in the application of system identification. The leaky LMS algorithm is derived because the performance of the standard LMS algorithm deteriorates when the input is highly correlated. However, neither of them takes sparsity information into account to yield better behavior. As a modification of the LLMS algorithm, the proposed algorithm, named Lp-LLMS, incorporates a p-norm penalty into the cost function of the LLMS to obtain a shrinkage term in the weight update equation, which then enhances the performance of the filter in system identification settings, especially when the impulse response is sparse. The simulation results verify that the proposed algorithm improves the performance of the filter in sparse system settings in the presence of noisy input signals.
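
The abstract describes a weight update that combines the leaky LMS recursion with a shrinkage (zero-attraction) term derived from a p-norm penalty. The sketch below is a minimal illustration of how such an update is commonly written for Lp-penalized LMS-type filters, not the paper's exact derivation: the regularized p-norm gradient, the parameter names (mu, gamma, rho, p, eps), and all numerical values are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a leaky LMS update with a p-norm zero attractor.
# All parameter values and the exact shrinkage form are assumptions,
# not taken from the paper.

rng = np.random.default_rng(0)

N = 16                          # adaptive filter length
w_true = np.zeros(N)            # sparse impulse response to identify
w_true[[2, 7, 11]] = [0.8, -0.5, 0.3]

mu = 0.01                       # step size
gamma = 1e-3                    # leakage factor
rho = 5e-4                      # weight of the p-norm penalty
p = 0.7                         # norm order, 0 < p <= 1
eps = 1e-4                      # regularizer to avoid division by zero

n_samples = 5000
x = rng.standard_normal(n_samples)   # input signal
w = np.zeros(N)                      # adaptive weights
x_buf = np.zeros(N)                  # tapped-delay line

for n in range(n_samples):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = x[n]
    d = w_true @ x_buf + 0.01 * rng.standard_normal()   # noisy desired signal
    e = d - w @ x_buf                                    # a-priori error

    # Regularized gradient of ||w||_p used as a zero attractor
    # (zero when w is all zeros, so no division-by-zero issues).
    norm_p = np.sum(np.abs(w) ** p) ** (1.0 / p)
    attractor = norm_p ** (1 - p) * np.sign(w) / (eps + np.abs(w) ** (1 - p))

    # Leaky LMS recursion plus the p-norm shrinkage term.
    w = (1 - mu * gamma) * w + mu * e * x_buf - rho * attractor

print("true weights     :", np.round(w_true, 3))
print("estimated weights:", np.round(w, 3))
```

In this form, the leakage factor (1 - mu * gamma) counters weight drift under correlated inputs, while the attractor term pulls small coefficients toward zero, which is what favors sparse impulse responses.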

Citations (1)
