
Robust Non-parametric Knowledge-based Diffusion Least Mean Squares over Adaptive Networks (2312.01299v1)

Published 3 Dec 2023 in cs.LG

Abstract: The present study incorporates non-parametric knowledge into the diffusion least-mean-squares (LMS) algorithm within a maximum a posteriori (MAP) estimation framework. The proposed algorithm yields a robust estimate of an unknown parameter vector across a group of cooperative estimators. Using kernel density estimation over a buffer of intermediate estimates, each node computes the prior distribution and the conditional likelihood of the parameter vector. The pseudo-Huber loss function is used to design the likelihood function. In addition, an error-thresholding function halts the update whenever the error falls below a predefined threshold, reducing computational overhead and improving resilience to noise. The performance of the proposed algorithm is examined in stationary and non-stationary scenarios in the presence of Gaussian and non-Gaussian noise. Results demonstrate the robustness of the proposed algorithm across the different noise types.
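
As a rough illustration (not the authors' implementation), the following Python sketch combines an LMS-style adapt step with a pseudo-Huber robust weight and the error-thresholding rule described in the abstract, followed by a uniform combine step over neighbor estimates. The non-parametric MAP prior via kernel density estimation is omitted for brevity; the function names, step size mu, pseudo-Huber scale delta, and threshold eps are all illustrative assumptions.

```python
import numpy as np

def pseudo_huber_weight(e, delta=1.0):
    """Robust weight derived from the pseudo-Huber loss: its gradient
    divided by the error, which down-weights large (outlier) errors."""
    return 1.0 / np.sqrt(1.0 + (e / delta) ** 2)

def diffusion_lms_step(w, u, d, neighbors_w, mu=0.05, delta=1.0, eps=0.01):
    """One adapt-then-combine diffusion LMS step at a single node (a sketch).

    w           -- current local estimate of the parameter vector
    u           -- regression (input) vector at this node
    d           -- observed scalar measurement at this node
    neighbors_w -- list of current neighbor estimates
    mu          -- step size; delta -- pseudo-Huber scale
    eps         -- error threshold below which the adapt step is skipped
    """
    e = d - u @ w                        # a priori estimation error
    if abs(e) > eps:                     # error thresholding: skip small errors
        w = w + mu * pseudo_huber_weight(e, delta) * e * u
    # combine step: uniform averaging of neighbor estimates with the local one
    return np.mean(neighbors_w + [w], axis=0)

# Example usage with synthetic data (degenerate single-node network):
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.8])
w_est = np.zeros(3)
for _ in range(500):
    u = rng.standard_normal(3)
    d = u @ w_true + 0.01 * rng.standard_normal()
    w_est = diffusion_lms_step(w_est, u, d, [w_est.copy()])
```

In a full diffusion network, each node would run this step with the estimates received from its actual neighbors, and the combine weights could be non-uniform (e.g., Metropolis weights) rather than a plain mean.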
