Diffusion LMS Strategies in Sensor Networks with Noisy Input Data

arXiv:1507.05154
Published Jul 18, 2015 in cs.SY and math.OC

Abstract

We investigate the performance of distributed least-mean square (LMS) algorithms for parameter estimation over sensor networks where the regression data of each node are corrupted by white measurement noise. Under this condition, we show that the estimates produced by distributed LMS algorithms will be biased if the regression noise is excluded from consideration. We propose a bias-elimination technique and develop a novel class of diffusion LMS algorithms that can mitigate the effect of regression noise and obtain an unbiased estimate of the unknown parameter vector over the network. In our development, we first assume that the variances of the regression noises are known a priori. Later, we relax this assumption by estimating these variances in real time. We analyze the stability and convergence of the proposed algorithms and derive closed-form expressions to characterize their mean-square error performance in transient and steady-state regimes. We further provide computer experiment results that illustrate the efficiency of the proposed algorithms and support the analytical findings.
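The abstract does not spell out the update equations, but the idea of bias-compensated diffusion LMS can be illustrated with a short sketch. The adapt-then-combine (ATC) form below is a minimal illustration consistent with the abstract, assuming the regression-noise variances are known (the paper's first setting, before they are estimated in real time); the network size, ring topology, step size, and combination weights are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch (not the paper's exact algorithm): ATC diffusion LMS over a small
# network where each node observes its regressors through additive white noise.
# The bias-compensated adaptation step adds sigma_n^2 * w to the LMS gradient
# term to cancel, in expectation, the bias induced by the regression noise.

rng = np.random.default_rng(0)

M = 4          # length of the unknown parameter vector (assumed)
N = 5          # number of nodes (assumed)
iters = 3000
mu = 0.01      # common step size (assumed)

w_o = rng.standard_normal(M)           # unknown parameter vector to estimate
sigma_v = 0.05                         # measurement-noise std dev (assumed)
sigma_n = 0.1 * (1 + rng.random(N))    # regression-noise std dev per node (assumed known)

# Combination matrix A on a ring topology: A[l, k] is the weight node k
# assigns to node l's intermediate estimate; each column sums to one.
A = np.zeros((N, N))
for k in range(N):
    for l in (k - 1, k, k + 1):
        A[l % N, k] = 1.0 / 3.0

W = np.zeros((N, M))                   # current estimate at each node (rows)

for i in range(iters):
    psi = np.zeros_like(W)
    # Adaptation step at every node, with bias compensation
    for k in range(N):
        x = rng.standard_normal(M)                      # clean regressor (not observable)
        u = x + sigma_n[k] * rng.standard_normal(M)     # noisy regressor actually observed
        d = x @ w_o + sigma_v * rng.standard_normal()   # scalar measurement
        e = d - u @ W[k]
        # Standard LMS term plus the compensation term sigma_n^2 * w
        psi[k] = W[k] + mu * (u * e + sigma_n[k] ** 2 * W[k])
    # Combination step: each node convex-combines its neighbors' intermediates
    for k in range(N):
        W[k] = A[:, k] @ psi

print("mean estimation error:", np.linalg.norm(W - w_o, axis=1).mean())
```

Dropping the `sigma_n[k] ** 2 * W[k]` term recovers standard diffusion LMS, which converges to a biased estimate when the regressors are noisy; the compensation term cancels that bias in expectation. The paper's second class of algorithms replaces the known variances with real-time estimates.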

