Distributed Bayesian Quickest Change Detection in Sensor Networks via Two-layer Large Deviation Analysis (1512.02319v1)

Published 8 Dec 2015 in cs.IT and math.IT

Abstract: We propose a distributed Bayesian quickest change detection algorithm for sensor networks, based on a random gossip inter-sensor communication structure. Without a control or fusion center, each sensor executes its local change detection procedure in a parallel and distributed fashion, interacting with its neighbor sensors via random inter-sensor communications to propagate information. By modeling the information propagation dynamics in the network as a Markov process, a two-layer large deviation analysis is presented to characterize the performance of the proposed algorithm. The first-layer analysis shows that the relation between the probability of false alarm and the conditional averaged detection delay satisfies the large deviation principle: the probability of false alarm, which corresponds to a rare event, decays to zero at an exponentially fast rate as the conditional averaged detection delay increases, with the Kullback-Leibler information number established as the crucial factor. The second-layer analysis shows that the probability of the rare event that not all observations are available at a sensor decays to zero at an exponentially fast rate as the averaged number of communications increases, and large deviation upper and lower bounds for this rate are derived. Based on these bounds, we show that the performance of the distributed algorithm converges exponentially fast to that of the centralized one, by proving that the defined distributed Kullback-Leibler information number converges to the centralized Kullback-Leibler information number.
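
For intuition, the sketch below is a minimal, simplified Python simulation of the kind of scheme the abstract describes: each sensor runs a Shiryaev-type Bayesian change detector on the observations available to it, and randomly paired sensors exchange their latest observations to mimic random gossip. All concrete choices here (Gaussian pre-/post-change models, the geometric prior parameter, the threshold, and the one-pair-per-step gossip rule) are illustrative assumptions, not taken from the paper, whose algorithm additionally tracks which observations have propagated to each sensor and models that propagation as a Markov process.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper).
N_SENSORS = 5        # number of sensors in the network
RHO = 0.01           # geometric prior parameter for the change point
MU0, MU1 = 0.0, 1.0  # pre-/post-change Gaussian means (unit variance)
THRESHOLD = 0.99     # posterior threshold for declaring a change
CHANGE_POINT = 50    # true (simulated) change time

def likelihood_ratio(x):
    """Likelihood ratio f1(x)/f0(x) for N(MU1, 1) versus N(MU0, 1)."""
    return np.exp((MU1 - MU0) * x - 0.5 * (MU1**2 - MU0**2))

def advance_prior(p):
    """Propagate the geometric change-point prior by one time step."""
    return p + (1.0 - p) * RHO

def fold_in(p, lr):
    """Bayes-update the change posterior with one likelihood ratio."""
    return p * lr / (p * lr + (1.0 - p))

posteriors = np.zeros(N_SENSORS)     # Shiryaev statistic at each sensor
alarm_time = np.full(N_SENSORS, -1)  # per-sensor alarm (stopping) times

for t in range(1, 300):
    # One new observation per sensor; the mean shifts after the change point.
    mean = MU1 if t >= CHANGE_POINT else MU0
    obs = rng.normal(mean, 1.0, size=N_SENSORS)

    for k in range(N_SENSORS):
        # Local Shiryaev recursion on the sensor's own observation.
        posteriors[k] = fold_in(advance_prior(posteriors[k]),
                                likelihood_ratio(obs[k]))

    # Random gossip: one randomly chosen pair swaps its latest observations,
    # a crude stand-in for the paper's information-propagation mechanism.
    i, j = rng.choice(N_SENSORS, size=2, replace=False)
    posteriors[i] = fold_in(posteriors[i], likelihood_ratio(obs[j]))
    posteriors[j] = fold_in(posteriors[j], likelihood_ratio(obs[i]))

    # Each sensor declares the change on its own once its posterior crosses
    # the threshold (no fusion center involved).
    for k in range(N_SENSORS):
        if alarm_time[k] < 0 and posteriors[k] >= THRESHOLD:
            alarm_time[k] = t

print("true change point:", CHANGE_POINT)
print("per-sensor alarm times:", alarm_time)
```

In this toy setup, raising the threshold pushes the false alarm probability down at the cost of a longer detection delay, and allowing more gossip exchanges per step lets each sensor's statistic track the centralized one more closely; these are the trade-offs the paper's two-layer large deviation analysis quantifies through the distributed and centralized Kullback-Leibler information numbers.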

Citations (9)