Abstract

Privacy-preserving average consensus aims to guarantee the privacy of each node's initial state while still reaching asymptotic consensus on the exact average of the initial values. In existing work, this is achieved by adding and subtracting variance-decaying, zero-sum random noises in the consensus process. However, there is a lack of theoretical analysis quantifying the degree of privacy protection. In this paper, we introduce the maximum disclosure probability, i.e., the probability that other nodes can infer one node's initial state within a given small interval, to quantify the privacy. We develop a novel privacy definition, named $(\epsilon, \delta)$-data-privacy, to capture the relationship between the maximum disclosure probability and the estimation accuracy. We then prove that the general privacy-preserving average consensus (GPAC) algorithm provides $(\epsilon, \delta)$-data-privacy, and give a closed-form expression for the relationship between $\epsilon$ and $\delta$. Moreover, we show that noise with a uniform distribution is optimal in the sense of achieving the highest $(\epsilon, \delta)$-data-privacy. We also prove that the privacy is compromised when all the information used in the consensus process is available to an observer. Finally, we propose an optimal privacy-preserving average consensus (OPAC) algorithm that achieves the highest $(\epsilon, \delta)$-data-privacy and avoids this privacy compromise. Simulations are conducted to verify the results.
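The noise-injection mechanism the abstract describes (masking each node's state with variance-decaying noise whose per-node contributions telescope to zero, so the exact average is preserved in the limit) can be sketched as follows. This is a minimal illustration, not the paper's GPAC or OPAC algorithm: the ring graph, the Metropolis weight matrix `W`, the decay rate `rho`, and the uniform noise scaling are all illustrative assumptions.

```python
import numpy as np

def private_average_consensus(x0, W, rho=0.5, steps=200, seed=0):
    """Average consensus with variance-decaying, zero-sum masking noise.

    x0    : initial states, one per node (kept private from neighbors)
    W     : doubly stochastic weight matrix of the communication graph
    rho   : geometric decay rate of the noise magnitude, 0 < rho < 1
    steps : number of synchronous consensus iterations
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    n_prev = np.zeros_like(x)  # noise each node injected at the previous step
    for k in range(steps):
        # Fresh uniform noise whose magnitude decays geometrically in k.
        n_k = rho**k * rng.uniform(-1.0, 1.0, size=x.shape)
        # Each node adds n_k and subtracts n_{k-1}: the injected noise
        # telescopes, so its cumulative sum per node vanishes as k grows.
        theta = n_k - n_prev
        x = W @ x + theta  # one consensus averaging step plus masking noise
        n_prev = n_k
    return x

# Demo: a 4-node ring with Metropolis weights (doubly stochastic).
W = np.array([[1/3, 1/3, 0.0, 1/3],
              [1/3, 1/3, 1/3, 0.0],
              [0.0, 1/3, 1/3, 1/3],
              [1/3, 0.0, 1/3, 1/3]])
x = private_average_consensus([1.0, 2.0, 3.0, 4.0], W)
```

Because the noise a node adds over time sums (telescopes) to approximately zero and the weight matrix is doubly stochastic, all states converge to the exact average (here 2.5), while early iterates, which is all a neighbor observes, are masked by non-negligible noise.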
