Bayesian Differential Privacy for Linear Dynamical Systems

(arXiv:2106.12749)
Published Jun 24, 2021 in math.OC, cs.CR, cs.SY, and eess.SY

Abstract

Differential privacy is a privacy measure based on the difficulty of discriminating between similar input data. In differential privacy analyses, two data sets are usually considered similar when their distance does not exceed a predetermined threshold. Consequently, the standard notion does not account for the difficulty of distinguishing data sets that are far apart, which often contain highly private information. This problem has been pointed out in research on differential privacy for static data, and Bayesian differential privacy has been proposed: by utilizing a prior distribution over the data, it provides a privacy protection level even for outlier data. In this study, we introduce Bayesian differential privacy to dynamical systems, provide privacy guarantees for input data pairs that are far apart, and reveal its fundamental properties. For example, we design a mechanism that satisfies a desired level of privacy protection and characterizes the trade-off between privacy and information utility.
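
As a reference point (this is the standard definition of differential privacy, not a statement taken from the paper itself), a mechanism M is (\epsilon, \delta)-differentially private on input pairs (u, u') whose distance does not exceed a threshold c if, for every measurable set S,

    P(M(u) \in S) \le e^{\epsilon} \, P(M(u') \in S) + \delta.

Roughly, the Bayesian variant the abstract describes uses a prior distribution over the data instead of restricting attention to this bounded-distance neighborhood, which is how protection extends to input pairs that are far apart.

The sketch below illustrates the kind of output-perturbation mechanism such an analysis applies to; the system matrices, input sequence, and noise scale are hypothetical choices for illustration, not the authors' construction. Gaussian noise is added to the published outputs of a linear dynamical system x_{t+1} = A x_t + B u_t, y_t = C x_t, and the scale sigma governs the privacy-utility trade-off mentioned in the abstract.

import numpy as np

# Hypothetical linear dynamical system x_{t+1} = A x_t + B u_t, y_t = C x_t.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[1.0],
              [0.5]])
C = np.array([[1.0, 0.0]])

def gaussian_output_mechanism(u, sigma, rng):
    # Publish noisy outputs y_t + w_t with w_t ~ N(0, sigma^2 I).
    # A larger sigma makes input sequences harder to discriminate
    # (stronger privacy) but degrades the utility of the released outputs.
    x = np.zeros((A.shape[0], 1))
    outputs = []
    for u_t in u:
        y = C @ x + sigma * rng.standard_normal((C.shape[0], 1))
        outputs.append(y.ravel())
        x = A @ x + B * u_t
    return np.array(outputs)

rng = np.random.default_rng(0)
u = np.sin(0.2 * np.arange(50))   # hypothetical private input sequence
noisy_y = gaussian_output_mechanism(u, sigma=0.5, rng=rng)

Choosing sigma so that a prescribed privacy level holds, given a prior over u, is the kind of mechanism-design problem the abstract refers to.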
