
Offset-Symmetric Gaussians for Differential Privacy

(2110.06412)
Published Oct 13, 2021 in cs.CR, cs.IT, and math.IT

Abstract

The Gaussian distribution is widely used in mechanism design for differential privacy (DP). Thanks to its sub-Gaussian tail, it significantly reduces the chance of outliers when responding to queries. However, it can only provide approximate $(\epsilon, \delta(\epsilon))$-DP. In practice, $\delta(\epsilon)$ must be much smaller than the inverse of the dataset size, which may limit the use of the Gaussian mechanism for large datasets with strong privacy requirements. In this paper, we introduce and analyze a new distribution for use in DP that is based on the Gaussian distribution but has improved privacy performance. The so-called offset-symmetric Gaussian tail (OSGT) distribution is obtained by normalizing the tails of two Gaussians placed symmetrically around zero. Consequently, it still has a sub-Gaussian tail and lends itself to analytical derivations. We analytically derive the variance of the OSGT random variable and the $\delta(\epsilon)$ of the OSGT mechanism. We then numerically show that at the same variance, the OSGT mechanism can offer a lower $\delta(\epsilon)$ than the Gaussian mechanism. We extend the OSGT mechanism to $k$-dimensional queries and derive an easy-to-compute analytical upper bound on its zero-concentrated differential privacy (zCDP) performance. We analytically prove that at the same variance and the same global query sensitivity, and for sufficiently large concentration orders $\alpha$, the OSGT mechanism performs better than the Gaussian mechanism in terms of zCDP.
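
The abstract does not spell out the OSGT density, but the "normalized tails of two symmetric Gaussians around zero" construction suggests a density proportional to $\exp(-(|x|+m)^2 / (2\sigma^2))$: the tail of a zero-mean Gaussian beyond an offset $m$, reflected around zero and normalized. The Python sketch below samples from that assumed density and adds the noise to a scalar query; the names sample_osgt and osgt_mechanism and the parameters m and sigma are illustrative, not taken from the paper, and the privacy accounting ($\delta(\epsilon)$, zCDP bounds) derived in the paper is not reproduced here.

import numpy as np
from scipy import stats

def sample_osgt(m, sigma, size=1):
    # Assumed OSGT construction: |X| + m is a zero-mean Gaussian with
    # standard deviation sigma, truncated to [m, +inf), i.e., the
    # normalized tail of N(0, sigma^2) beyond the offset m.
    tail = stats.truncnorm.rvs(a=m / sigma, b=np.inf, loc=0.0, scale=sigma, size=size)
    # Reflecting with a random sign yields the symmetric density
    # f(x) proportional to exp(-(|x| + m)^2 / (2 * sigma^2)).
    signs = np.random.choice([-1.0, 1.0], size=size)
    return signs * (tail - m)

def osgt_mechanism(true_answer, m, sigma):
    # Sketch of a scalar OSGT mechanism: perturb the query answer with
    # additive OSGT noise (m and sigma are illustrative parameters).
    return true_answer + sample_osgt(m, sigma, size=1)[0]

# Example: a noisy count query under the assumed construction.
noisy_count = osgt_mechanism(true_answer=1024.0, m=2.0, sigma=5.0)

Under this assumed construction, |X| + m is simply a zero-mean Gaussian truncated to [m, inf), which is why the sampler reuses scipy.stats.truncnorm instead of hand-rolling inverse-CDF sampling.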
