Locally Differentially Private Data Collection and Analysis

(1906.01777)
Published Jun 5, 2019 in cs.CR

Abstract

Local differential privacy (LDP) provides each user with strong privacy guarantees even when the data curator is untrusted, while still allowing accurate statistics to be derived from the privatized data. Owing to this strength, LDP has been widely adopted to protect privacy in various tasks (e.g., heavy-hitter discovery, probability estimation) and systems (e.g., Google Chrome, Apple iOS). Although $\epsilon$-LDP was proposed many years ago, the more general notion of $(\epsilon, \delta)$-LDP has been studied in only a few papers, which mainly consider mean estimation for numeric data. Moreover, prior solutions achieve $(\epsilon, \delta)$-LDP by leveraging the Gaussian mechanism, which leads to low accuracy of the aggregated results. In this paper, we propose novel mechanisms that achieve $(\epsilon, \delta)$-LDP with high utility in data analytics and machine learning. Specifically, we first design $(\epsilon, \delta)$-LDP algorithms for collecting multi-dimensional numeric data, which ensure higher accuracy than the optimal Gaussian mechanism while guaranteeing strong privacy for each user. Then, we investigate different local protocols for categorical attributes under $(\epsilon, \delta)$-LDP. Furthermore, we conduct theoretical analysis of the error bounds and variances of the proposed algorithms. Experimental results on real and synthetic datasets demonstrate the high data utility of our proposed algorithms on both simple data statistics and complex machine learning models.
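For context, the Gaussian-mechanism baseline that the abstract contrasts against works by having each user clip their value to a bounded range and add Gaussian noise calibrated to $(\epsilon, \delta)$ before reporting it. Below is a minimal sketch of that baseline for a single numeric value; the function name, the assumed data range $[-1, 1]$, and the use of the classical noise calibration $\sigma \ge \Delta\sqrt{2\ln(1.25/\delta)}/\epsilon$ (valid for $\epsilon \le 1$) are assumptions for illustration, not the paper's proposed mechanisms.

```python
import math
import random

def gaussian_mechanism_ldp(value, epsilon, delta, lo=-1.0, hi=1.0):
    """Sketch of user-side perturbation with the classical Gaussian mechanism.

    Assumes the true value lies in [lo, hi], so the local sensitivity is
    hi - lo (any two possible inputs differ by at most that amount).
    Noise scale follows sigma >= sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon.
    """
    clipped = min(max(value, lo), hi)      # keep the input inside the assumed domain
    sensitivity = hi - lo
    sigma = sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    return clipped + random.gauss(0.0, sigma)  # only the noisy value leaves the device

# Hypothetical usage: each user submits a privatized report, and the
# untrusted curator simply averages the reports to estimate the mean.
reports = [gaussian_mechanism_ldp(v, epsilon=0.5, delta=1e-5) for v in (0.2, -0.7, 0.9)]
estimate = sum(reports) / len(reports)
```

Because the noise scale grows with the sensitivity and with $\sqrt{\ln(1/\delta)}$, per-user reports are very noisy, which is the accuracy limitation the paper's proposed $(\epsilon, \delta)$-LDP mechanisms aim to improve on.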
