Abstract

Existing studies on differential privacy mainly consider aggregation over data sets in which each entry corresponds to a particular participant to be protected. In many situations, a user poses a relational algebra query on a sensitive database and desires a differentially private aggregation over the result of the query. However, no known work can release this kind of aggregation when the query contains unrestricted join operations. This severely limits the applications of existing differential privacy techniques, because many data analysis tasks require unrestricted joins. One example is subgraph counting on a graph. Existing methods for differentially private subgraph counting address only edge differential privacy and are restricted to very simple subgraphs. Before this work, whether any nontrivial graph statistic could be released with reasonable accuracy under node differential privacy was an open problem. In this paper, we propose a novel differentially private mechanism that releases an approximation to a linear statistic of the result of some positive relational algebra calculation over a sensitive database. Unrestricted joins are supported in our mechanism. The error bound of the approximate answer is roughly proportional to the \emph{empirical sensitivity} of the query, a new notion that measures the maximum possible change to the query answer when a participant withdraws its data from the sensitive database. For subgraph counting, our mechanism provides the first solution that achieves node differential privacy, for any kind of subgraph.
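
To make the notion of empirical sensitivity concrete, here is a minimal Python sketch for triangle counting under node withdrawal. The function names and the choice of triangles as the subgraph are ours, not the paper's; and note that simply adding Laplace noise calibrated to a data-dependent sensitivity does not by itself satisfy differential privacy (the sensitivity value leaks information about the data), so the final function only illustrates the scale of the error bound, not the paper's actual mechanism.

import itertools
import random

def triangle_count(edges):
    # Count triangles in an undirected graph given as a list of (u, v) edges.
    nodes = {u for e in edges for u in e}
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return sum(1 for a, b, c in itertools.combinations(sorted(nodes), 3)
               if b in adj[a] and c in adj[a] and c in adj[b])

def empirical_sensitivity(edges):
    # Maximum change in the triangle count when any single node (together
    # with its incident edges) withdraws from THIS particular database --
    # a data-dependent quantity, unlike worst-case global sensitivity.
    full = triangle_count(edges)
    nodes = {u for e in edges for u in e}
    return max(abs(full - triangle_count([e for e in edges if v not in e]))
               for v in nodes)

def noisy_triangle_count(edges, epsilon):
    # Illustration only: Laplace noise with scale empirical_sensitivity/epsilon.
    # Calibrating noise directly to empirical sensitivity is NOT private on
    # its own; this sketch merely conveys the magnitude of the error bound.
    scale = empirical_sensitivity(edges) / epsilon
    # A Laplace(0, scale) sample is the difference of two Exp(1) samples
    # multiplied by the scale.
    noise = (random.expovariate(1.0) - random.expovariate(1.0)) * scale
    return triangle_count(edges) + noise

# Example: one triangle {1,2,3}; removing node 1 destroys it, so the
# empirical sensitivity is 1 and the noise scale is 1/epsilon.
# noisy_triangle_count([(1, 2), (2, 3), (1, 3), (3, 4)], epsilon=1.0)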
