Recursive Mechanism: Towards Node Differential Privacy and Unrestricted Joins [Full Version, Draft 0.1] (1304.4795v1)

Published 17 Apr 2013 in cs.DB

Abstract: Existing studies on differential privacy mainly consider aggregation on data sets where each entry corresponds to a particular participant to be protected. In many situations, a user may pose a relational algebra query on a sensitive database and desire differentially private aggregation on the result of the query. However, no known work is capable of releasing this kind of aggregation when the query contains unrestricted join operations. This severely limits the applications of existing differential privacy techniques, because many data analysis tasks require unrestricted joins. One example is subgraph counting on a graph. Existing methods for differentially private subgraph counting address only edge differential privacy and are limited to very simple subgraphs. Before this work, whether any nontrivial graph statistic could be released with reasonable accuracy under node differential privacy was an open problem. In this paper, we propose a novel differentially private mechanism to release an approximation to a linear statistic of the result of some positive relational algebra calculation over a sensitive database. Unrestricted joins are supported in our mechanism. The error bound of the approximate answer is roughly proportional to the empirical sensitivity of the query, a new notion that measures the maximum possible change to the query answer when a participant withdraws its data from the sensitive database. For subgraph counting, our mechanism provides the first solution to achieve node differential privacy for any kind of subgraph.
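
To make the "empirical sensitivity" notion from the abstract concrete, here is a minimal illustrative sketch, not the paper's recursive mechanism: it measures how much a triangle count (a simple subgraph-counting query) can change when a single node withdraws its data, which is the quantity node differential privacy must account for. The function names and the use of networkx are assumptions for illustration only.

```python
# Illustrative sketch only; not the mechanism proposed in the paper.
# It computes the empirical sensitivity of a triangle-count query:
# the maximum change in the answer when one participant (node)
# withdraws its data from the graph.
import networkx as nx


def triangle_count(g: nx.Graph) -> int:
    """Linear statistic: total number of triangles in the graph."""
    # nx.triangles counts, per node, the triangles containing it,
    # so each triangle is counted three times.
    return sum(nx.triangles(g).values()) // 3


def empirical_sensitivity(g: nx.Graph) -> int:
    """Max change in triangle_count when any single node is removed."""
    base = triangle_count(g)
    worst = 0
    for v in list(g.nodes()):
        h = g.copy()
        h.remove_node(v)  # the participant withdraws its data
        worst = max(worst, abs(base - triangle_count(h)))
    return worst


if __name__ == "__main__":
    g = nx.karate_club_graph()  # small example graph
    print("triangles:", triangle_count(g))
    print("empirical sensitivity (node withdrawal):", empirical_sensitivity(g))
```

Note that this empirical sensitivity is a property of the actual database instance, unlike global sensitivity, which is why (per the abstract) an error bound proportional to it can remain reasonable even for queries with unrestricted joins.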

Citations (128)
