Abstract

We consider a scenario in which a database stores sensitive data of users and an analyst wants to estimate statistics of the data. Users may incur a cost when their data are used, in which case they should be compensated. The analyst wishes to obtain an accurate estimate, while the users want to maximize their utility. Our goal is to design a mechanism that estimates statistics accurately without compromising users' privacy. Since users' costs and sensitive data may be correlated, it is important to protect the privacy of both the data and the costs. We model this correlation by assuming that a user's unknown sensitive data determine a distribution from a set of publicly known distributions, and the user's cost is drawn from that distribution. We propose a stronger model of privacy-preserving mechanisms in which users are compensated whenever they reveal information about their data to the mechanism. Within this model, we design a Bayesian incentive-compatible and privacy-preserving mechanism that guarantees accuracy and protects the privacy of both costs and data.
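To make the correlated data-cost model concrete, here is a minimal simulation sketch (our illustration only, not the paper's mechanism). Each user's private bit selects one of two publicly known cost distributions, so observed costs leak information about the private data. It also shows why a naive flat-price survey fails: all names and parameters below are hypothetical, chosen only to exhibit the bias.

```python
# Hypothetical sketch of the correlated data-cost model from the abstract.
import random
import statistics

random.seed(0)

N = 10_000
# Publicly known cost distributions, indexed by the private bit:
# users with bit 1 tend to have higher privacy costs (assumed parameters).
COST_DIST = {
    0: lambda: random.expovariate(1 / 1.0),  # mean cost 1.0
    1: lambda: random.expovariate(1 / 3.0),  # mean cost 3.0
}

# Each user's private bit determines which distribution their cost is drawn from.
users = []
for _ in range(N):
    bit = int(random.random() < 0.3)         # true population mean = 0.3
    cost = COST_DIST[bit]()
    users.append((bit, cost))

true_mean = statistics.mean(bit for bit, _ in users)

# Naive approach: pay a flat price p and survey only users with cost <= p.
# Because cost and data are correlated, the sample over-represents bit-0
# users, and both participation and payments leak information about costs.
p = 1.0
sample = [bit for bit, cost in users if cost <= p]
print(f"true mean:  {true_mean:.3f}")
print(f"naive mean: {statistics.mean(sample):.3f}  (biased low)")
```

Running this sketch, the naive estimate falls well below the true mean, which is the bias a Bayesian incentive-compatible, privacy-preserving mechanism of the kind proposed here must correct while compensating users for the information they reveal.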
