Emergent Mind

Matching Anonymized and Obfuscated Time Series to Users' Profiles

(1710.00197)
Published Sep 30, 2017 in cs.IT, cs.CR, and math.IT

Abstract

Many popular applications use traces of user data to offer various services to their users. However, even if user data is anonymized and obfuscated, a user's privacy can be compromised through the use of statistical matching techniques that match a user trace to prior user behavior. In this work, we derive the theoretical bounds on the privacy of users in such a scenario. We build on our recent study in the area of location privacy, in which we introduced formal notions of location privacy for anonymization-based location privacy-protection mechanisms. Here we derive the fundamental limits of user privacy when both anonymization and obfuscation-based protection mechanisms are applied to users' time series of data. We investigate the impact of such mechanisms on the trade-off between privacy protection and user utility. We first study achievability results for the case where the time-series of users are governed by an i.i.d. process. The converse results are proved both for the i.i.d. case as well as the more general Markov chain model. We demonstrate that as the number of users in the network grows, the obfuscation-anonymization plane can be divided into two regions: in the first region, all users have perfect privacy; and, in the second region, no user has privacy.
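As a toy illustration of the statistical-matching threat described above (not the paper's formal construction), the sketch below assumes each user's time series is i.i.d. Bernoulli with a per-user parameter known to the adversary, obfuscation flips samples independently with a fixed noise probability, and anonymization applies a random permutation. All names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

num_users, trace_len, noise = 5, 2000, 0.1
# Per-user Bernoulli parameters (assumed known to the adversary as "prior behavior").
profiles = np.array([0.1, 0.3, 0.5, 0.7, 0.9])

# Each user emits an i.i.d. Bernoulli time series.
traces = rng.random((num_users, trace_len)) < profiles[:, None]

# Obfuscation: flip each sample independently with probability `noise`.
flips = rng.random((num_users, trace_len)) < noise
obfuscated = traces ^ flips

# Anonymization: a random permutation hides which trace belongs to which user.
perm = rng.permutation(num_users)
released = obfuscated[perm]

# Statistical matching: compare each released trace's empirical mean with the
# expected obfuscated mean of every profile, and assign the closest user.
expected_means = profiles * (1 - noise) + (1 - profiles) * noise
empirical_means = released.mean(axis=1)
matched = np.abs(empirical_means[:, None] - expected_means[None, :]).argmin(axis=1)

# With traces this long, the permutation is recovered despite the noise.
print((matched == perm).all())
```

With well-separated profiles and long traces, the empirical means concentrate tightly around the expected obfuscated means, so matching succeeds; this mirrors the abstract's "no privacy" regime, while heavier noise or shorter traces push toward the "perfect privacy" regime.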
