LDP-FPMiner: FP-Tree Based Frequent Itemset Mining with Local Differential Privacy (2209.01333v1)

Published 3 Sep 2022 in cs.DB and cs.CR

Abstract: Data aggregation in the setting of local differential privacy (LDP) guarantees strong privacy by providing plausible deniability of sensitive data. Existing work on this issue has mostly focused on discovering heavy hitters, leaving the task of frequent itemset mining (FIM) as an open problem. To the best of our knowledge, the state-of-the-art LDP solution to FIM is the recently proposed SVSM protocol. SVSM is mainly based on the padding-and-sampling-based frequency oracle (PSFO) protocol and regards each itemset as an independent item, without considering the frequency consistency among itemsets. In this paper, we propose a novel LDP approach to FIM, called LDP-FPMiner, based on the frequent pattern tree (FP-tree). Our proposal exploits frequency consistency among itemsets by constructing and optimizing a noisy FP-tree under LDP. Specifically, it works as follows. First, the most frequent items are identified, and the item domain is cut down accordingly. Second, the maximum level of the FP-tree is estimated. Third, a noisy FP-tree is constructed, optimized using itemset frequency consistency, and mined to obtain the k most frequent itemsets. Experimental results show that LDP-FPMiner significantly improves over the state-of-the-art approach, SVSM, especially at high privacy levels.
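The steps in the abstract center on the FP-tree: a prefix tree that compresses transactions after infrequent items are pruned and the remaining items are ordered by frequency. As a point of reference, here is a minimal, non-private sketch of plain FP-tree construction (the structure LDP-FPMiner builds in noisy form under LDP); the function name and the nested-dict representation are illustrative choices, not taken from the paper:

```python
from collections import defaultdict

def build_fp_tree(transactions, min_support):
    """Build a (non-private) FP-tree from a list of transactions.

    Returns the tree as nested dicts {item: [count, children]} plus the
    frequency-descending item order used to sort each transaction.
    """
    # First pass: count item frequencies and keep items meeting min_support
    # (this corresponds to the paper's "cut down the item domain" step,
    # which LDP-FPMiner instead performs via noisy LDP frequency estimates).
    counts = defaultdict(int)
    for t in transactions:
        for item in t:
            counts[item] += 1
    frequent = {i: c for i, c in counts.items() if c >= min_support}

    # Global item order: descending frequency, ties broken lexicographically.
    order = sorted(frequent, key=lambda i: (-frequent[i], i))
    rank = {item: r for r, item in enumerate(order)}

    # Second pass: insert each pruned, reordered transaction as a tree path,
    # incrementing the count of every node along the way.
    root = {}
    for t in transactions:
        items = sorted((i for i in t if i in rank), key=rank.get)
        node = root
        for item in items:
            if item not in node:
                node[item] = [0, {}]
            node[item][0] += 1
            node = node[item][1]
    return root, order
```

Because shared prefixes collapse into shared paths, each node count is the support of the itemset given by its root-to-node path; this is the "frequency consistency" (a node's count can never exceed its parent's) that LDP-FPMiner exploits to post-process the noisy tree.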

Citations (2)

Authors (2)
