PrivFT: Private and Fast Text Classification with Homomorphic Encryption

(1908.06972)
Published Aug 19, 2019 in cs.CR and cs.LG

Abstract

The need for privacy-preserving analytics is higher than ever: the severity of privacy risks and the arrival of new privacy regulations have amplified interest in techniques that balance privacy and utility. In this work, we present an efficient method for text classification that preserves the privacy of the content using Fully Homomorphic Encryption (FHE). Our system, named Private Fast Text (PrivFT), performs two tasks: 1) inference on encrypted user inputs using a plaintext model, and 2) training an effective model on an encrypted dataset. For inference, we train a supervised model and outline a system for homomorphic inference on encrypted user inputs with no loss in prediction accuracy. In the second part, we show how to train a model on fully encrypted data to generate an encrypted model. We provide a GPU implementation of the Cheon-Kim-Kim-Song (CKKS) FHE scheme and compare it with existing CPU implementations, achieving a speedup of one to two orders of magnitude at various parameter settings. We implement PrivFT on GPUs, achieving a runtime of less than 0.66 seconds per inference. Training on a relatively large encrypted dataset is more computationally intensive, requiring 5.04 days.
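To make the first task (inference on encrypted inputs with a plaintext model) concrete, the sketch below shows the general data flow such a system implies: the client encrypts a vector representation of its text under CKKS, the server applies plaintext model weights homomorphically, and the client decrypts only the class scores. This is an illustrative sketch, not the paper's implementation: it uses the TenSEAL CKKS library on CPU rather than the authors' custom GPU implementation, assumes a fastText-style model (averaged embeddings followed by a linear classifier, as the name PrivFT suggests), and the parameter choices, dimensions, and bag-of-words encoding are all assumptions for brevity.

```python
# Illustrative sketch only: CKKS inference with a plaintext model on an
# encrypted input, in the spirit of PrivFT's first task. Uses TenSEAL on CPU,
# NOT the paper's GPU CKKS implementation; parameters and dimensions are toy
# assumptions, not the paper's settings.
import numpy as np
import tenseal as ts

# --- Client: CKKS context and encrypted input ---------------------------
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],  # toy parameter choice
)
context.global_scale = 2 ** 40
context.generate_galois_keys()  # needed for the vector-matrix products

vocab_size, embed_dim, num_classes = 200, 32, 4
token_ids = [12, 87, 3, 150]  # hypothetical tokenized input text

# Encode the text as a normalized bag-of-words vector and encrypt it;
# multiplying this by the embedding matrix yields the averaged embedding.
bow = np.zeros(vocab_size)
for t in token_ids:
    bow[t] += 1.0 / len(token_ids)
enc_input = ts.ckks_vector(context, bow.tolist())

# --- Server: plaintext fastText-style model applied homomorphically -----
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(vocab_size, embed_dim))   # plaintext embeddings
W = rng.normal(size=(embed_dim, num_classes))           # plaintext classifier
b = rng.normal(size=num_classes)

enc_hidden = enc_input.matmul(embeddings.tolist())       # encrypted avg embedding
enc_scores = enc_hidden.matmul(W.tolist()) + b.tolist()  # encrypted logits

# --- Client: decrypt the logits and take the argmax locally -------------
scores = enc_scores.decrypt()
print("predicted class:", int(np.argmax(scores)))
```

The point of the pattern is that the server never sees the plaintext text and the client only recovers the class scores; the ciphertext-plaintext multiplications in the two matmul calls are exactly the kind of operation the paper's GPU CKKS implementation is reported to accelerate by one to two orders of magnitude.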
