Secure Sum Outperforms Homomorphic Encryption in (Current) Collaborative Deep Learning (2006.02894v2)

Published 2 Jun 2020 in cs.CR, cs.LG, and stat.ML

Abstract: Deep learning (DL) approaches are achieving extraordinary results in a wide range of domains, but they often require massive collections of private data. Hence, methods for training neural networks on the joint data of different data owners while keeping each party's input confidential are called for. We address a specific setting in federated learning, namely deep learning from horizontally distributed data held by a limited number of parties, whose vulnerable intermediate results must be processed in a privacy-preserving manner. This setting arises in medical and healthcare applications as well as in industrial ones. The predominant scheme for this is based on homomorphic encryption (HE), and it is widely considered to be without alternative. In contrast, we demonstrate that a carefully chosen, less complex and computationally less expensive secure sum protocol, in conjunction with default secure channels, exhibits superior properties in terms of both collusion resistance and runtime. Finally, we discuss several open research questions in the context of collaborative DL, especially regarding privacy risks caused by joint intermediate results.
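
The abstract does not spell out the protocol itself; as a rough illustration of the general idea, the Python sketch below simulates one aggregation round of a generic additive-secret-sharing secure sum, in which each party splits its local update into random shares, distributes them over pairwise secure channels, and only the overall sum is reconstructed. All names, the modulus, and the share-distribution details are illustrative assumptions, not taken from the paper.

    # Minimal sketch of a secure sum via additive secret sharing
    # (generic construction; the paper's exact protocol may differ).
    import secrets

    MODULUS = 2**61 - 1  # illustrative prime modulus for share arithmetic

    def make_shares(value: int, n_parties: int) -> list[int]:
        """Split `value` into n_parties additive shares summing to it mod MODULUS."""
        shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % MODULUS)
        return shares

    def secure_sum(private_values: list[int]) -> int:
        """Simulate the protocol: each party shares its input over secure channels,
        every party adds the shares it received, and the partial sums are combined."""
        n = len(private_values)
        # shares[i][j] is the share party i sends to party j
        shares = [make_shares(v, n) for v in private_values]
        partial_sums = [sum(shares[i][j] for i in range(n)) % MODULUS for j in range(n)]
        return sum(partial_sums) % MODULUS

    if __name__ == "__main__":
        # e.g., quantized intermediate results from three parties
        inputs = [123456, 789012, 345678]
        assert secure_sum(inputs) == sum(inputs) % MODULUS
        print(secure_sum(inputs))

No single party (and no coalition of fewer than all other parties) learns another party's input from the shares it sees, which is the collusion-resistance property the abstract contrasts with HE-based aggregation.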

Citations (7)
