Emergent Mind

Abstract

Supporting convolutional neural network (CNN) inference on resource-constrained IoT devices in a timely manner has been an outstanding challenge for emerging smart systems. To mitigate the burden on IoT devices, the prevailing solution is to offload the CNN inference task, which is usually composed of billions of operations, to the public cloud. However, the "offloading-to-cloud" solution may cause privacy breaches when moving sensitive data to the cloud. For privacy protection, the research community has resorted to advanced cryptographic primitives and approximation techniques to support CNN inference on encrypted data. However, these attempts incur impractical computational overhead on IoT devices and degrade the performance of CNNs. Moreover, relying on the remote cloud adds network latency and can even render the system dysfunctional when the network connection is lost. We propose LEP-CNN, an extremely lightweight edge-device-assisted private CNN inference solution for IoT devices. The main design of LEP-CNN is based on a novel online/offline encryption scheme: the decryption of LEP-CNN is pre-computed offline by exploiting the linear property of the most time-consuming CNN operations. As a result, LEP-CNN allows IoT devices to securely offload over 99% of CNN operations, and edge devices to execute CNN inference on encrypted data as efficiently as on plaintext. LEP-CNN also provides an integrity check option that helps IoT devices detect erroneous results with a success rate over 99%. Experiments on AlexNet show that LEP-CNN speeds up CNN inference by more than 35 times for resource-constrained IoT devices. A homomorphic-encryption-based AlexNet using CryptoNets is implemented for comparison, demonstrating that LEP-CNN outperforms homomorphic-encryption-based privacy-preserving neural networks in time-sensitive scenarios.
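The abstract's core idea, pre-computing the decryption offline by exploiting the linearity of convolution, can be illustrated with an additive-masking sketch. This is a minimal toy illustration of the general principle only, not the paper's actual protocol; all names, shapes, and the single-channel convolution are illustrative assumptions. The key fact used: convolution is linear, so conv(x + r) - conv(r) == conv(x).

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Naive single-channel 'valid' 2-D convolution (cross-correlation)."""
    H, W = x.shape
    kh, kw = w.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

# --- Offline phase (run ahead of time, before any input arrives) ---
w = rng.standard_normal((3, 3))      # layer weights, known to the edge device
r = rng.standard_normal((8, 8))      # one-time random mask
conv_r = conv2d(r, w)                # pre-computed "decryption" term

# --- Online phase (per inference) ---
x = rng.standard_normal((8, 8))      # sensitive input on the IoT device
ciphertext = x + r                   # cheap element-wise encryption
edge_result = conv2d(ciphertext, w)  # heavy computation offloaded to the edge
plain_result = edge_result - conv_r  # cheap decryption on the IoT device

# By linearity, the recovered result equals convolution on the plaintext.
assert np.allclose(plain_result, conv2d(x, w))
```

Under this kind of scheme the IoT device's online work is only element-wise additions and subtractions, which is consistent with the abstract's claim that over 99% of the CNN operations can be offloaded.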
