Energy Efficient Placement of ML-Based Services in IoT Networks

(2203.12312)
Published Mar 23, 2022 in eess.SY and cs.SY

Abstract

The Internet of Things (IoT) is gaining momentum in its quest to bridge the gap between the physical and the digital world. The main goal of the IoT is the creation of smart environments and self-aware things that facilitate a variety of services such as smart transport, climate monitoring, e-health, etc. Huge volumes of data are expected to be collected by the connected sensors/things; in traditional deployments this data is processed centrally by large data centers in the core network, which inevitably leads to excessive transport power consumption as well as added latency overheads. Instead, fog computing has been proposed by researchers from industry and academia to extend the capability of the cloud right to the point where the data is collected at the sensing layer. This way, primitive tasks that can be hosted in IoT sensors do not need to be sent all the way to the cloud for processing. In this paper we propose energy-efficient embedding of ML models over a cloud-fog network using a Mixed Integer Linear Programming (MILP) optimization model. We exploit virtualization in our framework to provide service abstraction of Deep Neural Network (DNN) layers that can be composed into a set of VMs interconnected by virtual links. We constrain the number of VMs that can be processed at the IoT layer and study the impact on the performance of the cloud-fog approach.
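To make the placement idea concrete, below is a minimal sketch of an energy-aware VM placement MILP in Python (PuLP). It is not the paper's formulation: all node names, capacities, power figures, and traffic values are illustrative assumptions, and inter-VM virtual links are simplified to traffic exchanged with the sensing layer. It only illustrates the general structure of the problem: binary placement variables, an energy objective combining processing and transport terms, capacity constraints, and a cap on the number of VMs hosted at the IoT layer.

```python
# Minimal sketch (not the authors' exact MILP): energy-aware placement of
# DNN-layer VMs across an IoT-fog-cloud hierarchy using PuLP.
# All names, capacities, power figures, and traffic values are illustrative.
import pulp

# Hosting layers with assumed compute capacity and energy per unit of processing.
nodes = {
    "iot":   {"capacity": 50,   "proc_energy": 0.5},
    "fog":   {"capacity": 500,  "proc_energy": 0.2},
    "cloud": {"capacity": 5000, "proc_energy": 0.1},
}

# One VM per DNN layer, with assumed compute demand.
vms = {"layer1": 20, "layer2": 60, "layer3": 120}

# Assumed network energy per unit of traffic between the sensing layer and each
# hosting layer (the cloud, being farthest, is the most expensive to reach).
net_energy = {"iot": 0.0, "fog": 0.3, "cloud": 1.0}
# Assumed traffic each VM exchanges with the sensing layer.
traffic = {"layer1": 10, "layer2": 5, "layer3": 2}

MAX_VMS_AT_IOT = 1  # cap on VMs hosted at the IoT layer, the knob studied in the paper

prob = pulp.LpProblem("dnn_vm_placement", pulp.LpMinimize)

# x[v][n] = 1 if VM v is placed on node n.
x = pulp.LpVariable.dicts("x", (vms, nodes), cat="Binary")

# Objective: processing energy + network (transport) energy.
prob += pulp.lpSum(
    x[v][n] * (vms[v] * nodes[n]["proc_energy"] + traffic[v] * net_energy[n])
    for v in vms for n in nodes
)

# Each VM is placed on exactly one node.
for v in vms:
    prob += pulp.lpSum(x[v][n] for n in nodes) == 1

# Node capacity constraints.
for n in nodes:
    prob += pulp.lpSum(x[v][n] * vms[v] for v in vms) <= nodes[n]["capacity"]

# Limit on how many VMs the IoT layer may host.
prob += pulp.lpSum(x[v]["iot"] for v in vms) <= MAX_VMS_AT_IOT

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for v in vms:
    for n in nodes:
        if pulp.value(x[v][n]) > 0.5:
            print(f"{v} -> {n}")
print("total energy:", pulp.value(prob.objective))
```

Sweeping MAX_VMS_AT_IOT in such a model mirrors the paper's study of how constraining IoT-layer hosting affects the energy performance of the cloud-fog approach.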
