
Energy-efficient Task Offloading for Relay Aided Mobile Edge Computing under Sequential Task Dependency (2103.03708v2)

Published 5 Mar 2021 in cs.IT and math.IT

Abstract: In this paper, we study a mobile edge computing (MEC) system in which a mobile device is assisted by a base station (BS) and a cooperative node. The mobile device has sequential tasks to complete, and the cooperative node assists it with both task offloading and task computation. Specifically, two cases are investigated: 1) the cooperative node has no tasks of its own to complete, and 2) the cooperative node has its own tasks to complete. Our goal is to minimize the total energy consumption of the mobile device and the cooperative node by optimizing the transmit duration in task offloading and the CPU frequency in task computing, along with the index of the task to offload within the sequential tasks. In the first case, we decompose the mixed-integer non-convex problem into two levels. In the lower-level problem, thanks to its convexity, the Karush-Kuhn-Tucker (KKT) conditions are used to simplify the problem, which is then solved with a bisection search. In the upper-level problem, rather than relying on a traversal method to determine the task index to offload, we develop a monotonic condition that simplifies the search. In the second case, to guarantee successful computation at both the mobile device and the cooperative node, the uplink transmission is classified into three schemes. Within each scheme, the non-convex problem is decomposed: in the lower-level problem, a semi-closed-form solution is found via the Lagrangian dual method, and in the upper-level problem, a traversal method is applied to find the optimal offloading index.
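
The paper provides no code; the Python sketch below is only meant to make the two-level structure described for the first case concrete: an outer search over the task index at which offloading starts, and an inner convex subproblem (transmission energy versus local CPU energy under a shared deadline) solved by bisection on its KKT stationarity condition. The energy model, all parameter values (B, N0, H, KAPPA, T, TASKS), and the early-stopping rule standing in for the paper's monotonic condition are illustrative assumptions, not the paper's actual formulation.

```python
import math

# --- Illustrative parameters (assumptions for this sketch, not from the paper) ---
B = 1e6        # channel bandwidth [Hz]
N0 = 1e-9      # noise-related coefficient folded into the rate model [W/Hz]
H = 1e-2       # channel power gain on the uplink
KAPPA = 1e-27  # effective switched capacitance of the local CPU
T = 0.1        # overall completion deadline [s]
# Sequential tasks: (bits to upload if offloaded, CPU cycles if computed locally)
TASKS = [(2e5, 1e8), (3e5, 2e8), (1e5, 5e7), (4e5, 3e8)]

A = N0 * B / H  # shorthand coefficient in the transmission-energy term


def total_energy(t_tx, bits, cycles):
    """Energy when `bits` are offloaded over duration t_tx and `cycles`
    are computed locally within the remaining time T - t_tx."""
    e_tx = t_tx * A * (2.0 ** (bits / (B * t_tx)) - 1.0)  # uplink energy
    f_local = cycles / (T - t_tx)                         # required CPU frequency
    e_cpu = KAPPA * cycles * f_local ** 2                 # dynamic CPU energy
    return e_tx + e_cpu


def d_energy(t_tx, bits, cycles):
    """Derivative of total_energy w.r.t. t_tx; monotone increasing by convexity,
    so its root is the KKT stationary point of the lower-level problem."""
    r = bits / (B * t_tx)
    d_tx = A * (2.0 ** r - 1.0) - A * r * math.log(2.0) * 2.0 ** r
    d_cpu = 2.0 * KAPPA * cycles ** 3 / (T - t_tx) ** 3
    return d_tx + d_cpu


def lower_level(bits, cycles, tol=1e-9, iters=100):
    """Bisection search on the stationarity condition over t_tx in (0, T)."""
    lo, hi = tol, T - tol
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if d_energy(mid, bits, cycles) < 0.0:
            lo = mid
        else:
            hi = mid
    t_star = 0.5 * (lo + hi)
    return t_star, total_energy(t_star, bits, cycles)


def upper_level(tasks):
    """Outer search over the offloading index k: tasks[:k] stay local,
    tasks[k:] are offloaded.  The early stop once energy turns upward is a
    simplification inspired by the paper's monotonic condition."""
    best = None
    for k in range(len(tasks) + 1):
        bits = sum(b for b, _ in tasks[k:])
        cycles = sum(c for _, c in tasks[:k])
        t_star, energy = lower_level(bits, cycles)
        if best is not None and energy > best[2]:
            break
        best = (k, t_star, energy)
    return best


if __name__ == "__main__":
    k, t_star, energy = upper_level(TASKS)
    print(f"offload from task {k}: transmit duration {t_star:.4f} s, "
          f"energy {energy:.3e} J")
```

Note that the sketch collapses everything the paper's lower-level problem actually optimizes (per-task CPU frequencies and the cooperative node's offloading and computation resources) into a single local-compute term, and it covers only the first case, where the cooperative node has no tasks of its own.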
