
Computation Resource Allocation for Heterogeneous Time-Critical IoT Services in MEC (2002.04851v1)

Published 12 Feb 2020 in cs.NI

Abstract: Mobile edge computing (MEC) is a promising solution for processing computation-intensive tasks with low latency in emerging Internet-of-Things (IoT) use cases, e.g., virtual reality (VR), augmented reality (AR), and autonomous vehicles. Because heterogeneous services coexist in an MEC system, the task arrival interval and required execution time vary across services, which makes it challenging to schedule computation resources for services with stochastic arrivals and runtimes at an edge server (ES). In this paper, we propose a flexible computation offloading framework among users and ESs. Based on this framework, we propose a Lyapunov-based algorithm that dynamically allocates computation resources to heterogeneous time-critical services at the ES. The proposed algorithm minimizes the average timeout probability without any prior knowledge of the task arrival process or required runtime. Numerical results show that, compared with standard queuing models used at the ES, the proposed algorithm reduces the timeout probability by at least 35% and approaches the computation-resource utilization efficiency of a non-causal queuing model under various scenarios.
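
As a rough illustration of the queue-backlog-driven allocation the abstract describes, the sketch below splits an edge server's per-slot CPU capacity across per-service queues using only observed backlogs, with no prior knowledge of arrival rates or runtimes. This is a generic max-weight-style sketch, not the paper's exact algorithm; the capacity, service names, and arrival statistics are invented for the example.

```python
import random

# Hypothetical illustration (not the paper's algorithm): a backlog-driven allocator
# that splits an edge server's CPU cycles across per-service queues each time slot,
# serving the most backlogged service first. It observes only current queue lengths.

CPU_CYCLES_PER_SLOT = 10.0                      # assumed ES capacity per slot
SERVICES = ["VR", "AR", "vehicle"]
MEAN_DEMAND = {"VR": 3.0, "AR": 2.5, "vehicle": 1.5}   # assumed mean work per slot

queues = {s: 0.0 for s in SERVICES}             # backlogged work (cycles) per service

def allocate(queues, capacity):
    """Greedy max-weight rule: give capacity to services in order of backlog."""
    alloc = {s: 0.0 for s in queues}
    remaining = capacity
    for s in sorted(queues, key=queues.get, reverse=True):
        alloc[s] = min(queues[s], remaining)
        remaining -= alloc[s]
        if remaining <= 0.0:
            break
    return alloc

for t in range(1000):
    # New work arrives with random size; the allocator never sees these rates.
    arrivals = {s: random.expovariate(1.0 / MEAN_DEMAND[s]) for s in SERVICES}
    alloc = allocate(queues, CPU_CYCLES_PER_SLOT)
    for s in SERVICES:
        queues[s] = max(queues[s] - alloc[s], 0.0) + arrivals[s]

print({s: round(q, 1) for s, q in queues.items()})   # remaining backlog per service
```

In the paper's setting the allocation is driven by a Lyapunov drift analysis tied to timeout probability rather than a plain backlog-first rule; the sketch only shows the overall per-slot observe-allocate-update loop.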
