
SliceOps: Explainable MLOps for Streamlined Automation-Native 6G Networks

(2307.01658)
Published Jul 4, 2023 in cs.NI and eess.SP

Abstract

Sixth-generation (6G) network slicing is the backbone of future communications systems. It ushers in the era of extreme ultra-reliable and low-latency communication (xURLLC) and underpins the digitalization of various vertical, immersive use cases. Since 6G natively supports AI, we propose a systematic and standalone slice, termed SliceOps, that is natively embedded in the 6G architecture and manages the entire AI lifecycle by monitoring, re-training, and deploying ML models as a service for the 6G slices. By leveraging machine learning operations (MLOps) in conjunction with eXplainable AI (XAI), SliceOps tackles the opaqueness of black-box AI through explanation-guided reinforcement learning (XRL), providing transparency, trustworthiness, and interpretability in the network slicing ecosystem. This article first elaborates on the architectural and algorithmic aspects of SliceOps. The operation of the deployed cloud-native SliceOps is then exemplified via a latency-aware resource allocation problem, in which deep RL (DRL)-based SliceOps agents within slices provide AI services that allocate optimal radio resources and prevent service quality degradation. Simulation results demonstrate the effectiveness of SliceOps-driven slicing. The article then discusses the challenges and limitations of SliceOps and, finally, identifies key open research directions for the proposed approach.
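
To make the latency-aware resource allocation idea concrete, below is a minimal sketch of an RL agent that allocates radio resource blocks (PRBs) to a slice so that latency stays within an SLA budget. It is not the paper's implementation: the toy slice model, reward shaping, constants, and the use of tabular Q-learning (in place of the article's deep RL and XRL components) are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): tabular Q-learning agent
# that picks a PRB allocation per discretized traffic-load level, trading off
# latency-SLA violations against resource cost.
import numpy as np

rng = np.random.default_rng(0)

N_LOAD_STATES = 10                       # discretized traffic-load levels (assumption)
ACTIONS = np.array([2, 4, 6, 8, 10])     # candidate PRB allocations (assumption)
LATENCY_BUDGET = 1.0                     # SLA latency budget, arbitrary units (assumption)

def step(load_state, prbs):
    """Toy slice model: capacity grows with PRBs, latency ~ 1/(capacity - load)."""
    load = (load_state + 1) / N_LOAD_STATES * 8.0   # offered load
    capacity = prbs * 1.2                           # served capacity
    latency = 1.0 / (capacity - load) if capacity > load else 10.0  # congestion penalty
    # Reward penalizes latency, resource usage, and SLA violations.
    reward = -latency - 0.05 * prbs - (5.0 if latency > LATENCY_BUDGET else 0.0)
    next_state = rng.integers(N_LOAD_STATES)        # i.i.d. traffic for simplicity
    return reward, next_state

# Tabular Q-learning loop (stands in for the DRL agent described in the article).
Q = np.zeros((N_LOAD_STATES, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.9, 0.1
state = rng.integers(N_LOAD_STATES)
for _ in range(20000):
    a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(Q[state].argmax())
    reward, nxt = step(state, ACTIONS[a])
    Q[state, a] += alpha * (reward + gamma * Q[nxt].max() - Q[state, a])
    state = nxt

# Learned policy: PRBs chosen per traffic-load level.
print({s: int(ACTIONS[Q[s].argmax()]) for s in range(N_LOAD_STATES)})
```

In the article's setting, the Q-table would be replaced by a deep network, and an XAI module (the explanation-guided RL component) would attribute each allocation decision to its input features to keep the policy transparent and auditable.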
