
Abstract

Recently, implementing Radio Access Network (RAN) functionalities on cloud-based computing platforms has emerged as a solution that leverages the many advantages of cloud infrastructure, such as shared computing resources and storage capacity, while lowering operational cost. In this paper, we propose a novel caching framework aimed at fully exploiting the potential of such Cloud-based RAN (C-RAN) systems through cooperative hierarchical caching, which minimizes the network cost of content delivery and improves users' Quality of Experience (QoE). In particular, we treat the cloud-cache in the cloud processing unit (CPU) as a new layer in the RAN cache hierarchy, bridging the capacity-performance gap between traditional edge-based and core-based caching schemes. A delay cost model is introduced to characterize and formulate the cache placement optimization problem, which is shown to be NP-complete. As such, a low-complexity, heuristic cache management strategy is proposed, consisting of a proactive cache distribution algorithm and a reactive cache replacement algorithm. Extensive numerical simulations are carried out using both real-world YouTube video requests and synthetic content requests. It is demonstrated that our proposed caching strategy, Octopus, significantly outperforms traditional caching strategies in terms of cache hit ratio, average content access delay, and backhaul traffic load.
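To make the reactive side of such a scheme concrete, the following is a minimal sketch of a delay-cost-aware cache replacement policy. The class name, the popularity-times-delay eviction score, and all parameters are illustrative assumptions for this sketch, not the paper's actual Octopus algorithm.

```python
class DelayAwareCache:
    """Hypothetical reactive cache: on a miss, evict the cached item
    whose expected delay saving (request count * fetch delay cost)
    is lowest. An illustrative sketch, not the paper's algorithm."""

    def __init__(self, capacity):
        self.capacity = capacity
        # content_id -> (request_count, fetch_delay_cost)
        self.store = {}

    def request(self, content_id, fetch_delay_cost):
        """Return True on a cache hit; on a miss, cache the content,
        evicting the lowest-scoring item if the cache is full."""
        if content_id in self.store:
            count, cost = self.store[content_id]
            self.store[content_id] = (count + 1, cost)
            return True
        if len(self.store) >= self.capacity:
            # Victim: smallest popularity * delay-saving product.
            victim = min(self.store,
                         key=lambda c: self.store[c][0] * self.store[c][1])
            del self.store[victim]
        self.store[content_id] = (1, fetch_delay_cost)
        return False
```

Under this scoring rule, a rarely requested item that is cheap to re-fetch from the core is evicted before a popular item whose backhaul fetch delay is high, which is the intuition behind delay-cost-driven replacement.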

