
Abstract

Coded caching (CC) schemes exploit the cumulative cache memory of the users and simple linear coding to turn unicast traffic (individual file requests) into a multicast transmission. For the originally proposed $K$-user single-server/single shared link network model, CC yields an $O(K)$ gain with respect to conventional uncoded caching with the same per-user memory. While several information-theoretic optimality results for a variety of problems and carefully crafted network topologies have been proved, the gains and suitability of CC for practical scenarios such as content streaming over existing wireless networks have not yet been fully demonstrated. In this work, we consider CC for on-demand video streaming over WLANs where multiple users are served simultaneously by multiple spatially distributed access points (APs). Users sequentially request video "chunks". The CC scheme operates above the IP layer, leaving the underlying standard physical layer and MAC layer untouched. The cache placement is completely asynchronous and decentralized, and the users are placed at random over the network coverage area. For such a system, we consider the region of achievable long-term average delivery rate (defined as the number of video chunks delivered per unit of time) and study the per-user rate distribution under proportional fairness scheduling. We also consider reduced-complexity scheduling strategies and compare them with standard state-of-the-art techniques such as conventional (uncoded) caching and collision avoidance by allocating APs on different sub-channels (i.e., frequency reuse).
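To make the coded caching idea in the abstract concrete, the following Python sketch (not taken from the paper) implements a toy version of decentralized coded caching in the style of Maddah-Ali and Niesen: every user independently caches a random fraction of each file's bits with no coordination, and delivery then multicasts XORs that are simultaneously useful to whole subsets of users. The library size, number of users, file length, and cache fraction below are hypothetical toy parameters chosen only for illustration.

```python
"""Toy sketch of decentralized coded caching (illustrative assumption, not the paper's system)."""
import itertools
import random

N_FILES = 4           # library size (hypothetical)
K_USERS = 3           # number of users (hypothetical)
F_BITS = 1200         # bits per file (hypothetical)
CACHE_FRACTION = 0.5  # M/N: fraction of each file cached by every user

random.seed(0)
library = [[random.getrandbits(1) for _ in range(F_BITS)] for _ in range(N_FILES)]

# Placement (decentralized, asynchronous): each user caches every bit of every
# file independently with probability M/N, with no coordination among users.
cache_mask = [[[random.random() < CACHE_FRACTION for _ in range(F_BITS)]
               for _ in range(N_FILES)] for _ in range(K_USERS)]

demands = [random.randrange(N_FILES) for _ in range(K_USERS)]  # one requested file per user

def subfile(user, others):
    """Bit indices of `user`'s demanded file cached at exactly the users in
    `others` (and at no other user, and not at `user` itself)."""
    f = demands[user]
    return [i for i in range(F_BITS)
            if not cache_mask[user][f][i]
            and all(cache_mask[j][f][i] == (j in others)
                    for j in range(K_USERS) if j != user)]

transmitted_bits = 0
# Delivery: for every subset S of users, send one XOR that is useful to all
# |S| users at once; each receiver strips the other terms using its cache.
for size in range(K_USERS, 0, -1):
    for S in itertools.combinations(range(K_USERS), size):
        segments = [[library[demands[k]][i] for i in subfile(k, set(S) - {k})]
                    for k in S]
        L = max(len(seg) for seg in segments)
        if L == 0:
            continue
        xor = [0] * L
        for seg in segments:              # zero-pad shorter segments and XOR
            for i, b in enumerate(seg):
                xor[i] ^= b
        transmitted_bits += L             # one multicast transmission of L bits

# Baseline: uncoded delivery must unicast every bit missing from each cache.
uncoded_bits = sum(
    sum(1 for i in range(F_BITS) if not cache_mask[k][demands[k]][i])
    for k in range(K_USERS))

print(f"coded multicast load : {transmitted_bits} bits")
print(f"uncoded unicast load : {uncoded_bits} bits")
```

In this toy setting the coded multicast load should typically come out well below the uncoded unicast load, which is the multicast gain behind the $O(K)$ improvement mentioned above; the paper studies how much of this gain survives when such a scheme runs above the IP layer of a real multi-AP WLAN.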

