
Online Caching and Coding at the WiFi Edge: Gains and Tradeoffs

(2001.07334)
Published Jan 21, 2020 in cs.NI

Abstract

Video content delivery at the wireless edge continues to be challenged by insufficient bandwidth and highly dynamic user behavior, which affect both effective throughput and latency. Caching at the network edge and coded transmissions have been found to improve the performance of video content delivery for users. The caches at the wireless edge stations (BSs, APs) and at the users' end devices can be populated by pre-caching content or by using online caching policies. In this paper, we propose a system where content is cached at the users of a WiFi network via online caching policies, and coded delivery is employed by the WiFi AP to deliver the requested content to the user population. The content of each user's cache serves as side information for index coding. We also propose the LFU-Index cache replacement policy at the user, which demonstrably improves index coding opportunities at the WiFi AP for the proposed system. Through an extensive simulation study, we determine the gains achieved by caching and by index coding. Next, we analyze the tradeoffs between them in terms of data transmitted, latency, and throughput for different content request behaviors from the users. We also show that the proposed cache replacement policy performs better than traditional cache replacement policies like LRU and LFU.
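To make the coded-delivery idea concrete, below is a minimal sketch (not the authors' implementation) of how a user's cache contents can act as side information for index coding at the AP: two requests are served with a single XOR-coded packet when each user already caches the item the other one wants. The content items, user names, and greedy pairing rule are illustrative assumptions.

```python
# Sketch of index-coded delivery using user caches as side information.
# LIBRARY, user names, and the greedy pairing heuristic are hypothetical,
# not taken from the paper.

from itertools import combinations


def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length payloads (padding/segmentation omitted)."""
    return bytes(x ^ y for x, y in zip(a, b))


# Hypothetical library of fixed-size content items.
LIBRARY = {"A": b"\x01" * 4, "B": b"\x02" * 4, "C": b"\x03" * 4}

# Hypothetical per-user cache contents (side information) and pending requests.
caches = {"u1": {"B"}, "u2": {"A"}, "u3": set()}
requests = {"u1": "A", "u2": "B", "u3": "C"}


def schedule_transmissions(caches, requests):
    """Greedily pair requests that can share one XOR-coded transmission.

    Requests from users u and v can be coded together when each user already
    caches the item the other wants; both can then decode the coded packet
    by XOR-ing it with their cached item.
    """
    pending = dict(requests)
    sends = []
    for u, v in combinations(list(pending), 2):
        if u not in pending or v not in pending:
            continue  # one of them was already paired
        want_u, want_v = pending[u], pending[v]
        if want_v in caches[u] and want_u in caches[v]:
            coded = xor_bytes(LIBRARY[want_u], LIBRARY[want_v])
            sends.append(("coded", (u, v), coded))
            del pending[u], pending[v]
    # Requests with no coding partner are sent uncoded.
    for u, item in pending.items():
        sends.append(("uncoded", (u,), LIBRARY[item]))
    return sends


for kind, users, payload in schedule_transmissions(caches, requests):
    print(kind, users, payload.hex())
```

In this sketch, u1 and u2 are served by a single coded packet while u3's request goes uncoded. The paper's LFU-Index replacement policy additionally biases which items are evicted so that such coding opportunities remain available; its exact eviction rule is not reproduced here.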
