Emergent Mind

Multi-access Coded Caching with Decentralized Prefetching

(2203.16845)
Published Mar 31, 2022 in cs.IT and math.IT

Abstract

An extension of coded caching referred to as multi-access coded caching, where each user can access multiple caches and each cache can serve multiple users, is considered in this paper. Most of the literature on multi-access coded caching focuses on cyclic wrap-around cache access, where each user is allowed to access only an exclusive set of consecutive caches. In this paper, a more general framework for the multi-access caching problem is considered, in which each user is allowed to connect to an arbitrary set of a specific number of caches, and multiple users can access the same set of caches. For the proposed system model with decentralized prefetching, a new delivery scheme is proposed and an expression for the per-user delivery rate is obtained. A lower bound on the delivery rate is derived using techniques from index coding. The proposed scheme is shown to be optimal among all linear schemes under certain conditions. An improved delivery rate and a lower bound for the decentralized multi-access coded caching scheme with cyclic wrap-around cache access are obtained as a special case. By assigning specific values to certain parameters, the results for the decentralized shared caching scheme and the conventional decentralized caching scheme can be recovered.
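The core placement mechanism the abstract refers to can be illustrated with a small simulation. In decentralized prefetching, each cache independently stores each bit of a file with probability M/N (its memory fraction); in the multi-access setting, a user connected to z caches can recover any bit stored in at least one of them, so the fraction of a file it finds cached concentrates around 1 - (1 - M/N)^z. The sketch below is illustrative only, with hypothetical parameters; it is not the paper's delivery scheme or rate expression.

```python
import random

def decentralized_prefetch(num_caches, file_bits, mem_fraction, rng):
    # Decentralized placement: each cache independently stores each bit
    # of the file with probability mem_fraction (M/N).
    return [
        {b for b in range(file_bits) if rng.random() < mem_fraction}
        for _ in range(num_caches)
    ]

def accessible_bits(caches, user_cache_ids):
    # A multi-access user recovers a bit if ANY of its z caches holds it.
    out = set()
    for c in user_cache_ids:
        out |= caches[c]
    return out

rng = random.Random(0)
K, F, p, z = 8, 200_000, 0.25, 3      # hypothetical: 8 caches, p = M/N, z accessed
caches = decentralized_prefetch(K, F, p, rng)
user_bits = accessible_bits(caches, [0, 1, 2])   # one user connected to 3 caches
cached_fraction = len(user_bits) / F
# A bit is missing from all z caches with probability (1 - p)^z, so
# cached_fraction should be close to 1 - (1 - p)^z (about 0.578 here).
```

The uncached fraction (1 - p)^z is what the delivery phase must serve over the shared link, which is why accessing more caches reduces the per-user delivery rate.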
