Article

Decentralized Coded Caching for Shared Caches

Journal

IEEE COMMUNICATIONS LETTERS
Volume 25, Issue 5, Pages 1458-1462

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LCOMM.2021.3052237

Keywords

Prefetching; Servers; Indexes; Encoding; Cache memory; Receivers; Loading; Coded caching; decentralized caching; shared caching; index coding

Funding

  1. Science and Engineering Research Board (SERB), Government of India, through its Start-up Research Grant (SRG) [SRG/2020/000239]

Abstract

This study addresses network congestion caused by temporal variance in client demands in the client-server framework and proposes a decentralized shared caching scheme. Using index coding techniques, the proposed delivery scheme is shown to be optimal among all linear schemes and achieves a rate comparable to that of existing centralized prefetching schemes.
The demands of the clients in the client-server framework exhibit temporal variance, leading to congestion in the network at random intervals. To alleviate this problem, popular data is loaded into cache memories scattered across the network. In the conventional caching framework, each user has its own cache and cache loading is centrally coordinated. For large networks, a more practical approach is to make the loading of the caches decentralized. This letter considers the shared caching problem, in which each cache can serve multiple clients. A new delivery scheme is proposed for the decentralized shared caching problem and, using techniques from index coding, is shown to be optimal among all linear schemes. The rate achieved by the proposed scheme is shown to be comparable to that of the existing scheme, which uses centralized prefetching.
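
For intuition only, the sketch below illustrates the decentralized (uncoordinated) prefetching step in a shared-cache setting: every cache independently stores a random M/N fraction of each file, and several users read from the same cache. The parameters, the user-to-cache mapping, and the uncoded delivery baseline are illustrative assumptions; the letter's actual coded delivery scheme, built via index coding, is not reproduced here.

```python
import random

random.seed(0)

# Toy parameters (illustrative assumptions, not values from the paper).
N_FILES = 4           # files in the server library
BITS_PER_FILE = 1000  # toy file size in bits
NUM_CACHES = 2        # shared caches; several users attach to each cache
CACHE_FRACTION = 0.5  # M/N: fraction of the library each cache can store
USER_TO_CACHE = {0: 0, 1: 0, 2: 1, 3: 1}  # user -> attached shared cache
DEMANDS = {0: 0, 1: 1, 2: 2, 3: 3}        # user -> requested file index

# Decentralized prefetching: with no central coordination, every cache
# stores each bit of each file independently with probability M/N.
cache_contents = [
    {f: {b for b in range(BITS_PER_FILE) if random.random() < CACHE_FRACTION}
     for f in range(N_FILES)}
    for _ in range(NUM_CACHES)
]

# Uncoded delivery baseline: for each demand, the server unicasts the bits
# missing from the user's attached cache. A coded delivery scheme would
# instead multicast XORs of such missing bits to several users at once,
# reducing the total number of transmitted bits.
total_missing = 0
for user, file_id in DEMANDS.items():
    cached = cache_contents[USER_TO_CACHE[user]][file_id]
    missing = BITS_PER_FILE - len(cached)
    total_missing += missing
    print(f"user {user}: {missing}/{BITS_PER_FILE} bits of file {file_id} "
          f"missing from cache {USER_TO_CACHE[user]}")

print(f"uncoded delivery load: {total_missing / BITS_PER_FILE:.2f} files")
```

Running the sketch shows that each user is missing roughly a (1 - M/N) fraction of its requested file; a coded delivery phase exploits the overlap between what different caches hold to transmit fewer bits than this uncoded total.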
