Article

Coded Caching for Heterogeneous Systems: An Optimization Perspective

Journal

IEEE Transactions on Communications
Volume 67, Issue 8, Pages 5321-5335

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TCOMM.2019.2914393

Keywords

Coded caching; uncoded placement; cache size optimization; multicast networks

Funding

  1. National Science Foundation (NSF) grants 1526165 and 1749665
  2. NSF Directorate for Computer & Information Science & Engineering: Division of Computer and Network Systems [1526165]; Division of Computing and Communication Foundations [1749665]

Abstract

In cache-aided networks, the server populates the cache memories at the users during low-traffic periods in order to reduce the delivery load during peak-traffic hours. Consequently, there exists a fundamental tradeoff between the delivery load on the server and the cache sizes at the users. In this paper, we study this tradeoff in a multicast network where the server is connected to users with unequal cache sizes and the number of users is less than or equal to the number of library files. We propose centralized uncoded placement and linear delivery schemes that are optimized by solving a linear program. Additionally, we derive a lower bound on the delivery-memory tradeoff with uncoded placement that accounts for the heterogeneity in cache sizes. We explicitly characterize this tradeoff for the case of three end users, as well as for an arbitrary number of end users when the total memory size at the users is either small or large. Next, we consider a system where the server is connected to the users via rate-limited links of different capacities and assigns the users' cache sizes subject to a total cache budget. We characterize the optimal cache sizes that minimize the delivery completion time with uncoded placement and linear delivery. In particular, the optimal memory allocation strikes a balance between assigning larger cache sizes to users with low-capacity links and allocating the memory uniformly across users.
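
Illustration (not from the paper): the heterogeneous-cache problem above is formulated and solved as a linear program; the minimal sketch below only evaluates the well-known equal-cache special case it generalizes, namely the delivery load R = (K - t)/(t + 1) at cache sizes M = t*N/K for t = 0, ..., K, with memory sharing (linear interpolation) between neighboring corner points. The function name delivery_load and the example parameters are illustrative assumptions, not the paper's scheme.

# Minimal sketch, assuming a homogeneous system with K users, N >= K files,
# equal cache size M, and uncoded placement. Corner points: (t*N/K, (K-t)/(t+1)).
def delivery_load(K: int, N: int, M: float) -> float:
    """Delivery load R(M) via memory sharing between corner points."""
    assert 0 <= M <= N
    corners = [(t * N / K, (K - t) / (t + 1)) for t in range(K + 1)]
    for (m0, r0), (m1, r1) in zip(corners, corners[1:]):
        if m0 <= M <= m1:
            alpha = (M - m0) / (m1 - m0)   # memory-sharing coefficient
            return (1 - alpha) * r0 + alpha * r1
    return 0.0                              # fallback; the loop covers all 0 <= M <= N

if __name__ == "__main__":
    K, N = 3, 3                             # illustrative parameters
    for M in [0.0, 0.5, 1.0, 2.0, 3.0]:
        print(f"M = {M:.1f}  ->  R(M) = {delivery_load(K, N, M):.3f}")

For K = N = 3 the corner points are (0, 3), (1, 1), (2, 1/3) and (3, 0); intermediate cache sizes interpolate linearly between the two nearest corner points.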
