Article

Prefetching and caching for minimizing service costs: Optimal and approximation strategies

Journal

PERFORMANCE EVALUATION
Volume 145, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.peva.2020.102149

Keywords

Caching; Prefetching; Optimal offline policy

Funding

  1. NSF, United States of America [CMMI-SMOR-1562065, CNS-ICN-WEN-1719371, CNS-NeTS-1514260, CNS-NeTS-1717045, CNS-NeTS-1717060, CNS-NeTS-2007231, CNS-SpecEES-1824337]
  2. ONR, United States of America [N00014-19-1-2621]
  3. DTRA, United States of America [HDTRA1-18-1-0050]


The paper studies strategic prefetching of data to improve caching performance and the design of optimal prefetching and caching policies. Using a cost-based service model, it characterizes an optimal cache eviction mechanism and proposes a lightweight approximation policy that runs in real time and is near-optimal. Extensive simulations verify its performance under various popularity distributions.
Strategically prefetching data has been utilized in practice to improve caching performance. Apart from caching data items upon requests, they can be prefetched into the cache before requests actually occur. The caching and prefetching operations compete for the limited cache space, whose size is typically much smaller than the number of data items. A key question is how to design an optimal prefetching and caching policy, assuming that the future requests can be predicted to a certain extent. This question is non-trivial even under an idealized assumption that the future requests are precisely known. To investigate this problem, we propose a cost-based service model. The objective is to find the optimal offline prefetching and caching policy that minimizes the accumulated cost for a given request sequence. By casting it as a min-cost flow problem, we are able to find the optimal policy for a data trace of length N in expected time O(N^{3/2}) via flow-based algorithms. However, this requires the entire trace for each request and cannot be applied in real time. To this end, we analytically characterize the optimal policy by obtaining an optimal cache eviction mechanism. We derive conditions under which proactive prefetching is a better choice than passive caching. Based on these insights, we propose a lightweight approximation policy that only exploits predictions in the near future. Moreover, the approximation policy can be applied in real time and processes the entire trace in O(N) expected time. We prove that the competitive ratio of the approximation policy is less than √2. Extensive simulations verify its near-optimal performance, for both heavy and light-tailed popularity distributions. (c) 2020 Elsevier B.V. All rights reserved.
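The abstract's notion of an optimal offline eviction mechanism builds on the classical farthest-in-future (Belady) rule: on a miss with a full cache, evict the item whose next request lies farthest ahead. The following is a minimal sketch of that baseline rule only, not the paper's cost-based or prefetching-aware policy; the function name and miss-count objective are illustrative assumptions.

```python
def belady_misses(requests, capacity):
    """Count cache misses under offline farthest-in-future eviction.

    On a miss with a full cache, evict the cached item whose next
    request is farthest in the future (or never occurs again).
    This is a simplified baseline: every miss costs one fetch and
    there is no prefetching.
    """
    cache = set()
    misses = 0
    for i, item in enumerate(requests):
        if item in cache:
            continue  # hit: no cost, no change
        misses += 1
        if len(cache) >= capacity:
            # Next-use index of a cached item; inf if it never recurs.
            def next_use(x):
                for j in range(i + 1, len(requests)):
                    if requests[j] == x:
                        return j
                return float("inf")
            cache.remove(max(cache, key=next_use))
        cache.add(item)
    return misses
```

For example, `belady_misses(list("abacbd"), 2)` evicts `a` (never requested again) rather than `b` when `c` arrives, yielding 4 misses. The paper's policy departs from this rule by also weighing prefetch costs and by only needing near-future predictions rather than the full trace.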
