Proceedings Paper

HybridCache: AI-Assisted Cloud-RAN Caching with Reduced In-Network Content Redundancy

Publisher

IEEE
DOI: 10.1109/GLOBECOM42002.2020.9322595

Keywords

C-RAN; caching; clustering; LSTM; prediction

Funding

  1. Cisco Systems

Abstract

The ever-increasing growth of urban populations coupled with recent mobile data usage trends has led to an unprecedented increase in wireless devices, services, and applications, with varying quality-of-service needs in terms of latency, data rate, and connectivity. To cope with these rising demands and challenges, next-generation wireless networks have resorted to cloud radio access network (Cloud-RAN) technology as a way of reducing latency and network traffic. A concrete example of this is New York City's LinkNYC network infrastructure, which replaces the city's payphones with kiosk-like structures, called Links, to provide fast and free public Wi-Fi access to city users. When enabled with data storage capability, these Links can, for example, play the role of edge cloud devices to allow in-network content caching so that access latency and network traffic are reduced. In this paper, we propose HybridCache, a hybrid proactive and reactive in-network caching scheme that substantially reduces content access latency and network traffic congestion. It does so by first grouping edge cloud devices into clusters to minimize intra-cluster content access latency, and then enabling cooperative caching, both proactive and reactive, using LSTM-based prediction to minimize in-network content redundancy. Using the LinkNYC network as the backbone infrastructure for evaluation, we show that HybridCache reduces the number of hops that content needs to traverse and increases cache hit rates, thereby reducing both network traffic and content access latency.
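To make the hybrid proactive/reactive idea in the abstract concrete, the sketch below pairs a small LSTM demand forecaster with an edge cache that pre-fetches the predicted most popular items (proactive path) and still admits content on misses (reactive path). This is an illustrative reconstruction under assumptions, not the authors' implementation: the class names DemandLSTM and HybridEdgeCache, the use of PyTorch, the per-interval request-count input, and the capacity-based admission rule are all hypothetical.

```python
# Minimal sketch (not the paper's code) of LSTM-driven proactive caching
# combined with reactive admission on cache misses.
import torch
import torch.nn as nn


class DemandLSTM(nn.Module):
    """Predicts next-interval request counts for each content item."""

    def __init__(self, n_contents: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_contents, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_contents)

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, time_steps, n_contents) observed request counts
        out, _ = self.lstm(history)
        return self.head(out[:, -1, :])  # estimate for the next interval


class HybridEdgeCache:
    """Cache with a proactive (prediction-driven) and a reactive path."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store: set[int] = set()  # cached content ids

    def proactive_refresh(self, predicted_counts: torch.Tensor) -> None:
        # Pre-fetch the k items the LSTM expects to be most popular.
        top = torch.topk(predicted_counts, self.capacity).indices.tolist()
        self.store = set(top)

    def request(self, content_id: int) -> bool:
        hit = content_id in self.store
        if not hit and len(self.store) < self.capacity:
            self.store.add(content_id)  # reactive admission on a miss
        return hit


if __name__ == "__main__":
    n_contents, steps = 100, 24
    model = DemandLSTM(n_contents)
    history = torch.rand(1, steps, n_contents)  # synthetic request history
    with torch.no_grad():
        forecast = model(history).squeeze(0)
    cache = HybridEdgeCache(capacity=10)
    cache.proactive_refresh(forecast)
    print("hit on item 3:", cache.request(3))
```

In the paper's setting, such a forecaster and cache would run per cluster of Links, with cluster members cooperating so that a given content item is not redundantly stored on every node; the single-node sketch above omits that coordination.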

