4.7 Article; Proceedings Paper

Echo State Networks for Proactive Caching in Cloud-Based Radio Access Networks With Mobile Users

Journal

IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS
Volume 16, Issue 6, Pages 3520-3535

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TWC.2017.2683482

Keywords

CRAN; mobility; caching; echo state networks; machine learning

Funding

  1. National Natural Science Foundation of China [61671086, 61629101]
  2. European Research Council (ERC) Starting Grant, project MORE (Advanced Mathematical Tools for Complex Network Engineering) [305123]
  3. U.S. National Science Foundation [IIS-1633363, CNS-1460316, CNS-1513697]
  4. U.S. National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computer and Network Systems [CNS-1460316]

Abstract

In this paper, the problem of proactive caching is studied for cloud radio access networks (CRANs). In the studied model, the baseband units (BBUs) can predict the content request distribution and mobility pattern of each user and determine which content to cache at remote radio heads and the BBUs. This problem is formulated as an optimization problem, which jointly incorporates backhaul and fronthaul loads and content caching. To solve this problem, an algorithm that combines the machine learning framework of echo state networks (ESNs) with sublinear algorithms is proposed. Using ESNs, the BBUs can predict each user's content request distribution and mobility pattern while having only limited information on the network's and user's state. In order to predict each user's periodic mobility pattern with minimal complexity, the memory capacity of the corresponding ESN is derived for a periodic input. This memory capacity is shown to capture the maximum amount of user information needed for the proposed ESN model. Then, a sublinear algorithm is proposed to determine which content to cache while using limited content request distribution samples. Simulation results using real data from Youku and the Beijing University of Posts and Telecommunications show that the proposed approach yields significant gains, in terms of sum effective capacity, that reach up to 27.8% and 30.7%, respectively, compared with two baseline algorithms: random caching with clustering and random caching without clustering.
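The predictor at the heart of the abstract is an echo state network: a fixed, randomly generated recurrent reservoir whose only trained part is a linear readout. The sketch below is a minimal NumPy illustration of that idea applied to a strictly periodic "mobility" trace, in the spirit of the paper's periodic-input analysis. All dimensions, the spectral-radius scaling, the ridge regularizer, and the synthetic one-hot location sequence are illustrative assumptions, not the authors' actual model, parameters, or data.

```python
# Minimal echo state network (ESN) sketch: random reservoir + ridge-regression
# readout, used here to predict the next location in a periodic mobility trace.
# Sizes and hyperparameters below are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res, n_out = 4, 200, 4                      # input, reservoir, output dims (assumed)
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))        # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))          # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius < 1 (echo state property)

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)               # leaky integration omitted for brevity
        states.append(x.copy())
    return np.array(states)

# Toy periodic trace: one-hot location indices repeating with period 4,
# standing in for a user's recurring daily movement pattern.
T = 400
locations = np.eye(n_in)[np.arange(T) % 4]
targets = np.roll(locations, -1, axis=0)            # task: predict the next location

X = run_reservoir(locations)
ridge = 1e-6                                        # ridge-regression readout (closed form)
W_out = targets.T @ X @ np.linalg.inv(X.T @ X + ridge * np.eye(n_res))

pred = X @ W_out.T
print("next-location accuracy:", (pred.argmax(1) == targets.argmax(1)).mean())
```

The readout is solved in closed form by ridge regression, which is the standard ESN training procedure the framework relies on; the paper's further contributions, such as the memory-capacity characterization for periodic inputs and the sublinear algorithm for choosing cache contents from limited request samples, are not reproduced in this sketch.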

Authors

Mingzhe Chen, Walid Saad, Changchuan Yin, and Mérouane Debbah
