Article

LeaD: Large-Scale Edge Cache Deployment Based on Spatio-Temporal WiFi Traffic Statistics

Journal

IEEE TRANSACTIONS ON MOBILE COMPUTING
Volume 20, Issue 8, Pages 2607-2623

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TMC.2020.2984261

Keywords

Wireless fidelity; Cache storage; Lead; Switches; Quality of experience; Mobile computing; Large-scale WiFi system; edge cache deployment; caching gain maximization; Big Data analytics; stationary traffic consumption

Funding

  1. National Key R&D Program of China [2019YFA0706403]
  2. National Natural Science Foundation of China [91638204, 61702562, U19A2067]
  3. 111 project [B18059]
  4. Young Elite Scientists Sponsorship Program by CAST [2018QNRC001]
  5. Young Talents Plan of Hunan Province of China [2019RS2001]
  6. Natural Sciences and Engineering Research Council (NSERC) of Canada

Abstract

The paper explores cache deployment in a large-scale WiFi system, with 8,000 APs serving over 40,000 active users exhibiting heterogeneous traffic characteristics, to maximize long-term caching gain. The study shows that cache size should be allocated heterogeneously according to traffic demands and that short-term traffic statistics can be used to infer future traffic conditions. The proposed LeaD strategy achieves near-optimal caching performance and significantly outperforms benchmark strategies.
Widespread, large-scale WiFi systems have been deployed in many corporate locations, and backhaul capacity has become the bottleneck in providing high-rate data services to a tremendous number of WiFi users. Mobile edge caching is a promising solution to relieve backhaul pressure and deliver quality services by proactively pushing contents to access points (APs). However, how to deploy caches in a large-scale WiFi system is not yet well studied and is quite challenging, since numerous APs can have heterogeneous traffic characteristics and future traffic conditions are unknown in advance. In this paper, given a cache storage budget, we explore cache deployment in a large-scale WiFi system, which contains 8,000 APs and serves more than 40,000 active users, to maximize the long-term caching gain. Specifically, we first collect two months of user association records and conduct intensive spatio-temporal analytics on WiFi traffic consumption, from which we make two major observations. First, per-AP traffic consumption varies over a rather wide range, and the proportion of APs is distributed evenly within that range, indicating that cache size should be allocated heterogeneously in accordance with the underlying traffic demands. Second, compared to a single AP, the traffic consumption of a group of APs (clustered by physical location) is more stable, which means that short-term traffic statistics can be used to infer future long-term traffic conditions. We then propose our cache deployment strategy, named LeaD (Large-scale WiFi Edge cAche Deployment), in which we first cluster the large-scale set of APs into well-sized edge nodes, then conduct stationarity testing on edge-level traffic consumption and sample sufficient traffic statistics to precisely characterize long-term traffic conditions, and finally devise the TEG (Traffic-wEighted Greedy) algorithm to solve the long-term caching gain maximization problem. Extensive trace-driven experiments demonstrate that LeaD achieves near-optimal caching performance and significantly outperforms benchmark strategies.
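
The abstract describes TEG only at a high level, so the sketch below is a minimal illustration of what a traffic-weighted greedy allocation of a cache budget across edge nodes could look like, not the paper's actual algorithm. The Zipf-based hit-ratio model, the catalog size, the edge-node traffic values, and all function names are assumptions introduced here for illustration.

import heapq

def _harmonic(n, alpha):
    # Generalized harmonic number H_{n,alpha}, used to normalize Zipf popularity.
    return sum(1.0 / (k ** alpha) for k in range(1, n + 1))

def zipf_hit_ratio(cached, catalog_size, alpha=0.8):
    # Estimated hit ratio when the `cached` most popular items are stored,
    # assuming Zipf-distributed content requests (an illustrative model only).
    if cached <= 0:
        return 0.0
    return _harmonic(min(cached, catalog_size), alpha) / _harmonic(catalog_size, alpha)

def traffic_weighted_greedy(traffic, budget, catalog_size=10000):
    # Allocate `budget` unit cache blocks across edge nodes; each step gives one
    # block to the node with the largest marginal traffic-weighted gain,
    # i.e., traffic[i] * (hit_ratio(c_i + 1) - hit_ratio(c_i)).
    alloc = [0] * len(traffic)
    heap = [(-t * zipf_hit_ratio(1, catalog_size), i) for i, t in enumerate(traffic)]
    heapq.heapify(heap)
    for _ in range(budget):
        _, i = heapq.heappop(heap)
        alloc[i] += 1
        gain = traffic[i] * (zipf_hit_ratio(alloc[i] + 1, catalog_size)
                             - zipf_hit_ratio(alloc[i], catalog_size))
        heapq.heappush(heap, (-gain, i))
    return alloc

# Example: three hypothetical edge nodes with stationary traffic estimates
# (e.g., GB/day) and a total budget of 100 unit cache blocks; nodes with
# heavier traffic receive proportionally more cache.
print(traffic_weighted_greedy([120.0, 45.0, 300.0], budget=100))

Because the per-node gain in this simplified, separable objective is concave in the number of allocated blocks, the marginal-gain greedy rule is a natural fit; the paper's TEG algorithm targets the full long-term caching gain maximization problem described in the abstract.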
