4.4 Article

Proactive edge computing in fog networks with latency and reliability guarantees

Publisher

SPRINGER
DOI: 10.1186/s13638-018-1218-y

Keywords

5G; Caching; Fog networks; IoT; Hedged requests; Matching theory; Offloading; Resource allocation

Funding

  1. Academy of Finland (CARMA) project
  2. US Office of Naval Research (ONR) [N00014-15-1-2709]
  3. NOKIA donation on fog (FOGGY project)
  4. National Research Foundation of Korea [21A20131612192] Funding Source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)

Abstract

This paper studies the problem of task distribution and proactive edge caching in fog networks with latency and reliability constraints. In the proposed approach, user nodes (UNs) offload their computing tasks to edge computing servers (cloudlets). Cloudlets leverage their computing and storage capabilities to proactively compute and store cacheable computing results. To this end, a task popularity estimation scheme and a caching policy are proposed. Furthermore, the problem of distributing UNs' tasks to cloudlets is modeled as a one-to-one matching game. In this game, UNs whose requests exceed a delay threshold use hedged requests to enqueue their request at a second cloudlet and offload the task data to whichever cloudlet becomes available first. A matching algorithm based on deferred acceptance is used to solve this game. Simulation results show that the proposed approach guarantees reliable service with minimal latency, achieving up to 50% and 65% reductions in the average delay and the 99th-percentile delay, respectively, compared to reactive baseline schemes.
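The two mechanisms highlighted in the abstract, deferred-acceptance matching of UNs to cloudlets and the hedged-request fallback, can be illustrated with a small sketch. The Python snippet below is not the paper's implementation: the preference lists, delay values, backup-cloudlet choices, and the delay threshold are assumptions made for illustration, and "whichever cloudlet is available first" is approximated here by picking the lower expected delay.

```python
# Illustrative sketch only: UN-proposing deferred-acceptance (Gale-Shapley style)
# one-to-one matching, followed by a hedged-request step. All data and parameter
# names below are assumptions, not values from the paper.

def deferred_acceptance(un_prefs, cloudlet_prefs):
    """Match each UN to at most one cloudlet.
    un_prefs:       {un: [cloudlets, most preferred first]}
    cloudlet_prefs: {cloudlet: [uns, most preferred first]}
    Returns {un: cloudlet}; UNs that exhaust their list stay unmatched."""
    rank = {c: {u: i for i, u in enumerate(p)} for c, p in cloudlet_prefs.items()}
    next_idx = {u: 0 for u in un_prefs}   # next cloudlet each UN will propose to
    held = {}                             # cloudlet -> tentatively accepted UN
    free = list(un_prefs)

    while free:
        u = free.pop()
        if next_idx[u] >= len(un_prefs[u]):
            continue                      # u exhausted its list; stays unmatched
        c = un_prefs[u][next_idx[u]]
        next_idx[u] += 1
        if c not in held:
            held[c] = u                   # cloudlet tentatively accepts u
        elif rank[c].get(u, float("inf")) < rank[c].get(held[c], float("inf")):
            free.append(held[c])          # displaced UN will propose again
            held[c] = u
        else:
            free.append(u)                # rejected; u tries its next choice

    return {u: c for c, u in held.items()}


def apply_hedging(matching, expected_delay, backup, threshold):
    """If a UN's expected delay at its matched cloudlet exceeds `threshold`,
    it also enqueues at a backup cloudlet and is served by whichever offers
    the lower delay (a stand-in for 'whichever is available first')."""
    served = {}
    for u, c in matching.items():
        if expected_delay[(u, c)] > threshold and u in backup:
            b = backup[u]
            served[u] = min((c, b), key=lambda x: expected_delay[(u, x)])
        else:
            served[u] = c
    return served


if __name__ == "__main__":
    # Toy instance: three UNs, two cloudlets (one UN remains unmatched).
    un_prefs = {"UN1": ["C1", "C2"], "UN2": ["C1", "C2"], "UN3": ["C2", "C1"]}
    cloudlet_prefs = {"C1": ["UN2", "UN1", "UN3"], "C2": ["UN1", "UN3", "UN2"]}
    match = deferred_acceptance(un_prefs, cloudlet_prefs)

    delay = {("UN1", "C1"): 7.0, ("UN1", "C2"): 12.0,
             ("UN2", "C1"): 5.0, ("UN2", "C2"): 9.0,
             ("UN3", "C1"): 8.0, ("UN3", "C2"): 6.0}
    # UN1 exceeds the 10 ms threshold at its matched cloudlet C2, so its hedged
    # request ends up served by the backup cloudlet C1.
    served = apply_hedging(match, delay, backup={"UN1": "C1"}, threshold=10.0)
    print(match)
    print(served)
```

The sketch keeps the key property of deferred acceptance, namely that a cloudlet only trades its tentative UN for one it ranks higher, and layers the hedged-request fallback on top of the resulting matching rather than inside it, which is one simple way to read the description in the abstract.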

