Journal
IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING
Volume 8, Issue 2, Pages 1239-1252
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCCN.2021.3130995
Keywords
Actor-critic algorithm; branching neural network; reinforcement learning; mobile edge caching
Funding
- National Key R&D Program of China [2020YFB1807601]
- Innovation Project of Guangdong Educational Department [2019KTSCX147]
- Shenzhen Science and Technology Program [JCYJ20210324095209025]
- ZTE Industry-Academia-Research Cooperation Funds
This research proposes an actor-critic reinforcement-learning-based proactive caching policy for mobile edge networks that minimizes caching cost and expected downloading delay without prior knowledge of users' content demand. Numerical results show that the algorithm significantly reduces the total cost and the average downloading delay.
Mobile edge caching/computing (MEC) has emerged as a promising approach for handling drastically increasing mobile data traffic by bringing high caching and computing capabilities to the edge of networks. Under the MEC architecture, content providers (CPs) are allowed to lease virtual machines (VMs) at MEC servers to proactively cache popular contents, improving users' quality of experience. This scalable cache resource model raises the challenge of determining the ideal number of leased VMs for CPs so as to obtain the minimum expected downloading delay for users at the lowest caching cost. To address this challenge, in this paper we propose an actor-critic (AC) reinforcement-learning-based proactive caching policy for mobile edge networks that requires no prior knowledge of users' content demand. Specifically, we formulate the proactive caching problem under dynamic user content demand as a Markov decision process and propose an AC-based caching algorithm to minimize the caching cost and the expected downloading delay. In particular, to reduce the computational complexity, a branching neural network is employed to approximate the policy function in the actor part. Numerical results show that the proposed caching algorithm significantly reduces the total cost and the average downloading delay compared with other popular algorithms.
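The branching idea in the actor can be illustrated with a minimal sketch: instead of one softmax over the exponentially large joint caching action space, a shared trunk feeds several small per-decision heads, each with its own softmax. All dimensions, layer sizes, and sub-action meanings below are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

# Hypothetical dimensions -- illustrative only, not from the paper.
STATE_DIM = 16          # features summarizing recent content demand
NUM_BRANCHES = 4        # one branch per caching decision (e.g. per content slot)
ACTIONS_PER_BRANCH = 3  # e.g. evict / keep / prefetch

rng = np.random.default_rng(0)

# A joint softmax would need ACTIONS_PER_BRANCH ** NUM_BRANCHES outputs;
# branching needs only NUM_BRANCHES * ACTIONS_PER_BRANCH output units.
W_trunk = rng.normal(scale=0.1, size=(STATE_DIM, 32))
W_heads = rng.normal(scale=0.1, size=(NUM_BRANCHES, 32, ACTIONS_PER_BRANCH))

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def branching_policy(state):
    """Return one probability vector per branch (one per sub-action)."""
    h = np.tanh(state @ W_trunk)  # shared representation across branches
    return [softmax(h @ W_heads[b]) for b in range(NUM_BRANCHES)]

state = rng.normal(size=STATE_DIM)
probs = branching_policy(state)
# One sub-action chosen per branch; together they form the joint action.
action = [int(np.argmax(p)) for p in probs]
```

The design point is the output-size reduction: the joint action factorizes across branches, so the network's output layer grows linearly rather than exponentially in the number of caching decisions.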