Journal
IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY
Volume 68, Issue 2, Pages 1930-1941
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TVT.2018.2890685
Keywords
Mobile edge computing; energy harvesting; reinforcement learning; Internet of Things; computation offloading
Funding
- National Natural Science Foundation of China [61671396, 91638204, 61761136012, 61533015, 61572538]
- National Mobile Communications Research Laboratory, Southeast University [2018D08]
- Fundamental Research Funds for the Central Universities [17LGJC23]
- Guangdong Special Support Program [2017TX04X148]
Abstract
Internet of Things (IoT) devices can apply mobile edge computing (MEC) and energy harvesting (EH) to provide a high-quality experience for computation-intensive applications while prolonging battery lifetime. In this paper, we propose a reinforcement learning (RL) based offloading scheme for an IoT device with EH that selects the edge device and the offloading rate according to the current battery level, the previous radio transmission rate to each edge device, and the predicted amount of harvested energy. This scheme enables the IoT device to optimize the offloading policy without knowledge of the MEC model, the energy consumption model, or the computation latency model. We also present a deep RL-based offloading scheme to further accelerate learning. Performance bounds in terms of energy consumption, computation latency, and utility are provided for three typical offloading scenarios and verified via simulations for an IoT device that uses wireless power transfer for energy harvesting. Simulation results show that the proposed RL-based offloading scheme reduces the energy consumption, computation latency, and task drop rate, and thus increases the utility of the IoT device in a dynamic MEC environment, in comparison with the benchmark offloading schemes.
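The abstract describes a model-free RL agent that maps (battery level, past transmission rates, predicted harvested energy) to an (edge device, offloading rate) action. As a minimal illustrative sketch, the following tabular Q-learning loop captures that structure; the state discretization, action grid, and reward shaping here are assumptions for illustration, not the paper's actual formulation (the paper additionally uses a deep RL variant to speed up learning).

```python
import random
from collections import defaultdict

# Hypothetical sketch of an RL offloading policy: the agent picks an
# (edge device, offloading-rate level) pair from a discretized state,
# learning purely from observed rewards -- no MEC, energy, or latency
# model is required. All constants below are illustrative assumptions.

N_EDGES = 3          # candidate edge devices
N_RATES = 4          # discrete offloading-rate levels (0 = fully local)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1   # learning rate, discount, exploration

ACTIONS = [(e, r) for e in range(N_EDGES) for r in range(N_RATES)]
Q = defaultdict(float)  # Q[(state, action)] -> estimated value

def choose_action(state):
    """Epsilon-greedy choice over (edge device, offloading rate) pairs.

    `state` is any hashable discretization of (battery level, previous
    transmission rate to each edge device, predicted harvested energy).
    """
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard Q-learning update.

    `reward` would combine (negated) energy consumption, computation
    latency, and a task-drop penalty, as the utility in the abstract.
    """
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

A training episode would alternate `choose_action` on the current discretized state with `update` after observing the resulting energy/latency cost, so the policy improves without ever modeling the channel or the edge servers explicitly.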