Journal
MATHEMATICS
Volume 11, Issue 1, Pages -
Publisher
MDPI
DOI: 10.3390/math11010156
Keywords
Internet of Things; quantum computing; edge computing; optimization; fog computing
Abstract
IoT-Edge-Fog Computing presents a trio-logical model for decentralized computing in a time-sensitive manner. However, given the rising need for real-time information processing and decision modeling, task allocation among dispersed Edge Computing nodes remains a major challenge. State-of-the-art task allocation techniques such as Min-Max, Minimum Completion Time, and Round Robin perform task allocation, but several limitations persist, including high energy consumption, delay, and error rate. Hence, the current work provides a Quantum Computing-inspired optimization technique for efficient task allocation in an Edge Computing environment for real-time IoT applications. Furthermore, a QC-Neural Network Model is employed to predict optimal computing nodes for delivering real-time services. To evaluate performance, simulations were performed employing 6, 10, 14, and 20 Edge nodes at different times to schedule more than 600 heterogeneous tasks. Empirical results show an average improvement of 5.02% in prediction efficiency and an error reduction of 2.03% in comparison to state-of-the-art techniques.
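For context, the baseline heuristics the abstract compares against (Round Robin and Minimum Completion Time) can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the task sizes, node speeds, and function names are assumptions chosen for the example.

```python
# Hedged sketch of two classical task-allocation baselines named in the
# abstract: Round Robin and Minimum Completion Time (MCT). All numbers
# below are illustrative, not data from the paper.

def round_robin(task_sizes, num_nodes):
    """Assign task i to node i % num_nodes, ignoring task size."""
    return [i % num_nodes for i in range(len(task_sizes))]

def minimum_completion_time(task_sizes, node_speeds):
    """Greedily assign each task to the node that would finish it earliest."""
    finish = [0.0] * len(node_speeds)  # current finish time per node
    assignment = []
    for size in task_sizes:
        # pick the node minimizing (current finish time + task runtime)
        best = min(range(len(node_speeds)),
                   key=lambda n: finish[n] + size / node_speeds[n])
        finish[best] += size / node_speeds[best]
        assignment.append(best)
    return assignment

if __name__ == "__main__":
    tasks = [4.0, 2.0, 8.0, 1.0, 6.0]   # arbitrary task sizes
    speeds = [1.0, 2.0, 4.0]            # arbitrary node speeds
    print(round_robin(tasks, 3))
    print(minimum_completion_time(tasks, speeds))
```

MCT accounts for node load and speed, while Round Robin does not; the paper's QC-inspired optimizer targets the energy, delay, and error limitations such heuristics leave open.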