Article

Technical Study of Deep Learning in Cloud Computing for Accurate Workload Prediction

Journal

ELECTRONICS
Volume 12, Issue 3, Pages -

Publisher

MDPI
DOI: 10.3390/electronics12030650

Keywords

deep learning; workload prediction; cloud computing; machine learning

Abstract

Proactive resource management in Cloud Services not only maximizes cost effectiveness but also helps to overcome issues such as Service Level Agreement (SLA) violations and resource provisioning. Workload prediction using Deep Learning (DL) is a popular method of inferring the complicated multidimensional data of cloud environments to meet this requirement. The overall quality of the model depends on the quality of the data as much as on the architecture, so the data used to train the model must be of good quality. However, existing works in this domain have either used a single data source or have not taken into account the importance of uniformity for unbiased and accurate analysis, and the efficacy of DL models suffers as a result. In this paper, we provide a technical analysis of using DL models such as Recurrent Neural Networks (RNN), Multilayer Perceptron (MLP), Long Short-Term Memory (LSTM), and Convolutional Neural Networks (CNN) to exploit the time series characteristics of real-world workloads from the Parallel Workloads Archive, stored in the Standard Workload Format (SWF), with the aim of conducting an unbiased analysis. The robustness of these models is evaluated using the Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) metrics. The findings highlight that the LSTM model exhibits the best performance compared to the other models. Additionally, to the best of our knowledge, insights into DL for workload prediction in cloud computing environments are insufficient in the literature. To address these challenges, we provide a comprehensive background on resource management and load prediction using DL, and then break down the models, error metrics, and data sources used across different bodies of work.
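The abstract mentions exploiting the time series characteristics of SWF workloads with RNN, MLP, LSTM, and CNN models. A common way to frame such a series for these forecasters is a sliding window of past values predicting the next one; the sketch below illustrates that standard framing in plain Python (the function name and window size are illustrative assumptions, not taken from the paper):

```python
def make_windows(series, window):
    # Convert a 1-D workload series into (input window, next value) pairs,
    # the usual supervised framing for RNN/LSTM/CNN forecasters.
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])   # `window` past observations
        y.append(series[i + window])     # value to predict
    return X, y

# Example: a toy workload trace with a window of 2 past observations.
X, y = make_windows([1, 2, 3, 4, 5], window=2)
# X = [[1, 2], [2, 3], [3, 4]], y = [3, 4, 5]
```

Each row of `X` would then be fed to the model to predict the corresponding entry of `y`.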
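The models are evaluated with MAE and RMSE. These metrics have standard definitions, which a minimal sketch makes concrete (the function names and sample values below are illustrative, not drawn from the paper's data):

```python
import math

def mae(actual, predicted):
    # Mean Absolute Error: average magnitude of the prediction errors.
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    # Root Mean Squared Error: like MAE, but penalizes large errors more.
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

actual = [10.0, 12.0, 9.0, 15.0]      # hypothetical observed workload
predicted = [11.0, 11.5, 10.0, 13.0]  # hypothetical model output
print(mae(actual, predicted))   # 1.125
print(rmse(actual, predicted))  # 1.25
```

Because RMSE squares each error before averaging, it is more sensitive to occasional large misses than MAE, which is why papers often report both.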


