Article

E2LG: a multiscale ensemble of LSTM/GAN deep learning architecture for multistep-ahead cloud workload prediction

Journal

JOURNAL OF SUPERCOMPUTING
Volume 77, Issue 10, Pages 11052-11082

Publisher

SPRINGER
DOI: 10.1007/s11227-021-03723-6

Keywords

Cloud computing; Workload prediction; EMD; LSTM; ConvNet; GAN


This paper introduces E2LG, a hybrid algorithm that uses an ensemble GAN/LSTM deep learning architecture to predict cloud workload time series, improving prediction accuracy, especially for high-frequency, noise-like components.
Efficient resource demand prediction and management are two main challenges for cloud service providers, who must control dynamic autoscaling and power consumption. The behavior of cloud workload time series at the sub-minute scale is highly chaotic and volatile, so traditional machine learning-based time-series analysis approaches fail to produce accurate predictions. In recent years, deep learning-based schemes have been proposed to predict highly nonlinear cloud workloads, but they do not always deliver satisfactory results; hence, more accurate prediction algorithms are still needed. In this paper, we address this issue by proposing the hybrid E2LG algorithm, which decomposes the cloud workload time series into its constituent components in different frequency bands using the empirical mode decomposition (EMD) method, reducing the complexity and nonlinearity of the prediction model in each frequency band. In addition, a new ensemble GAN/LSTM deep learning architecture is proposed to predict each sub-band workload time series individually, according to its degree of complexity and volatility. This ensemble GAN/LSTM architecture, which employs stacked LSTM blocks as its generator and 1D ConvNets as its discriminator, effectively exploits the long-term nonlinear dependencies of cloud workload time series, especially in the high-frequency, noise-like components. By validating our approach in an extensive set of experiments on standard real cloud workload traces, we confirm that E2LG significantly improves cloud workload prediction accuracy with respect to the mean absolute error and the standard deviation of the prediction error, outperforming traditional as well as state-of-the-art deep learning approaches. It improves prediction accuracy by at least 5%, and by 12% on average, compared to the main contemporary approaches in recent papers, such as hybrid methods employing CNN, LSTM, or SVR.
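
The abstract describes the E2LG pipeline only at a high level. As a rough, hedged illustration of that description (not the authors' implementation), the sketch below decomposes a workload series with EMD, trains an adversarial pair (a stacked-LSTM generator against a 1D ConvNet discriminator) on the noisiest, high-frequency sub-bands, trains a plain MSE-fitted LSTM on the smoother sub-bands, and sums the per-band one-step forecasts. The window length, layer sizes, number of adversarially modeled bands, and loss weighting are all illustrative assumptions, not values taken from the paper.

```python
# A rough sketch of the E2LG idea as described in the abstract (not the
# authors' code): EMD decomposition, adversarial LSTM/ConvNet modeling of the
# high-frequency sub-bands, plain LSTM for the smoother ones, summed forecasts.
# All hyperparameters below are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
from PyEMD import EMD          # pip install EMD-signal

WINDOW, HORIZON = 32, 1        # look-back length and forecast horizon (assumed)

class LSTMGenerator(nn.Module):
    """Stacked LSTM mapping a history window to the next HORIZON value(s)."""
    def __init__(self, hidden=64, depth=2):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, num_layers=depth, batch_first=True)
        self.head = nn.Linear(hidden, HORIZON)

    def forward(self, x):                     # x: (batch, WINDOW, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])          # (batch, HORIZON)

class ConvDiscriminator(nn.Module):
    """1D ConvNet scoring a (history + forecast) sequence as real or generated."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=5, padding=2), nn.LeakyReLU(0.2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(64, 1))

    def forward(self, seq):                   # seq: (batch, WINDOW + HORIZON, 1)
        return self.net(seq.transpose(1, 2))  # logits, (batch, 1)

def windows(series):
    """Slice a 1-D sub-band into (history, target) pairs for training."""
    n = len(series) - WINDOW - HORIZON + 1
    X = np.stack([series[i:i + WINDOW] for i in range(n)])
    y = np.stack([series[i + WINDOW:i + WINDOW + HORIZON] for i in range(n)])
    return (torch.tensor(X, dtype=torch.float32).unsqueeze(-1),
            torch.tensor(y, dtype=torch.float32))

def train_plain(series, epochs=200):
    """MSE-only LSTM for the smoother, low-frequency sub-bands."""
    G, (X, y) = LSTMGenerator(), windows(series)
    opt = torch.optim.Adam(G.parameters(), lr=1e-3)
    for _ in range(epochs):
        loss = nn.functional.mse_loss(G(X), y)
        opt.zero_grad(); loss.backward(); opt.step()
    return G

def train_adversarial(series, epochs=100):
    """GAN-style training for a noisy sub-band: BCE game plus an MSE anchor."""
    G, D, (X, y) = LSTMGenerator(), ConvDiscriminator(), windows(series)
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()
    real_seq = torch.cat([X, y.unsqueeze(-1)], dim=1)
    ones, zeros = torch.ones(len(X), 1), torch.zeros(len(X), 1)
    for _ in range(epochs):
        # Discriminator: real windows vs. windows completed by the generator.
        fake_seq = torch.cat([X, G(X).detach().unsqueeze(-1)], dim=1)
        d_loss = bce(D(real_seq), ones) + bce(D(fake_seq), zeros)
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()
        # Generator: fool the discriminator while staying close to the target.
        pred = G(X)
        fake_seq = torch.cat([X, pred.unsqueeze(-1)], dim=1)
        g_loss = bce(D(fake_seq), ones) + nn.functional.mse_loss(pred, y)
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return G

def e2lg_forecast(workload, n_noisy_bands=2):
    """Decompose with EMD and sum one-step-ahead forecasts of all sub-bands."""
    imfs = EMD()(np.asarray(workload, dtype=float))   # highest frequency first
    total = 0.0
    for k, imf in enumerate(imfs):
        model = train_adversarial(imf) if k < n_noisy_bands else train_plain(imf)
        last = torch.tensor(imf[-WINDOW:], dtype=torch.float32).view(1, WINDOW, 1)
        total += model(last).item()
    return total
```

Combining the adversarial term with an MSE anchor, as in this sketch, is one common way to keep a GAN-based forecaster close to the target while the discriminator pushes it to reproduce the statistics of the noisy sub-band; whether E2LG uses exactly this loss formulation is not stated in the abstract.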

