Article

Machine-Learning-Based Prediction for Resource (Re)allocation in Optical Data Center Networks

Journal

Journal of Optical Communications and Networking

Publisher

Optica Publishing Group
DOI: 10.1364/JOCN.10.000D12

Keywords

Defragmentation; Holding time; Machine learning; Optical data center networks; Prediction; Space-division multiplexing

Abstract

Traffic prediction and the use of past information are essential for intelligent and efficient resource management, especially in optical data center networks (ODCNs), which serve diverse applications. In this paper, we consider the problem of traffic aggregation in ODCNs by leveraging predictable or exact knowledge of application-specific information and requirements, such as holding time, bandwidth, traffic history, and latency. Because ODCNs carry diverse flows (e.g., long/elephant and short/mice), we use machine learning (ML) to predict time-varying traffic and connection blocking. Furthermore, the predicted mean service time and the elapsed time of an active flow (connection) are used to estimate its mean residual life (MRL). The MRL information is then used for dynamic traffic aggregation when allocating resources to a new connection request. Additionally, the blocking rate for a future time interval is predicted from the predicted traffic and past blocking information, and this prediction is used to trigger a spectrum reallocation process (also called defragmentation) that reduces the spectrum fragmentation caused by dynamic connection setup and tear-down. Simulation results show that ML-based prediction and the initial setup times (history) of traffic flows can be used to further reduce connection blocking and improve resource utilization in space-division multiplexed ODCNs.
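To make the MRL-based aggregation step concrete, the following is a minimal sketch (not the paper's implementation) of how a predicted mean service time and the elapsed time of an active flow might be combined into an MRL estimate that gates aggregation of a new request. The function name estimate_mrl, its parameters, the exponential-holding-time shortcut, and the max(0, mean - elapsed) fallback are illustrative assumptions rather than details taken from the paper.

# Minimal sketch (illustrative only, not the paper's implementation) of
# estimating the mean residual life (MRL) of an active flow from a
# predicted mean service time and the flow's elapsed (already served) time.

def estimate_mrl(predicted_mean_service_time: float,
                 elapsed_time: float,
                 exponential_holding_time: bool = True) -> float:
    """Return the expected remaining lifetime of an active flow.

    Assumptions (not taken from the paper): exponentially distributed
    holding times are memoryless, so the MRL equals the predicted mean
    service time; otherwise a simple deterministic heuristic is used,
    i.e., the predicted mean minus the elapsed time, floored at zero.
    """
    if exponential_holding_time:
        return predicted_mean_service_time
    return max(0.0, predicted_mean_service_time - elapsed_time)

# Hypothetical usage: aggregate a new request onto an active flow only if
# the flow is expected to remain active long enough to carry it.
mrl = estimate_mrl(predicted_mean_service_time=120.0,  # seconds (hypothetical)
                   elapsed_time=45.0,
                   exponential_holding_time=False)
new_request_holding_time = 60.0  # seconds (hypothetical)
can_aggregate = mrl >= new_request_holding_time
print(f"Estimated MRL: {mrl:.1f} s, aggregate new request: {can_aggregate}")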
