Venue
IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022)
Volume: -, Issue: -, Pages: 2108-2113
Publisher
IEEE
DOI: 10.1109/ICC45855.2022.9839130
Keywords
Traffic classification; self-distillation; knowledge distillation; model compression; deep learning
Funding
- National Key R&D Program of China [2020YFB0905902]
Abstract
Traffic classification identifies the type of Internet traffic or the application that generated it. Classical traffic classification methods require predefined features, and their accuracy degrades when new applications emerge because the handcrafted features no longer fit the traffic. Deep learning-based methods extract features directly from raw traffic data and achieve high accuracy, but at the cost of more complex models and heavier computation, which makes them hard to deploy on edge nodes or resource-limited IoT devices. In this paper, we therefore adopt compressed models trained with a novel two-step distillation approach for traffic classification. To address the trade-off between classification accuracy and model complexity, we first design lightweight models and then propose a novel training procedure to enhance their accuracy: response-based, relationship-based, and feature map-based knowledge about different traffic classes is distilled into the small models. Experimental results demonstrate that, compared to the state-of-the-art model, the smallest model trained with the proposed method achieves higher accuracy and F1 scores while reducing computation overhead by nearly 99.7%, verifying the effectiveness of our method.
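The abstract's "response-based" knowledge refers to training the small model to match the teacher's softened output distribution, in the style of standard knowledge distillation. The sketch below is an illustrative NumPy implementation of such a loss, not the paper's exact formulation; the temperature `T`, weight `alpha`, and the toy logits are assumptions for demonstration.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Response-based KD loss: a weighted sum of the KL divergence between
    softened teacher/student outputs and cross-entropy on the hard labels."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # KL(teacher || student), scaled by T^2 to keep gradient magnitudes stable.
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Standard cross-entropy on ground-truth labels at temperature 1.
    q_s = softmax(student_logits, 1.0)
    ce = -np.log(q_s[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1.0 - alpha) * ce))

# Toy batch: two flows, three hypothetical traffic classes.
teacher = np.array([[4.0, 1.0, 0.5], [0.2, 3.5, 0.1]])
student = np.array([[3.0, 1.5, 0.5], [0.5, 2.5, 0.3]])
loss = distillation_loss(student, teacher, labels=np.array([0, 1]))
print(f"KD loss: {loss:.4f}")
```

Minimizing this loss pulls the lightweight student toward the teacher's per-class similarity structure while still fitting the true labels, which is how distillation recovers accuracy lost to model compression.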