Journal
IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022)
Pages 2108-2113
Publisher
IEEE
DOI: 10.1109/ICC45855.2022.9839130
Keywords
Traffic classification; self-distillation; knowledge distillation; model compression; deep learning
Funding
- National Key R&D Program of China [2020YFB0905902]
Traffic classification identifies different types of Internet traffic or applications. Classical traffic classification methods are limited by the need to predefine features, and the emergence of new applications reduces their accuracy because the designed features no longer fit. Deep learning-based methods can extract features from raw traffic data and achieve high accuracy, but at the cost of more complex models and heavier computation, which makes them hard to deploy on edge nodes or resource-limited IoT devices. Therefore, in this paper, we adopt novel compressed models based on a two-step distillation approach for traffic classification. To address the trade-off between classification accuracy and model complexity, we first design lightweight models and then propose a novel training procedure to enhance their classification accuracy. Specifically, response-, relationship-, and feature map-based knowledge of different traffic is distilled to train the small models. Experimental results demonstrate that, compared to the state-of-the-art model, the smallest model trained with the proposed method achieves higher accuracy and F1 scores while reducing computation overhead by nearly 99.7%, verifying the effectiveness of our method.
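The abstract mentions response-based knowledge as one of the distilled signals. As a minimal sketch of how a response-based distillation loss is commonly computed (the paper's exact loss formulation and hyperparameters are not given here; the temperature value and function names below are illustrative assumptions), the student's logits can be matched to the teacher's temperature-softened class probabilities via a KL divergence:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled, numerically stable softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def response_distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened probabilities,
    # scaled by T^2 as in standard response-based knowledge distillation.
    # T=4.0 is an illustrative choice, not taken from the paper.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()
```

In practice this term would be combined with the ordinary cross-entropy on ground-truth labels (and, in a multi-signal scheme like the one described, with relationship- and feature map-based terms) to train the lightweight student model.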