Article

DistilledCTR: Accurate and scalable CTR prediction model through model distillation

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 193, Issue -, Pages -

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2021.116474

Keywords

Recommender Systems; Real-Time Bidding (RTB); Click-Through Rate (CTR); Knowledge Distillation (KD)


This study proposes an accurate and scalable click-through rate (CTR) prediction model for real-time recommendations, which uses an ensemble method and knowledge distillation to distill multiple CTR models into a more accurate and scalable deep neural network (DNN). The low latency of the distilled model makes it suitable for deployment in real-time recommender systems.
Accuracy and scalability are critical to the efficiency and effectiveness of real-time recommender systems. Recent deep learning-based click-through rate prediction models are improving in accuracy, but at the expense of computational complexity. The purpose of this study is to propose an accurate and scalable click-through rate (CTR) prediction model for real-time recommendations. The study investigates the complexity, accuracy, and scalability of various CTR models, ensembles the top-performing models using a gated network, and distills the ensemble into a deep neural network (DNN) through a knowledge distillation framework. The distilled DNN model is more accurate, and about 20x more scalable, than any of the individual CTR models. The low latency of the distilled model makes it fit for deployment in real-time recommender systems. The proposed distillation framework is extensible: any CTR model can be added to the ensemble, and the ensemble can be distilled into any neural architecture.
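The two components the abstract describes can be sketched as follows: a gated network that weights the predictions of several teacher CTR models, and a distillation objective that trains the student DNN against both the ensemble's soft targets and the true click labels. This is a minimal numpy illustration, not the paper's implementation; the fixed gate logits and the `alpha` weighting between soft and hard targets are assumptions for the sake of the example.

```python
import numpy as np

def gated_ensemble(teacher_preds, gate_logits):
    """Combine per-example CTR predictions from several teacher models
    using softmax gate weights. In the paper the gate is a learned
    network; here the gate logits are simply given as inputs.
    teacher_preds, gate_logits: arrays of shape (batch, n_teachers)."""
    w = np.exp(gate_logits - gate_logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)          # softmax over teachers
    return (w * teacher_preds).sum(axis=1)      # shape: (batch,)

def distillation_loss(student_pred, teacher_pred, label, alpha=0.5, eps=1e-7):
    """Hypothetical distillation objective: a weighted sum of binary
    cross-entropy against the ensemble's soft targets and against the
    true click labels. All inputs are probability arrays of shape (batch,)."""
    s = np.clip(student_pred, eps, 1 - eps)
    soft = -(teacher_pred * np.log(s) + (1 - teacher_pred) * np.log(1 - s))
    hard = -(label * np.log(s) + (1 - label) * np.log(1 - s))
    return float(np.mean(alpha * soft + (1 - alpha) * hard))
```

With uniform gate logits the ensemble reduces to a plain average of the teachers, and the loss falls as the student's predictions approach the ensemble's soft targets and the labels, which is the signal the student DNN would be trained on.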

