Article

DistilledCTR: Accurate and scalable CTR prediction model through model distillation

Journal

EXPERT SYSTEMS WITH APPLICATIONS
Volume 193, Article 116474

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2021.116474

Keywords

Recommender Systems; Real-Time Bidding (RTB); Click-Through Rate (CTR); Knowledge Distillation (KD)


This study proposes an accurate and scalable click-through rate (CTR) prediction model for real-time recommendations, which uses an ensemble method and knowledge distillation to distill multiple CTR models into a more accurate and scalable deep neural network (DNN). The low latency of the distilled model makes it suitable for deployment in real-time recommender systems.
Accuracy and scalability are critical to the efficiency and effectiveness of real-time recommender systems. Recent deep learning-based click-through rate prediction models are improving in accuracy, but at the expense of computational complexity. The purpose of this study is to propose an accurate and scalable click-through rate (CTR) prediction model for real-time recommendations. This study investigates the complexity, accuracy, and scalability of various CTR models. This work ensembles top CTR models using a gated network and distills the ensemble into a deep neural network (DNN) using a knowledge distillation framework; a minimal sketch of this setup follows the abstract. The distilled DNN model is more accurate than any of the individual CTR models and 20x more scalable. The low latency of the distilled model makes it fit for deployment in real-time recommender systems. The proposed distillation framework is extensible: any CTR model can be integrated into the ensemble, and the ensemble can be distilled into any neural architecture.
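
For readers who want a concrete picture of the gated-ensemble-plus-distillation setup the abstract describes, below is a minimal PyTorch sketch. It is an illustration under assumptions, not the paper's implementation: the class names (StudentDNN, GatedEnsemble), the loss weighting, and the random stand-in data are all hypothetical, and the paper's actual teacher CTR models and gating design may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StudentDNN(nn.Module):
    """Small feed-forward student network; low latency at serving time."""

    def __init__(self, num_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # logit for P(click)


class GatedEnsemble(nn.Module):
    """Combines per-teacher CTR logits with input-dependent softmax gates."""

    def __init__(self, num_features: int, num_teachers: int):
        super().__init__()
        self.gate = nn.Linear(num_features, num_teachers)

    def forward(self, x, teacher_logits):
        # teacher_logits: (batch, num_teachers), precomputed from frozen teachers
        weights = F.softmax(self.gate(x), dim=-1)
        return (weights * teacher_logits).sum(dim=-1)  # ensembled soft logit


def distillation_loss(student_logit, ensemble_logit, label, alpha=0.5):
    """Hard-label BCE plus soft-label BCE against the gated ensemble.

    The extra BCE term on the ensemble logit trains the gating network;
    the soft target is detached so the student cannot move the teacher.
    The 0.5 weighting is an arbitrary choice for this sketch.
    """
    soft_target = torch.sigmoid(ensemble_logit).detach()
    hard = F.binary_cross_entropy_with_logits(student_logit, label)
    soft = F.binary_cross_entropy_with_logits(student_logit, soft_target)
    gate = F.binary_cross_entropy_with_logits(ensemble_logit, label)
    return alpha * hard + (1.0 - alpha) * soft + gate


if __name__ == "__main__":
    num_features, num_teachers, batch = 16, 3, 32
    student = StudentDNN(num_features)
    ensemble = GatedEnsemble(num_features, num_teachers)
    opt = torch.optim.Adam(
        list(student.parameters()) + list(ensemble.parameters()), lr=1e-3
    )

    # Random stand-ins for click features, frozen teacher logits, and labels.
    x = torch.randn(batch, num_features)
    teacher_logits = torch.randn(batch, num_teachers)
    y = torch.randint(0, 2, (batch,)).float()

    opt.zero_grad()
    loss = distillation_loss(student(x), ensemble(x, teacher_logits), y)
    loss.backward()
    opt.step()
    print(f"distillation loss: {loss.item():.4f}")
```

The design point mirrored here is that the teachers are frozen and only their logits are consumed during training, so the serving path needs just the small student network, which is where the latency and scalability gains the abstract reports would come from.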
