Proceedings Paper

Pre-Training Graph Neural Networks for Cold-Start Users and Items Representation

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/3437963.3441738

Keywords

Pre-training; graph neural networks; cold-start; recommendation

Funding

  1. National Key R&D Program of China [2018YFB1004401]
  2. NSFC [61532021, 61772537, 61772536, 61702522, 62076245]
  3. CCF-Tencent Open Fund
  4. Australian Research Council [DP190101985, DP170103954]


This paper proposes to pre-train a GNN model before applying it to recommendation, and introduces a self-attention-based meta aggregator and an adaptive neighbor sampler to handle the cold-start problem effectively and improve recommendation quality.
The cold-start problem is a fundamental challenge for recommendation tasks. Although recent advances in Graph Neural Networks (GNNs) incorporate high-order collaborative signals to alleviate the problem, the embeddings of cold-start users and items are not explicitly optimized, and cold-start neighbors are not handled during graph convolution. This paper proposes to pre-train a GNN model before applying it to recommendation. Unlike the recommendation objective, the pre-training GNN simulates cold-start scenarios from the users/items with abundant interactions and takes embedding reconstruction as the pretext task, so that it directly improves the embedding quality and can be easily adapted to new cold-start users/items. To further reduce the impact of cold-start neighbors, we incorporate a self-attention-based meta aggregator to enhance the aggregation ability of each graph-convolution step, and an adaptive neighbor sampler to select effective neighbors according to feedback from the pre-training GNN model. Experiments on three public recommendation datasets show the superiority of our pre-training GNN model over the original GNN models on user/item embedding inference and the recommendation task.
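The abstract's pretext task can be illustrated with a minimal NumPy sketch: for each user with abundant interactions, simulate cold start by keeping only a few sampled neighbors, aggregate those neighbors' embeddings, and penalize the distance to the user's "ground-truth" embedding. All names, sizes, and the simple mean aggregator below are illustrative assumptions; the paper's actual model uses multi-step GNN convolutions with a self-attention-based meta aggregator and a learned neighbor sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: target embeddings are assumed to come from
# users with abundant interactions (e.g., pre-learned by factorization).
n_users, n_items, dim = 4, 6, 8
item_emb = rng.normal(size=(n_items, dim))
target_user_emb = rng.normal(size=(n_users, dim))  # reconstruction targets

# Each user's full interaction list (the "abundant interactions" case).
interactions = {u: rng.choice(n_items, size=4, replace=False)
                for u in range(n_users)}

def simulate_cold_start(items, k=2):
    """Mimic a cold-start user by keeping only k sampled neighbors."""
    return rng.choice(items, size=k, replace=False)

def mean_aggregate(items):
    """One toy graph-convolution step: average neighbor item embeddings
    (stand-in for the paper's self-attention-based meta aggregator)."""
    return item_emb[items].mean(axis=0)

def reconstruction_loss(pred, target):
    """Pretext task: squared error against the ground-truth embedding."""
    return float(np.mean((pred - target) ** 2))

losses = []
for u in range(n_users):
    few = simulate_cold_start(interactions[u])
    losses.append(reconstruction_loss(mean_aggregate(few),
                                      target_user_emb[u]))

print("mean pretext loss:", sum(losses) / len(losses))
```

In the full method this loss would be back-propagated to train the aggregation parameters, so that embeddings inferred from only a few neighbors approach those learned from abundant interactions.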


