Journal
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
Volume 14, Issue 3, Pages 849-860
Publisher
SPRINGER HEIDELBERG
DOI: 10.1007/s13042-022-01667-8
Keywords
Message passing neural networks; Neighborhood expansion; Heterophily; Network representation learning; Disassortative networks
This article introduces a novel message passing neural network based on neighborhood expansion for disassortative network representation learning. By finding informative nodes in the neighborhood and performing data augmentation, the model improves optimization efficiency on disassortative networks and demonstrates strong performance in experiments.
Message passing neural networks (MPNNs) are widely used for assortative network representation learning under the assumption of homophily between connected nodes. However, this fundamental assumption is inconsistent with the heterophily of disassortative networks (DNs) in many real-world applications. Therefore, we propose a novel MPNN called NEDA based on neighborhood expansion for disassortative network representation learning (DNRL). Specifically, our NEDA first performs neighborhood expansion to seek more informative nodes for aggregation, and then performs data augmentation to speed up the optimization of a set of parameter matrices, using the maximum available training data at minimal computational cost. To evaluate the performance of NEDA comprehensively, we perform several experiments on benchmark disassortative network datasets of varying sizes, where the results demonstrate the effectiveness of our NEDA model. The code is publicly available at https://github.com/xueyanfeng/NEDA.
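The core idea of aggregating beyond immediate neighbors can be illustrated with a minimal sketch. Note this is an assumption-laden toy, not NEDA's actual method: NEDA selects informative nodes by its own criterion, whereas the sketch below simply expands each node's receptive field to all nodes within a fixed number of hops and mean-aggregates their features. The function names (`expand_neighborhood`, `aggregate`) are hypothetical.

```python
import numpy as np

def expand_neighborhood(adj, hops=2):
    """Return a boolean reachability matrix: reach[i, j] is True if
    node j lies within `hops` hops of node i (self included).
    Hop-based expansion is an illustrative stand-in for NEDA's
    informative-node selection."""
    n = adj.shape[0]
    reach = np.eye(n, dtype=bool)          # every node reaches itself
    frontier = adj.astype(bool)            # nodes at the current hop distance
    for _ in range(hops):
        reach |= frontier                  # absorb this hop into the neighborhood
        frontier = (frontier @ adj.astype(bool)).astype(bool)
    return reach

def aggregate(features, reach):
    """Mean-aggregate node features over each expanded neighborhood."""
    weights = reach.astype(float)
    deg = weights.sum(axis=1, keepdims=True)
    return (weights @ features) / deg

# Toy path graph 0 - 1 - 2: with 2-hop expansion, every node sees all three.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]])
reach = expand_neighborhood(adj, hops=2)
out = aggregate(np.eye(3), reach)
```

On this path graph, one-hop aggregation would never let nodes 0 and 2 exchange information; the two-hop expansion makes every node's aggregate the mean over all three nodes, which is the kind of widened receptive field that helps when adjacent nodes are dissimilar (heterophily).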