Journal
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
Volume 14, Issue 3, Pages 849-860
Publisher
SPRINGER HEIDELBERG
DOI: 10.1007/s13042-022-01667-8
Keywords
Message passing neural networks; Neighborhood expansion; Heterophily; Network representation learning; Disassortative networks
This article introduces a novel message passing neural network based on neighborhood expansion for disassortative network representation learning. By identifying informative nodes in an expanded neighborhood and performing data augmentation, the model improves optimization efficiency on disassortative networks and performs well in experiments.
Message passing neural networks (MPNNs) are widely used for assortative network representation learning under the assumption of homophily between connected nodes. However, this fundamental assumption is inconsistent with the heterophily of disassortative networks (DNs) in many real-world applications. Therefore, we propose a novel MPNN called NEDA based on neighborhood expansion for disassortative network representation learning (DNRL). Specifically, our NEDA first performs neighborhood expansion to seek more informative nodes for aggregation, and then performs data augmentation to speed up the optimization of a set of parameter matrices with the maximum available training data at minimal computational cost. To evaluate the performance of NEDA comprehensively, we perform several experiments on benchmark disassortative network datasets of varying sizes, and the results demonstrate the effectiveness of our NEDA model. The code is publicly available at https://github.com/xueyanfeng/NEDA.
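To make the core idea of neighborhood expansion concrete, the following is a minimal illustrative sketch (not the paper's exact NEDA algorithm): in a heterophilic graph, immediate neighbors may be uninformative, so aggregation is extended to nodes reachable within multiple hops before features are averaged. The function names and the mean aggregator are illustrative assumptions.

```python
import numpy as np

def expand_neighborhood(adj, hops=2):
    """Return a 0/1 mask of nodes reachable within `hops` steps (self excluded)."""
    reach = np.zeros_like(adj)
    power = np.eye(adj.shape[0])
    for _ in range(hops):
        power = power @ adj       # walks of increasing length
        reach += power
    mask = (reach > 0).astype(float)
    np.fill_diagonal(mask, 0.0)   # drop self-loops from the expanded neighborhood
    return mask

def aggregate(features, adj, hops=2):
    """Mean-aggregate node features over the expanded neighborhood."""
    mask = expand_neighborhood(adj, hops)
    deg = mask.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0           # avoid division by zero for isolated nodes
    return (mask @ features) / deg

# Toy path graph 0-1-2-3: with 2-hop expansion, node 0 aggregates from {1, 2}
# instead of only its direct neighbor {1}.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = np.arange(4, dtype=float).reshape(-1, 1)
H = aggregate(X, adj, hops=2)
```

Here node 0's representation becomes the mean of features from nodes 1 and 2, showing how expansion pulls in more distant (and potentially same-class) nodes on disassortative graphs.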