3.8 Proceedings Paper

Node Similarity Preserving Graph Convolutional Networks

Publisher

Association for Computing Machinery (ACM)
DOI: 10.1145/3437963.3441735

Funding

  1. National Science Foundation of the United States [CNS1815636, IIS1928278, IIS1714741, IIS1845081, IIS1907704, IIS1955285]
  2. Beijing Nova Program [Z201100006820068]
  3. Beijing Municipal Science & Technology Commission


The SimP-GCN framework balances information from graph structure and node features by preserving feature similarity during the aggregation process, thereby maintaining node similarity.
Graph Neural Networks (GNNs) have achieved tremendous success in various real-world applications due to their strong ability in graph representation learning. GNNs explore the graph structure and node features by aggregating and transforming information within node neighborhoods. However, through theoretical and empirical analysis, we reveal that the aggregation process of GNNs tends to destroy node similarity in the original feature space. Since there are many scenarios where node similarity plays a crucial role, this motivates the proposed framework SimP-GCN, which can effectively and efficiently preserve node similarity while exploiting graph structure. Specifically, to balance information from graph structure and node features, we propose a feature similarity preserving aggregation which adaptively integrates graph structure and node features. Furthermore, we employ self-supervised learning to explicitly capture the complex feature similarity and dissimilarity relations between nodes. We validate the effectiveness of SimP-GCN on seven benchmark datasets, including three assortative and four disassortative graphs. The results demonstrate that SimP-GCN outperforms representative baselines, and further analysis shows various advantages of the proposed framework. The implementation of SimP-GCN is available at https://github.com/ChandlerBang/SimP-GCN.
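
The abstract describes the aggregation scheme only at a high level. The PyTorch sketch below illustrates one plausible reading of a feature-similarity-preserving layer: each node learns a score that balances propagation over the original (normalized) adjacency against propagation over a kNN graph built from raw node features, plus a learnable self-loop contribution. This is a minimal sketch under stated assumptions, not the authors' implementation (see the linked repository); the names `knn_graph` and `SimPreservingLayer`, the sigmoid gating, the dense adjacency, and k=5 are illustrative choices, and the self-supervised component is omitted.

```python
# Minimal sketch of a feature-similarity-preserving propagation layer,
# inspired by the SimP-GCN description above. Illustrative only; the
# authors' code is at https://github.com/ChandlerBang/SimP-GCN.
import torch
import torch.nn as nn
import torch.nn.functional as F


def knn_graph(x, k=5):
    """Row-normalized kNN graph from cosine feature similarity (dense, illustrative)."""
    xn = F.normalize(x, dim=1)
    sim = xn @ xn.t()                                       # cosine similarity
    topk = sim.topk(k + 1, dim=1).indices                   # keep k neighbors (+ self)
    mask = torch.zeros_like(sim).scatter_(1, topk, 1.0)
    a_f = sim * mask
    return a_f / a_f.sum(dim=1, keepdim=True).clamp(min=1e-12)


class SimPreservingLayer(nn.Module):
    """One propagation step that adaptively mixes the graph with a feature kNN graph."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.score = nn.Linear(in_dim, 1)                   # node-wise balance score in (0, 1)
        self.self_weight = nn.Parameter(torch.zeros(1))     # learnable self-loop contribution

    def forward(self, x, adj_norm, adj_feat):
        s = torch.sigmoid(self.score(x))                    # how much each node trusts the structure
        prop = s * (adj_norm @ x) + (1 - s) * (adj_feat @ x)
        prop = prop + self.self_weight * x                  # keep each node's own features
        return self.lin(prop)


if __name__ == "__main__":
    x = torch.randn(10, 16)                                 # 10 nodes, 16-d features
    adj = torch.eye(10)                                     # placeholder normalized adjacency
    layer = SimPreservingLayer(16, 8)
    print(layer(x, adj, knn_graph(x)).shape)                # torch.Size([10, 8])
```

The node-wise score lets nodes whose neighborhoods are informative rely on the graph, while nodes in noisy or disassortative neighborhoods fall back on the feature kNN graph, which is one way to read the "adaptively integrates graph structure and node features" claim in the abstract.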

