Article

Irregular message passing networks

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 257

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2022.109919

Keywords

Graph neural networks; Message passing; Power iteration; Random propagation; Random attention

Funding

  1. National Natural Science Foundation of China [61702135]


The effectiveness of the message aggregator in graph neural networks is not primarily due to the optimization of edge weights, but rather can be achieved through randomized attention. This finding emphasizes the importance of the network topology in achieving superior performance for message passing iterations.
The graph neural network (GNN) is a widely adopted technique for processing graph-structured data. Despite its pervasiveness, the exact reasons for the message aggregator's effectiveness are still poorly understood. The popular belief is that this effectiveness stems from optimizing edge weights to improve the local fusion of node information. In this study, we demonstrate that such propagation weight optimization has a limited contribution to the success of message passing. Instead, we find that any normalized random attention (or edge weights) can have a similar and, sometimes, even stronger effect. We refer to these randomly initialized propagations as irregular message passing. Experiments conducted on our random edge weight and random attention models verified the positive impact of weight randomness, uncovering the importance of the topology itself in achieving superior results for message iterations. Our code is available at https://github.com/Eigenworld/RAN. (c) 2022 Published by Elsevier B.V.
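The core idea described in the abstract, replacing learned attention with fixed random but row-normalized edge weights over the unchanged graph topology, can be sketched as follows. This is a minimal illustration under our own assumptions (dense adjacency matrix, numpy), not the authors' implementation; see the linked RAN repository for their actual code.

```python
import numpy as np

def irregular_message_passing(adj, features, num_iters=2, seed=0):
    """Propagate node features with random, row-normalized edge weights.

    adj      : (n, n) binary adjacency matrix (the fixed topology)
    features : (n, d) node feature matrix
    """
    rng = np.random.default_rng(seed)
    # Assign random positive weights only to existing edges,
    # so the topology itself is preserved.
    weights = adj * rng.random(adj.shape)
    # Row-normalize so each node's incoming messages sum to 1
    # (the "normalized random attention" of the abstract).
    row_sums = weights.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # leave isolated nodes untouched
    weights = weights / row_sums
    # Iterate the message passing step with the fixed random weights.
    h = features
    for _ in range(num_iters):
        h = weights @ h
    return h

# Toy usage: a 3-node path graph with one-hot features.
adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)
out = irregular_message_passing(adj, np.eye(3))
```

The point of the sketch is that no weight is ever trained: the only structure the aggregation exploits is which entries of `adj` are nonzero, matching the paper's claim that topology, not edge-weight optimization, drives performance.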

