Article

Multiple sparse graphs condensation

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 278

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2023.110904

Keywords

Graph condensation; Graph neural network; Neighborhood pattern; Node classification

Abstract

The high complexity of graph neural networks (GNNs) on large-scale networks hinders their industrial application. Graph condensation (GCond) was recently proposed to address this problem by condensing the original large-scale graph into a small-scale one. The goal is to make GNNs trained on the condensed graph perform similarly to those trained on the original graph. GCond achieves satisfactory performance on some datasets. However, GCond models the condensed graph as a single fully connected graph, which limits the diversity of the embeddings that can be obtained, especially when there are few synthetic nodes. We propose Multiple Sparse Graphs Condensation (MSGC), which condenses the original large-scale graph into multiple small-scale sparse graphs. MSGC takes standard neighborhood patterns as the essential substructures and can construct various connection schemes. Correspondingly, GNNs can obtain numerous sets of embeddings, which significantly enriches their diversity. Experiments show that, compared with GCond and other baselines, MSGC has significant advantages at the same condensed-graph scale. MSGC can retain nearly 100% of the original performance on the Flickr and Citeseer datasets while reducing their graph scale by over 99.0%.

(c) 2023 Elsevier B.V. All rights reserved.
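To make the idea concrete, below is a minimal sketch in plain PyTorch of the scheme the abstract describes, not the authors' implementation: learnable condensed node features are paired with several fixed sparse adjacencies built from simple neighborhood patterns, and a single GCN produces one embedding set per condensed graph. All sizes and names (n_cond, chain_pattern, SimpleGCN, the particular chain patterns, and the averaging of outputs) are illustrative assumptions.

```python
# Minimal sketch of the MSGC idea, assuming a plain PyTorch setup
# (not the authors' code). Sizes and patterns are hypothetical.
import torch
import torch.nn as nn

n_cond, d_feat, d_hid, n_cls = 30, 64, 32, 7   # condensed-graph sizes (illustrative)

# Learnable condensed node features (labels would be fixed per class in practice).
X_cond = nn.Parameter(torch.randn(n_cond, d_feat))

def normalize_adj(A: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} used by standard GCNs."""
    A_hat = A + torch.eye(A.size(0))
    d_inv_sqrt = A_hat.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * A_hat * d_inv_sqrt.unsqueeze(0)

def chain_pattern(n: int, offset: int) -> torch.Tensor:
    """One example of a sparse neighborhood pattern: link node i to node i+offset."""
    A = torch.zeros(n, n)
    idx = torch.arange(n - offset)
    A[idx, idx + offset] = 1.0
    return A + A.t()   # keep the graph undirected

# Multiple sparse condensed graphs instead of a single fully connected one.
patterns = [torch.zeros(n_cond, n_cond),      # isolated nodes (self-loops only)
            chain_pattern(n_cond, 1),         # nearest-neighbour chain
            chain_pattern(n_cond, 2)]         # second-order chain
adjs = [normalize_adj(A) for A in patterns]

class SimpleGCN(nn.Module):
    """Two-layer GCN; forward takes a dense normalized adjacency."""
    def __init__(self):
        super().__init__()
        self.w1 = nn.Linear(d_feat, d_hid)
        self.w2 = nn.Linear(d_hid, n_cls)

    def forward(self, A, X):
        h = torch.relu(A @ self.w1(X))
        return A @ self.w2(h)

gnn = SimpleGCN()

# Each sparse graph yields its own embedding/logit set; averaging them is one
# simple way to combine the enriched, diverse embeddings the abstract mentions.
logits_per_graph = [gnn(A, X_cond) for A in adjs]
logits = torch.stack(logits_per_graph).mean(dim=0)
print(logits.shape)   # torch.Size([30, 7])
```

In the actual method, the condensed features (and labels) would be optimized so that GNNs trained on the condensed graphs behave like those trained on the original graph; the sketch only illustrates how multiple sparse structures yield multiple embedding sets from the same GNN.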
