Journal
MATHEMATICS
Volume 11, Issue 6
Publisher
MDPI
DOI: 10.3390/math11061397
Keywords
graph neural networks; graph representation learning; deep learning
Abstract
Graph neural networks (GNNs) have garnered significant attention for their ability to process graph-structured data effectively. Most existing methods assume that the input graph is noise-free; however, this assumption is frequently violated in real-world scenarios, impairing the learned graph representations. To address this issue, we start from the essence of graph structure learning, considering edge discovery and removal, reweighting of existing edges, and differentiability of the graph structure. We introduce virtual nodes and use connections with virtual nodes to generate optimized graph structures, then apply Gumbel-Softmax to reweight edges and make the graph structure learning differentiable (VN-GSL for short). We thoroughly evaluated our method on a range of benchmark datasets under both clean and adversarial conditions. The experimental results demonstrate that our approach is superior in terms of both performance and efficiency. Our implementation will be made publicly available.
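The abstract only names the key ingredient, Gumbel-Softmax edge reweighting, without giving details of the authors' method. As a rough illustration of the general idea, the sketch below samples a soft "keep" weight for each candidate edge from a two-way Gumbel-Softmax over learned keep/drop logits; in a real model these logits would be produced by a trainable module and the relaxation makes the edge weights differentiable. All names (`gumbel_softmax`, `reweight_edges`, the toy logits) are hypothetical and not taken from the paper.

```python
import math
import random

def gumbel_softmax(logits, tau=0.5, rng=random.Random(0)):
    """Sample a relaxed (soft) one-hot vector from `logits` using the
    Gumbel-Softmax (Concrete) distribution with temperature `tau`."""
    # Gumbel(0, 1) noise: -log(-log(U)), U ~ Uniform(0, 1)
    gumbels = [-math.log(-math.log(rng.random())) for _ in logits]
    scores = [(l + g) / tau for l, g in zip(logits, gumbels)]
    # Numerically stable softmax over the perturbed, scaled logits
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def reweight_edges(edge_logits, tau=0.5):
    """Hypothetical reweighting step: for each candidate edge (i, j),
    a 2-way Gumbel-Softmax over [keep, drop] logits yields a soft,
    differentiable edge weight (the 'keep' probability)."""
    return {edge: gumbel_softmax(logits, tau)[0]
            for edge, logits in edge_logits.items()}

# Toy example: two candidate edges with made-up keep/drop logits.
edge_logits = {(0, 1): [2.0, -1.0], (1, 2): [-0.5, 0.5]}
weights = reweight_edges(edge_logits)
```

In practice the sampled weights would multiply the adjacency entries before message passing, so gradients flow through the relaxed samples back into the logits; lowering `tau` pushes the weights toward hard 0/1 decisions.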