Proceedings Paper

Graph Neural Network Sensitivity Under Probabilistic Error Model

Journal

Publisher

IEEE

Keywords

Graph signal processing; graph neural network; probabilistic error model; stability


Graph convolutional networks (GCNs) can successfully learn graph signal representations through graph convolution. The graph convolution depends on the graph filter, which encodes the topological dependencies of the data and propagates data features. However, estimation errors in the propagation matrix (e.g., the adjacency matrix) can have a significant impact on graph filters and GCNs. In this paper, we study the effect of a probabilistic graph error model on the performance of GCNs. We prove that the adjacency matrix under the error model is bounded by a function of graph size and error probability. We further analytically specify the upper bound of a normalized adjacency matrix with self-loops added. Finally, we illustrate the error bounds by running experiments on a synthetic dataset and study the accuracy sensitivity of a simple GCN under this probabilistic error model.
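The quantities discussed above can be illustrated numerically. The sketch below is a hypothetical reconstruction (not the authors' code): it assumes the error model flips each edge entry independently with probability `eps`, builds the self-loop-normalized propagation matrix D^{-1/2}(A + I)D^{-1/2} used in standard GCNs, and measures the spectral-norm gap between the clean and perturbed matrices, which is the quantity the paper's bounds control.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_adjacency(n, p=0.3):
    """Symmetric 0/1 adjacency matrix of an Erdos-Renyi graph, no self-loops."""
    upper = rng.random((n, n)) < p
    A = np.triu(upper, k=1)
    return (A + A.T).astype(float)

def perturb(A, eps):
    """Assumed error model: flip each undirected edge entry independently
    with probability eps (XOR on the 0/1 adjacency entries)."""
    n = A.shape[0]
    flips = np.triu(rng.random((n, n)) < eps, k=1)
    flips = (flips + flips.T).astype(float)
    return np.abs(A - flips)

def normalized_with_self_loops(A):
    """D^{-1/2} (A + I) D^{-1/2}, with D the degree matrix of A + I,
    as in the standard GCN propagation rule."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

n, eps = 50, 0.05
A = random_adjacency(n)
A_err = perturb(A, eps)

# Spectral-norm distance between clean and perturbed propagation matrices:
# this is the perturbation magnitude that the paper's bounds characterize
# as a function of graph size n and error probability eps.
raw_gap = np.linalg.norm(A_err - A, 2)
norm_gap = np.linalg.norm(
    normalized_with_self_loops(A_err) - normalized_with_self_loops(A), 2)
print(raw_gap, norm_gap)
```

Note that the self-loop-normalized matrix always has spectral norm 1 (its eigenvalues lie in [-1, 1]), so the normalized gap is automatically bounded by 2; the paper's contribution is a tighter, model-dependent bound in terms of n and the error probability.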

Authors


Reviews

Primary Rating: 3.8 (not enough ratings)
