Article

Pyramidal Reservoir Graph Neural Network

Journal

NEUROCOMPUTING
Volume 470, Pages 389-404

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2021.04.131

Keywords

Reservoir Computing; Graph Echo State Networks; Graph Neural Networks; Graph pooling

Funding

  1. Visiting Fellows Programme of University of Pisa

Abstract

We propose a deep Graph Neural Network (GNN) model that alternates between two types of layers. The first type is inspired by Reservoir Computing (RC) and generates new vertex features by iterating a non-linear map until it converges to a fixed point. The second type implements graph pooling operations that gradually reduce the support graph and the vertex features, further improving the computational efficiency of the RC-based GNN. The architecture is, therefore, pyramidal. In the last layer, the features of the remaining vertices are combined into a single vector, which represents the graph embedding. Through a mathematical derivation introduced in this paper, we show formally how graph pooling can reduce the computational complexity of the model and speed up the convergence of the dynamical updates of the vertex features. Our proposed approach to the design of RC-based GNNs offers an advantageous and principled trade-off between accuracy and complexity, which we extensively demonstrate in experiments on a large set of graph datasets. © 2021 Elsevier B.V. All rights reserved.
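
To make the described architecture concrete, below is a minimal NumPy sketch of the two layer types: a reservoir layer that iterates a contractive non-linear map over the vertex states until it reaches a fixed point, alternated with a pooling step that shrinks the support graph, followed by an averaging readout. All names and hyper-parameters here (reservoir_layer, pool, rho, keep_ratio) are illustrative assumptions, not the paper's actual operators or notation; in particular, the paper's pooling method is stood in for by a simple keep-the-highest-degree-vertices rule.

    import numpy as np

    def reservoir_layer(A, U, hidden=32, rho=0.9, tol=1e-6, max_iter=200, seed=0):
        # One RC-style layer: iterate a non-linear map over the vertex
        # states until it converges to a fixed point.
        rng = np.random.default_rng(seed)
        n, d = U.shape
        W_in = rng.uniform(-1.0, 1.0, size=(d, hidden))    # fixed, untrained input weights
        W = rng.uniform(-1.0, 1.0, size=(hidden, hidden))  # fixed, untrained recurrent weights
        # Rescale W so that (max degree) * ||W||_2 = rho < 1; this makes
        # the update a contraction, so a unique fixed point exists.
        k = max(A.sum(axis=1).max(), 1.0)
        W *= rho / (k * np.linalg.norm(W, 2))
        X = np.zeros((n, hidden))
        for _ in range(max_iter):
            X_new = np.tanh(U @ W_in + A @ X @ W)
            if np.linalg.norm(X_new - X) < tol:  # converged to the fixed point
                return X_new
            X = X_new
        return X

    def pool(A, X, keep_ratio=0.5):
        # Placeholder pooling: keep the highest-degree vertices and the
        # subgraph they induce. This only illustrates how the support
        # graph and the vertex features shrink at each level.
        keep = np.argsort(-A.sum(axis=1))[: max(1, int(keep_ratio * A.shape[0]))]
        return A[np.ix_(keep, keep)], X[keep]

    # Pyramid: alternate reservoir and pooling layers, then average the
    # remaining vertex states into a single graph-embedding vector.
    rng = np.random.default_rng(1)
    A = (rng.random((12, 12)) > 0.7).astype(float)
    A = np.triu(A, 1); A = A + A.T            # random undirected graph
    X = rng.standard_normal((12, 5))          # initial vertex features
    for _ in range(2):
        X = reservoir_layer(A, X)
        A, X = pool(A, X)
    embedding = X.mean(axis=0)                # graph embedding (readout)

Because the reservoir weights are fixed rather than trained, each layer only requires running the fixed-point iteration forward; pooling then reduces both the number of vertices the iteration runs over and, as the abstract notes, the number of iterations needed to converge.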

Authors

Filippo Maria Bianchi, Claudio Gallicchio, Alessio Micheli
