Journal
NEURAL NETWORKS
Volume 142, Pages 1-19
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2021.04.015
Keywords
Graph autoencoders; Graph variational autoencoders; Scalability; Graph convolutional networks; Link prediction; Node clustering
Abstract
Graph autoencoders (AE) and variational autoencoders (VAE) are powerful node embedding methods, but suffer from scalability issues. In this paper, we introduce FastGAE, a general framework to scale graph AE and VAE to large graphs with millions of nodes and edges. Our strategy, based on an effective stochastic subgraph decoding scheme, significantly speeds up the training of graph AE and VAE while preserving or even improving performance. We demonstrate the effectiveness of FastGAE on various real-world graphs, outperforming the few existing approaches to scale graph AE and VAE by a wide margin. (C) 2021 Elsevier Ltd. All rights reserved.
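The core idea described in the abstract, reconstructing only a sampled subgraph at each training step instead of the full n x n adjacency matrix, can be sketched as follows. This is a simplified illustration, not the paper's implementation: it uses uniform node sampling with NumPy (the paper proposes a more elaborate sampling scheme), a plain inner-product decoder, and a dense adjacency matrix; the function name `fastgae_style_decode` is invented for this sketch.

```python
import numpy as np

def fastgae_style_decode(Z, adj, sample_size, rng):
    """One stochastic-subgraph decoding step in the spirit of FastGAE.

    Z: (n, d) node embeddings produced by the encoder.
    adj: (n, n) binary adjacency matrix (dense here for simplicity).
    Rather than decoding all n^2 node pairs, sample a node subset and
    reconstruct only the induced subgraph, which is what makes the
    decoding step cheap on large graphs.
    """
    n = Z.shape[0]
    idx = rng.choice(n, size=sample_size, replace=False)
    Z_s = Z[idx]                           # embeddings of sampled nodes
    logits = Z_s @ Z_s.T                   # inner-product decoder on the subgraph
    probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid -> edge probabilities
    target = adj[np.ix_(idx, idx)]         # adjacency of the induced subgraph
    eps = 1e-9                             # numerical safety for the logs
    # binary cross-entropy reconstruction loss, computed on the subgraph only
    loss = -np.mean(target * np.log(probs + eps)
                    + (1 - target) * np.log(1 - probs + eps))
    return loss
```

Each training iteration draws a fresh subgraph, so over many steps the loss is a stochastic estimate of the full reconstruction objective at a fraction of the cost.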