Article

TPNE: Topology preserving network embedding

Journal

INFORMATION SCIENCES
Volume 504, Pages 20-31

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2019.07.035

Keywords

Network embedding; Topology preserving; Deep denoising autoencoders

Funding

  1. National key research and development program of China [2017YFB0802200]
  2. Australian Research Council [LP170100416, LP180100114]
  3. Fundamental Research Funds for Central Universities
  4. Innovation Fund of Xidian University

Abstract

Network embedding aims at mapping nodes into a vectorial feature space while maximally preserving the topological relations of nodes in a network, so as to facilitate complex network analysis. Existing work on network embedding focuses on preserving local or global topological information within limited step sizes, which can be insufficient in many applications, even when the underlying network is undirected and unweighted. The complex and rich topological information in networks exerts a paramount influence on network formation and can reveal the high-order relevance among different nodes. In this paper, we propose a novel network embedding framework based on deep neural networks, named Topology Preserving Network Embedding (TPNE), which is suitable for arbitrary types of information networks: directed or undirected, weighted or unweighted. In this framework, we devise a closeness matrix to capture more comprehensive global topology of the network and combine both global closeness reconstruction and local neighborhood preservation into a single loss function. To validate our model, we conduct extensive experiments on node classification, link prediction and node visualization tasks using the learned embedding vectors. The results demonstrate that the proposed approach has promising performance and outperforms existing state-of-the-art approaches on several networks in these tasks. © 2019 Elsevier Inc. All rights reserved.
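
As a rough illustration of the idea described in the abstract, the sketch below (not the authors' code) builds a Katz-style closeness matrix from the adjacency matrix and trains a small denoising autoencoder whose loss combines global closeness reconstruction with a local term that pulls embeddings of adjacent nodes together. The closeness definition, network architecture, and hyperparameters (beta, steps, noise, alpha) are assumptions made for illustration only; the paper's exact formulation may differ.

    # Hypothetical sketch of a topology-preserving embedding (assumptions noted above).
    import numpy as np
    import torch
    import torch.nn as nn

    def closeness_matrix(adj: np.ndarray, beta: float = 0.1, steps: int = 5) -> np.ndarray:
        """Accumulate decayed powers of the adjacency matrix as a proxy for
        high-order closeness between node pairs (Katz-style; an assumption)."""
        C = np.zeros_like(adj, dtype=float)
        P = np.eye(adj.shape[0])
        for k in range(1, steps + 1):
            P = P @ adj
            C += (beta ** k) * P
        return C

    class DenoisingAE(nn.Module):
        """Deep denoising autoencoder over rows of the closeness matrix."""
        def __init__(self, n_nodes: int, dim: int = 128):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(n_nodes, 256), nn.ReLU(),
                                         nn.Linear(256, dim))
            self.decoder = nn.Sequential(nn.Linear(dim, 256), nn.ReLU(),
                                         nn.Linear(256, n_nodes))

        def forward(self, x):
            z = self.encoder(x)
            return z, self.decoder(z)

    def train(adj: np.ndarray, dim=128, noise=0.2, alpha=1.0, epochs=200, lr=1e-3):
        C = torch.tensor(closeness_matrix(adj), dtype=torch.float32)
        A = torch.tensor(adj, dtype=torch.float32)
        model = DenoisingAE(adj.shape[0], dim)
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            # Denoising corruption: randomly zero entries of the closeness rows.
            corrupted = C * (torch.rand_like(C) > noise)
            z, recon = model(corrupted)
            # Global term: reconstruct the clean closeness matrix.
            loss_global = ((recon - C) ** 2).mean()
            # Local term: adjacent nodes should have nearby embeddings.
            dists = torch.cdist(z, z) ** 2
            loss_local = (A * dists).sum() / A.sum().clamp(min=1.0)
            loss = loss_global + alpha * loss_local
            opt.zero_grad()
            loss.backward()
            opt.step()
        return model.encoder(C).detach().numpy()

Given a binary adjacency matrix adj of shape (n, n), train(adj) returns an n x 128 embedding matrix that could then be fed to the downstream node classification, link prediction, or visualization tasks mentioned in the abstract.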
