Article

DeepNC: Deep Generative Network Completion

Journal

IEEE Transactions on Pattern Analysis and Machine Intelligence

Publisher

IEEE Computer Society
DOI: 10.1109/TPAMI.2020.3032286

Keywords

Autoregressive generative model; deep generative model of graphs; inference; network completion; partially observable network

Funding

  1. Ministry of Science and ICT (MSIT), Republic of Korea [2020-0-01463]
  2. Korea Health Industry Development Institute (KHIDI) - Ministry of Health & Welfare, Republic of Korea [HI20C0127]
  3. Yonsei University [2020-22-0101]
  4. National Research Foundation of Korea [4120200413615]; funding sources: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)

Abstract

Most network data are collected from partially observable networks with both missing nodes and missing edges, for example, due to limited resources and privacy settings specified by users on social media. It therefore stands to reason that inferring the missing parts of a network, i.e., performing network completion, should precede downstream applications. Despite this need, the recovery of missing nodes and edges in such incomplete networks remains an insufficiently explored problem due to its modeling difficulty; it is far more challenging than link prediction, which infers only missing edges. In this paper, we present DeepNC, a novel method for inferring the missing parts of a network based on a deep generative model of graphs. Specifically, our method first learns a likelihood over edges via an autoregressive generative model, and then identifies the graph that maximizes the learned likelihood conditioned on the observable graph topology. Moreover, we propose a computationally efficient DeepNC algorithm that consecutively finds individual nodes maximizing the probability in each node generation step, as well as an enhanced version using the expectation-maximization algorithm. The runtime complexities of both algorithms are shown to be almost linear in the number of nodes in the network. We empirically demonstrate the superiority of DeepNC over state-of-the-art network completion approaches.
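To make the abstract's idea concrete, the sketch below illustrates greedy autoregressive network completion in toy form: nodes are generated one by one, observed edges are kept fixed, and each unknown edge is resolved by a greedy maximum-probability choice under a learned edge model. Everything here is hypothetical, including the function names and the stand-in `edge_prob` scorer; the paper's actual model is a trained deep autoregressive graph generator, not the toy heuristic used here.

```python
def edge_prob(i, j, adj):
    """Stand-in for a learned edge-likelihood model: probability that edge
    (i, j) exists given the partial adjacency built so far. This toy score
    just favors attaching to higher-degree nodes; it is NOT the paper's
    trained network."""
    deg_j = sum(adj[j])
    return min(0.9, 0.1 + 0.2 * deg_j)

def greedy_complete(n, observed):
    """Autoregressively generate an n-node graph. For each new node i and
    each earlier node j: an observed edge is kept as-is (the completion is
    conditioned on the observable topology); an unknown pair is resolved
    greedily, adding the edge iff the model deems it more likely than not."""
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i):
            if (i, j) in observed or (j, i) in observed:
                adj[i][j] = adj[j][i] = 1   # observed edge: fixed
            elif edge_prob(i, j, adj) > 0.5:
                adj[i][j] = adj[j][i] = 1   # greedy max-probability choice
    return adj

# Complete a 5-node graph in which only edges (0,1) and (1,2) were observed.
adj = greedy_complete(5, {(0, 1), (1, 2)})
```

The per-node greedy decision is what keeps the procedure cheap; the paper's algorithms additionally achieve almost-linear runtime in the number of nodes and offer an expectation-maximization refinement, neither of which this toy sketch attempts to reproduce.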
