Article

Generalized Translation-Based Embedding of Knowledge Graph

Journal

IEEE Transactions on Knowledge and Data Engineering

Publisher

IEEE Computer Society
DOI: 10.1109/TKDE.2019.2893920

Keywords

Knowledge graph embedding

Funding

  1. New Energy and Industrial Technology Development Organization (NEDO)

Abstract

Knowledge graphs are useful for many AI tasks but often have missing facts. To populate them, knowledge graph embedding models have been developed. TransE is one such model and the first translation-based method. TransE is well known because, although its principle seems very simple, it effectively captures the rules of a knowledge graph. However, TransE has problems with its regularization and an unchangeable ratio of negative sampling. In this paper, we generalize TransE to solve these problems by proposing knowledge graph embedding on a Lie group (KGLG) and the Weighted Negative Part (WNP) method for the objective function of translation-based models. KGLG is a novel translation-based method that embeds the entities and relations of a knowledge graph on any Lie group. It allows us to dispense with regularization during training if we choose a compact Lie group as the embedding space. The WNP method changes the ratio of negative sampling, which enhances translation-based models. Our approach outperforms other state-of-the-art approaches such as TransE, DistMult, and ComplEx on a standard link prediction task. We also show that TorusE, KGLG on a torus, scales to large knowledge graphs and is faster than the original TransE.
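To make the translation principle above concrete, the following is a minimal sketch in Python with NumPy; the names, dimensions, and distance function are illustrative assumptions, not the authors' implementation. It scores a triple (h, r, t) by how closely h + r approximates t, once in ordinary Euclidean space as in TransE and once on the n-torus, a compact Lie group, where no norm regularization is needed.

    # Illustrative sketch only: hypothetical names and toy dimensions,
    # not the code released with the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 4  # toy embedding dimension

    # TransE principle: for a true triple (h, r, t), h + r should be
    # close to t in R^n, so the score is the distance ||h + r - t||_1.
    def transe_score(h, r, t):
        return np.linalg.norm(h + r - t, ord=1)

    # Torus variant (KGLG on a torus, TorusE-style): points live on
    # [0, 1)^n, a compact Lie group, and each coordinate difference is
    # measured modulo 1, so embedding norms never need to be constrained.
    def torus_score(h, r, t):
        d = (h + r - t) % 1.0
        return float(np.sum(np.minimum(d, 1.0 - d)))

    h, r, t = rng.random(dim), rng.random(dim), rng.random(dim)
    print(transe_score(h, r, t), torus_score(h, r, t))

Lower scores indicate more plausible triples; on the torus the distance is bounded and invariant under the group operation, which is what removes the need for the regularization that TransE requires.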

Authors

Takuma Ebisu, Ryutaro Ichise
