Article

Relphormer: Relational Graph Transformer for Knowledge Graph Representations

Journal

NEUROCOMPUTING
Volume 566

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2023.127044

Keywords

Knowledge graph; Knowledge graph representation; Transformer

In this paper, we propose a new variant of Transformer called Relphormer for knowledge graph representations. We introduce Triple2Seq to dynamically sample contextualized sub-graph sequences as input, alleviating the heterogeneity issue. We also propose a novel structure-enhanced self-attention mechanism to encode relational information. Experimental results show that Relphormer outperforms baseline models.
Transformers have achieved remarkable performance in widespread fields, including natural language processing, computer vision, and graph mining. However, vanilla Transformer architectures have not yielded promising improvements for Knowledge Graph (KG) representations, an area still dominated by the translational distance paradigm. Vanilla Transformer architectures struggle to capture the intrinsically heterogeneous structural and semantic information of knowledge graphs. To this end, we propose a new variant of Transformer for knowledge graph representations dubbed Relphormer. Specifically, we introduce Triple2Seq, which dynamically samples contextualized sub-graph sequences as input to alleviate the heterogeneity issue. We propose a novel structure-enhanced self-attention mechanism to encode relational information while preserving the semantic information within entities and relations. Moreover, we utilize masked knowledge modeling for general knowledge graph representation learning, which can be applied to various KG-based tasks including knowledge graph completion, question answering, and recommendation. Experimental results on six datasets show that Relphormer obtains better performance than baselines.
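
The core mechanism described in the abstract is the structure-enhanced self-attention, which adds a bias derived from the sampled contextual sub-graph's structure to the standard attention scores. The sketch below is a minimal, illustrative reading of that idea, assuming a single attention head and an additive per-pair bias computed from a row-normalized adjacency matrix of the sub-graph; the class name StructureEnhancedSelfAttention and the bias construction are assumptions for illustration, not the paper's released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class StructureEnhancedSelfAttention(nn.Module):
    """Single-head self-attention with an additive structural bias.

    Each token corresponds to an entity or relation node in a
    contextualized sub-graph sequence (as produced by Triple2Seq);
    the bias term injects the sub-graph structure into the attention map.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, struct_bias: torch.Tensor) -> torch.Tensor:
        # x:           (batch, seq_len, dim)     node/token embeddings
        # struct_bias: (batch, seq_len, seq_len) bias derived from the
        #              sub-graph adjacency (assumed form, for illustration)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        scores = scores + struct_bias            # structure-enhanced term
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)

# Toy usage: a sampled sub-graph sequence of 5 entity/relation tokens.
batch, seq_len, dim = 2, 5, 64
x = torch.randn(batch, seq_len, dim)
adj = (torch.rand(batch, seq_len, seq_len) > 0.5).float()          # toy adjacency
struct_bias = adj / adj.sum(dim=-1, keepdim=True).clamp(min=1.0)   # row-normalize
out = StructureEnhancedSelfAttention(dim)(x, struct_bias)
print(out.shape)  # torch.Size([2, 5, 64])

In this reading, tokens that are adjacent in the sampled sub-graph receive a larger attention bias, which is one way to keep relational structure visible to the Transformer while the token embeddings carry the entity and relation semantics.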
