Article

SoulMate: Short-Text Author Linking Through Multi-Aspect Temporal-Textual Embedding

Journal

IEEE Transactions on Knowledge and Data Engineering

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TKDE.2020.2982148

Keywords

Author linking; short text inference; Word2Vec; temporally multifaceted; semantic understanding

Funding

  1. ST Electronics
  2. National Research Foundation (NRF), Prime Minister's Office, Singapore under Corporate Laboratory @ University Scheme (Programme Title: STEE Infosec - SUTD Corporate Laboratory)

Abstract

The paper introduces a neural network-based temporal-textual framework to generate subgraphs with highly correlated authors from short-text contents. The method calculates relevance scores between authors through a combination of content and concepts, and extracts communities of related authors using a stack-wise graph cutting algorithm.
Linking the authors of short-text contents is useful in many applications, including Named Entity Recognition (NER) and human community detection. However, several challenges lie ahead. First, the input short-text contents are noisy, ambiguous, and do not follow grammatical rules. Second, traditional text mining methods fail to effectively extract concepts from words and phrases. Third, the textual contents are temporally skewed, so semantic understanding must account for multiple time facets. Finally, relying on knowledge bases can bias the results toward the content of the external database and shift the meaning away from the input short-text corpus. To overcome these challenges, we devise a neural network-based temporal-textual framework that generates subgraphs of highly correlated authors from short-text contents. On the one hand, our approach computes the relevance score (edge weight) between authors by combining contents and concepts; on the other hand, it employs a stack-wise graph cutting algorithm to extract communities of related authors. Experimental results show that, compared to knowledge-centered competitors, our multi-aspect vector space model achieves higher performance in linking short-text authors. In addition, for the author linking task, the more comprehensive the dataset, the more significant the extracted concepts.
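The abstract only sketches the pipeline, so the snippet below is a minimal, hypothetical illustration of its content-based side: averaged Word2Vec vectors per author stand in for the multi-aspect embedding, cosine similarity serves as the relevance score (edge weight), and a similarity threshold followed by connected components stands in for the stack-wise graph cutting step. The toy corpus, the threshold value, and the helper functions are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: author linking via averaged Word2Vec embeddings
# and a simple threshold cut. The paper's multi-aspect temporal-textual
# embedding and stack-wise graph cutting algorithm are more involved; the
# threshold and the connected-component step here are assumptions.
import numpy as np
import networkx as nx
from gensim.models import Word2Vec  # gensim >= 4.0 API assumed

# Toy corpus: each author has a few short, noisy texts (hypothetical data).
author_texts = {
    "a1": ["deep learning for text", "noisy short text mining"],
    "a2": ["short text mining methods", "text embedding models"],
    "a3": ["soccer match results", "league table update"],
}

# Train word vectors on all tokenized short texts.
sentences = [t.split() for texts in author_texts.values() for t in texts]
w2v = Word2Vec(sentences, vector_size=32, window=3, min_count=1, seed=0)

def author_vector(texts):
    """Average the word vectors of every token an author wrote."""
    vecs = [w2v.wv[tok] for t in texts for tok in t.split() if tok in w2v.wv]
    return np.mean(vecs, axis=0)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

vectors = {a: author_vector(ts) for a, ts in author_texts.items()}

# Build the author graph: edge weight = content-based relevance score.
G = nx.Graph()
G.add_nodes_from(vectors)
authors = list(vectors)
for i, a in enumerate(authors):
    for b in authors[i + 1:]:
        w = cosine(vectors[a], vectors[b])
        if w > 0.5:  # assumed cut-off standing in for the graph cutting step
            G.add_edge(a, b, weight=w)

# Communities of related authors = connected components after the cut.
print([sorted(c) for c in nx.connected_components(G)])
```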
