4.1 Review

Symbolic, Distributed, and Distributional Representations for Natural Language Processing in the Era of Deep Learning: A Survey

Journal

FRONTIERS IN ROBOTICS AND AI
Volume 6, Issue -, Pages -

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/frobt.2019.00153

Keywords

natural language processing (NLP); distributed representation; concatenative compositionality; deep learning (DL); compositional distributional semantic models; compositionality

Natural language is inherently a discrete, symbolic representation of human knowledge. Recent advances in machine learning (ML) and natural language processing (NLP) seem to contradict this intuition: discrete symbols are fading away, replaced by vectors and tensors called distributed and distributional representations. However, there is a strict link between distributed/distributional representations and discrete symbols, the former being an approximation of the latter. A clearer understanding of this link may lead to radically new deep learning networks. In this paper we present a survey that aims to renew the link between symbolic representations and distributed/distributional representations. This is the right time to revitalize the study of how discrete symbols are represented inside neural networks.
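The abstract's claim that distributed representations approximate discrete symbols can be illustrated with a minimal sketch (the vocabulary, dimensions, and random-projection scheme below are illustrative assumptions, not the paper's method): one-hot vectors encode symbols as exactly orthogonal axes, while a random projection into a lower-dimensional dense space preserves that orthogonality only approximately.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symbolic view: each word is a discrete symbol, i.e. a one-hot vector.
# Distinct symbols are exactly orthogonal (dot product 0).
vocab = ["cat", "dog", "tree", "runs"]  # illustrative toy vocabulary
n = len(vocab)
one_hot = np.eye(n)  # shape (4, 4)

# Distributed view: project the symbols into a lower-dimensional dense
# space with a random Gaussian matrix (a Johnson-Lindenstrauss-style
# sketch; the dimension d is an arbitrary choice for illustration).
d = 3
W = rng.normal(0.0, 1.0 / np.sqrt(d), size=(n, d))
distributed = one_hot @ W  # each row: a dense vector for one symbol

# In the distributed space, distinct symbols are only approximately
# orthogonal: off-diagonal entries of the Gram matrix are small but
# nonzero, so the dense code is an approximation of the symbolic one.
gram = distributed @ distributed.T
print(np.round(gram, 2))
```

Increasing `d` toward the vocabulary size shrinks the off-diagonal entries, recovering the exact symbolic case in the limit.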
