Article

Categorical representation learning: morphism is all you need

Journal

Machine Learning: Science and Technology

Publisher

IOP Publishing Ltd
DOI: 10.1088/2632-2153/ac2c5d

Keywords

category theory; categorical representation learning; natural language processing (NLP)

We provide a construction for categorical representation learning and introduce the foundations of the 'categorifier'. The central theme in representation learning is the idea of 'everything to vector': every object in a dataset $\mathcal{S}$ can be represented as a vector in $\mathbb{R}^n$ by an encoding map $E : \mathrm{Obj}(\mathcal{S}) \to \mathbb{R}^n$. More importantly, every morphism can be represented as a matrix, $E : \mathrm{Hom}(\mathcal{S}) \to \mathbb{R}^{n \times n}$. The encoding map $E$ is generally modeled by a deep neural network. The goal of representation learning is to design appropriate tasks on the dataset to train the encoding map, under the assumption that an encoding is optimal if it universally optimizes performance across a variety of tasks. This, however, is still a set-theoretic approach. The goal of the current article is to promote representation learning to a new level via a category-theoretic approach. As a proof of concept, we provide an example of a text translator equipped with our technology, showing that our categorical learning model outperforms current deep-learning models by a factor of 17. The content of this article is part of a US provisional patent application filed by QGNai, Inc.
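
The abstract describes objects encoded as vectors in $\mathbb{R}^n$ and morphisms as matrices in $\mathbb{R}^{n \times n}$. The sketch below is a rough illustration of that idea only, not the authors' implementation: it trains an encoding in PyTorch so that each morphism matrix maps the vector of its source object toward the vector of its target object. All names, dimensions, and the squared-error objective are assumptions made for illustration.

```python
# A minimal sketch of the 'everything to vector' idea from the abstract,
# NOT the authors' implementation: object and morphism counts, the dimension
# n, and the squared-error objective are all illustrative assumptions.
import torch
import torch.nn as nn

class CategoricalEncoder(nn.Module):
    def __init__(self, num_objects: int, num_morphisms: int, n: int):
        super().__init__()
        # E : Obj(S) -> R^n, one learnable vector per object
        self.obj = nn.Embedding(num_objects, n)
        # E : Hom(S) -> R^{n x n}, one learnable matrix per morphism
        self.hom = nn.Parameter(torch.randn(num_morphisms, n, n) / n ** 0.5)

    def forward(self, f: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # Apply the morphism matrix E(f) to the source-object vector E(x).
        return torch.einsum("bij,bj->bi", self.hom[f], self.obj(x))

# Train so that E(f) E(x) approaches E(y) for each observed morphism f: x -> y.
model = CategoricalEncoder(num_objects=1000, num_morphisms=16, n=64)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
f = torch.randint(0, 16, (32,))    # batch of morphism indices (hypothetical data)
x = torch.randint(0, 1000, (32,))  # source-object indices
y = torch.randint(0, 1000, (32,))  # target-object indices, i.e. f: x -> y
loss = ((model(f, x) - model.obj(y)) ** 2).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

One reason to encode morphisms as matrices rather than vectors, as the abstract suggests, is that composition of morphisms then corresponds to matrix multiplication of their representations.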
