Journal
ENTROPY
Volume 24, Issue 3
Publisher
MDPI
DOI: 10.3390/e24030374
Keywords
information field theory; artificial intelligence; generative models; variational inference
Information field theory (IFT), the information theory for fields, is a mathematical framework for signal reconstruction and non-parametric inverse problems. Artificial intelligence (AI) and machine learning (ML) aim at generating intelligent systems, including systems for perception, cognition, and learning. This overlaps with IFT, which is designed to address perception, reasoning, and inference tasks. Here, the relation between concepts and tools in IFT and those in AI and ML research is discussed. In the context of IFT, fields denote physical quantities that change continuously as a function of space (and time), and information theory refers to Bayesian probabilistic logic equipped with the associated entropic information measures. Reconstructing a signal with IFT is a computational problem similar to training a generative neural network (GNN) in ML. In this paper, the process of inference in IFT is reformulated in terms of GNN training. In contrast to classical neural networks, IFT-based GNNs can operate without pre-training, thanks to the incorporation of expert knowledge into their architecture. Furthermore, the cross-fertilization of variational inference methods used in IFT and ML is discussed. These discussions suggest that IFT is well suited to address many problems in AI and ML research and application.
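To make the abstract's central claim concrete, the simplest IFT reconstruction is the Wiener filter: for data d = R s + n with a Gaussian signal prior and Gaussian noise, the posterior mean of the field is m = D j, with information source j = Rᵀ N⁻¹ d and information propagator D = (S⁻¹ + Rᵀ N⁻¹ R)⁻¹. The sketch below is an illustrative toy example (not code from the paper); the grid size, power spectrum, response, and noise level are all assumptions chosen for demonstration.

```python
import numpy as np

# Toy Wiener-filter field reconstruction (hypothetical setup, not from the paper).
rng = np.random.default_rng(0)
npix = 64

# Signal covariance S: stationary prior built from an assumed power-law
# power spectrum, so S is circulant and diagonalized by the Fourier transform.
k = np.fft.fftfreq(npix) * npix
power = 50.0 / (1.0 + k**2)                 # assumed power spectrum P(k)
F = np.fft.fft(np.eye(npix))                # DFT matrix
S = (F.conj().T @ np.diag(power) @ F).real / npix
S = (S + S.T) / 2                           # enforce exact symmetry

R = np.eye(npix)                            # trivial response: direct observation
N = 0.5 * np.eye(npix)                      # noise covariance

# Draw a synthetic signal s and noisy data d = R s + n.
s = rng.multivariate_normal(np.zeros(npix), S)
d = R @ s + rng.multivariate_normal(np.zeros(npix), N)

# Posterior mean m = D j with j = R^T N^-1 d and D = (S^-1 + R^T N^-1 R)^-1.
Ninv = np.linalg.inv(N)
D = np.linalg.inv(np.linalg.inv(S) + R.T @ Ninv @ R)
m = D @ (R.T @ Ninv @ d)

# The reconstruction m should lie closer to the true field s than the raw data d.
mse_posterior = np.mean((m - s) ** 2)
mse_data = np.mean((d - s) ** 2)
```

In the paper's framing, the same posterior mean can be obtained by optimizing a generative model that maps latent variables through the prior and response; the closed-form linear solution above is the special case where everything is Gaussian and linear.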