Article

Expressivity of Hidden Markov Chains vs. Recurrent Neural Networks From a System Theoretic Viewpoint

Journal

IEEE TRANSACTIONS ON SIGNAL PROCESSING
Volume 71, Issue -, Pages 4178-4191

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TSP.2023.3328108

Keywords

Hidden Markov models; Computational modeling; Time series analysis; Buildings; Bayes methods; Recurrent neural networks; Predictive models; Hidden Markov Chains; Recurrent Neural Networks; generative models; expressivity; modeling power; stochastic realization theory


Abstract

Hidden Markov Chains (HMC) and Recurrent Neural Networks (RNN) are two well-known tools for predicting time series. Even though these solutions were developed independently in distinct communities, they share some similarities when considered as probabilistic structures. In this paper we therefore first consider HMC and RNN as generative models, and we embed both structures in a common generative unified model (GUM). We next address a comparative study of the expressivity (or modeling power) of these models, which here refers to the range of joint probability distributions of an observation sequence induced by the underlying latent variables. To that end, we further assume that the models are linear and Gaussian. The probability distributions produced by these models are characterized by structured covariance series; as a consequence, comparing expressivity reduces to comparing sets of structured covariance series, which enables us to call on stochastic realization theory (SRT). We finally provide conditions under which a given covariance series can be realized by a GUM, an HMC or an RNN.
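To make the linear Gaussian setting concrete, the following is a minimal sketch in standard state-space notation; the symbols F, H, Q, R, P and the covariance series Gamma_k are conventional choices assumed here for illustration, not necessarily the paper's own notation. A linear Gaussian HMC with hidden state x_t and observation y_t reads

% Linear Gaussian HMC in standard state-space form (notation assumed,
% not taken from the paper); (w_t) and (v_t) are independent white noises.
\begin{align}
  x_{t+1} &= F x_t + w_t, & w_t &\sim \mathcal{N}(0, Q), \\
  y_t     &= H x_t + v_t, & v_t &\sim \mathcal{N}(0, R).
\end{align}
% In the stationary regime the zero-mean Gaussian process (y_t) is fully
% described by its covariance series:
\begin{equation}
  \Gamma_0 = H P H^{\top} + R,
  \qquad
  \Gamma_k = \mathbb{E}\!\left[ y_{t+k}\, y_t^{\top} \right]
           = H F^{k} P H^{\top} \quad (k \ge 1),
\end{equation}
% where P is the steady-state covariance of x_t, i.e. the solution of
% the discrete Lyapunov equation P = F P F^{\top} + Q.

Under this reading, each model class (GUM, HMC, RNN) induces a structured family of admissible series (Gamma_k), and the stochastic realization question invoked in the abstract asks when a given covariance series admits a factorization of this form.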
