Article

History Marginalization Improves Forecasting in Variational Recurrent Neural Networks

Journal

ENTROPY
Volume 23, Issue 12, Pages -

Publisher

MDPI
DOI: 10.3390/e23121563

Keywords

sequential latent variable models; time series forecasting; variational inference

Summary

The article highlights the importance of deep probabilistic time series forecasting models and argues that the inference models paired with existing generative models are often too limited, causing predictions to average over distinct modes of the dynamics. To capture multi-modality, the authors develop a variational dynamic mixtures (VDM) model. Empirical studies show that VDM outperforms competing methods on highly multi-modal datasets.

Abstract
Deep probabilistic time series forecasting models have become an integral part of machine learning. While several powerful generative models have been proposed, we provide evidence that their associated inference models are oftentimes too limited and cause the generative model to predict mode-averaged dynamics. Mode-averaging is problematic since many real-world sequences are highly multi-modal, and their averaged dynamics are unphysical (e.g., predicted taxi trajectories might run through buildings on the street map). To better capture multi-modality, we develop variational dynamic mixtures (VDM): a new variational family to infer sequential latent variables. The VDM approximate posterior at each time step is a mixture density network, whose parameters come from propagating multiple samples through a recurrent architecture. This results in an expressive multi-modal posterior approximation. In an empirical study, we show that VDM outperforms competing approaches on highly multi-modal datasets from different domains.
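
The mechanism the abstract describes lends itself to a short sketch. Below is a minimal, hypothetical PyTorch illustration of a VDM-style posterior, not the authors' implementation: at each time step, K latent samples are propagated through a recurrent cell, and each resulting hidden state parameterizes one component of a Gaussian mixture over the next latent variable. All class names, layer choices, and dimensions (VDMPosteriorSketch, the GRU cell, the sizes) are assumptions made for illustration.

```python
# A minimal sketch (under assumed names and sizes) of a VDM-style
# approximate posterior. This is NOT the authors' implementation: at each
# time step, K latent samples are pushed through a recurrent cell, and each
# resulting hidden state parameterizes one component of a Gaussian mixture
# over the next latent variable z_t, yielding a multi-modal posterior.
import torch
import torch.nn as nn
import torch.distributions as D

class VDMPosteriorSketch(nn.Module):
    def __init__(self, x_dim=2, z_dim=8, h_dim=32, n_components=4):
        super().__init__()
        self.K, self.z_dim = n_components, z_dim
        # The recurrent cell consumes the observation plus one latent sample.
        self.rnn = nn.GRUCell(x_dim + z_dim, h_dim)
        # MDN head: one mixture-weight logit, mean, and log-scale per component.
        self.mdn = nn.Linear(h_dim, 1 + 2 * z_dim)

    def step(self, x_t, z_prev, h_prev):
        # Propagate each of the K previous latent samples through the RNN;
        # row k of h_t is the hidden state induced by sample k.
        inp = torch.cat([x_t.expand(self.K, -1), z_prev], dim=-1)
        h_t = self.rnn(inp, h_prev)                    # (K, h_dim)
        params = self.mdn(h_t)                         # (K, 1 + 2*z_dim)
        logit_w = params[:, 0]                         # mixture weights
        mu = params[:, 1:1 + self.z_dim]               # component means
        sigma = params[:, 1 + self.z_dim:].exp()       # component scales
        posterior = D.MixtureSameFamily(
            D.Categorical(logits=logit_w),
            D.Independent(D.Normal(mu, sigma), 1),
        )
        # Draw K fresh samples so the next step again tracks K hypotheses.
        z_t = posterior.sample((self.K,))
        return z_t, h_t, posterior

# Toy usage on a random length-10, 2-dimensional sequence.
model = VDMPosteriorSketch()
z, h = torch.zeros(4, 8), torch.zeros(4, 32)
for x_t in torch.randn(10, 2):
    z, h, posterior = model.step(x_t, z, h)
```

During training, the log-probabilities of this mixture would enter an ELBO; the sketch only illustrates how propagating multiple samples through a recurrent architecture produces a mixture density network at each step.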

