Journal
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS
Volume 12, Issue 2, Pages 466-475
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TITS.2010.2093575
Keywords
Bayesian learning; Dirichlet process; Gaussian process; traffic flow prediction; variational inference
Funding
- National Natural Science Foundation of China [61075005]
This paper proposes a new variational approximation for infinite mixtures of Gaussian processes. As an extension of the single Gaussian process regression model, mixtures of Gaussian processes can characterize varying covariances or multimodal data and mitigate the cubic computational complexity of the single Gaussian process model. The infinite mixture of Gaussian processes further incorporates a Dirichlet process prior, which allows the number of mixture components to be determined automatically from the data. We use variational inference and a truncated stick-breaking representation of the Dirichlet process to approximate the posterior over the hidden variables of the model. To estimate the hyperparameters of the model, the variational EM algorithm and a greedy algorithm are employed. In addition to presenting the variational infinite-mixture model, we apply it to the problem of traffic flow prediction. Experiments with comparisons to other approaches show the effectiveness of the proposed model.
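The truncated stick-breaking representation mentioned in the abstract can be illustrated with a short sketch. This is a minimal example of drawing Dirichlet process mixture weights at a fixed truncation level, not code from the paper; the function name and parameters are hypothetical.

```python
import numpy as np

def stick_breaking_weights(alpha, T, seed=None):
    """Draw mixture weights from a truncated stick-breaking construction.

    alpha: DP concentration parameter (larger -> more active components).
    T:     truncation level, i.e., the maximum number of components.
    Returns a length-T weight vector that sums to 1.
    """
    rng = np.random.default_rng(seed)
    # Beta(1, alpha) stick fractions; the last fraction is set to 1
    # so the final component absorbs the remaining stick mass.
    v = rng.beta(1.0, alpha, size=T)
    v[-1] = 1.0
    # Weight k is v_k times the stick length left after the first k-1 breaks.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

weights = stick_breaking_weights(alpha=2.0, T=10, seed=0)
```

With a small concentration parameter, most of the mass typically falls on the first few components, which is how the Dirichlet process prior effectively selects the number of mixture components from the data.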