Journal
INTERNATIONAL STATISTICAL REVIEW
Volume 88, Issue 2, Pages 302-320
Publisher
WILEY
DOI: 10.1111/insr.12360
Keywords
complex data; deep learning; large scale machine learning; non-linear; non-Gaussian; shrinkage
Funding
- US National Science Foundation [DMS-1613063]
Abstract
Since the advent of the horseshoe prior for regularisation, global-local shrinkage methods have proved to be a fertile ground for the development of Bayesian methodology in machine learning, specifically for high-dimensional regression and classification problems. They have achieved remarkable success in computation and enjoy strong theoretical support. Most of the existing literature has focused on the linear Gaussian case, for which systematic surveys are available. The purpose of the current article is to demonstrate that horseshoe regularisation is useful far more broadly, by reviewing both methodological and computational developments in complex models that are more relevant to machine learning applications. Specifically, we focus on methodological challenges of horseshoe regularisation in non-linear and non-Gaussian models, multivariate models and deep neural networks. We also outline recent computational developments in horseshoe shrinkage for complex models, along with a list of available software implementations that allow one to venture out beyond the comfort zone of canonical linear regression problems.
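For readers unfamiliar with the global-local shrinkage construction the abstract refers to, the horseshoe prior places a half-Cauchy-scaled normal on each coefficient: beta_j | lambda_j, tau ~ N(0, lambda_j^2 tau^2), with local scales lambda_j ~ C+(0, 1) and a global scale tau ~ C+(0, 1). The sketch below (an illustration, not code from the paper; the fixed tau = 1 is an assumption for clarity) simulates the implied shrinkage factor kappa_j = 1 / (1 + tau^2 lambda_j^2), whose U-shaped Beta(1/2, 1/2) density gives the prior its name:

```python
import numpy as np

rng = np.random.default_rng(0)

# Horseshoe hierarchy: beta_j | lambda_j, tau ~ N(0, lambda_j^2 tau^2),
# lambda_j ~ C+(0, 1) (local scales), tau ~ C+(0, 1) (global scale).
n = 100_000
tau = 1.0  # global scale fixed at 1 purely for illustration
lam = np.abs(rng.standard_cauchy(n))  # half-Cauchy local scales

# Shrinkage factor kappa_j = 1 / (1 + tau^2 lambda_j^2):
# kappa near 1 means the coefficient is shrunk towards zero (noise),
# kappa near 0 means it is left essentially untouched (signal).
kappa = 1.0 / (1.0 + tau**2 * lam**2)

# With tau = 1, kappa follows a Beta(1/2, 1/2) density: a "horseshoe"
# shape with mass piling up near both 0 and 1, so coefficients are
# either aggressively shrunk or left alone, rarely in between.
hist, _ = np.histogram(kappa, bins=10, range=(0.0, 1.0))
print(hist[0] > hist[4] and hist[-1] > hist[4])  # U-shape: ends exceed middle
```

This "shrink hard or not at all" behaviour is what distinguishes global-local priors such as the horseshoe from ridge-type Gaussian shrinkage, which shrinks all coefficients by a comparable amount.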