Article

Smoothing unadjusted Langevin algorithms for nonsmooth composite potential functions

Journal

APPLIED MATHEMATICS AND COMPUTATION
Volume 464

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.amc.2023.128377

Keywords

Bayesian learning; Nonsmooth sampling; Convex optimization; MCMC methods; Langevin equation


This paper proposes a gradient-based Markov Chain Monte Carlo (MCMC) method for sampling from the posterior distribution of problems with nonsmooth potential functions. Following the Bayesian paradigm, the potential function is the sum of two convex functions, one of which is smooth. The potential is first approximated by the so-called forward-backward envelope, a real-valued smooth function with the same critical points as the original. This smoothing technique is then incorporated into the unadjusted Langevin algorithm (ULA), yielding a smoothing ULA called SULA. Non-asymptotic convergence results for SULA are established under mild assumptions on the original potential function, and numerical results demonstrate its promising performance on both synthetic and real chemoinformatics data.
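To illustrate the idea of running ULA on a smoothed composite potential, the sketch below uses a Moreau-envelope smoothing of an l1 penalty (a simpler stand-in for the paper's forward-backward envelope; the function names and the lasso-type potential U(x) = 0.5||Ax - b||^2 + lam*||x||_1 are illustrative assumptions, not the authors' code):

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t*||.||_1: componentwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def smoothed_ula(A, b, lam, gamma, n_iter=5000, seed=0):
    """Unadjusted Langevin iteration on a smoothed composite potential.

    Illustrative potential: U(x) = 0.5*||Ax - b||^2 + lam*||x||_1.
    The nonsmooth l1 term is replaced by its Moreau envelope, whose
    gradient is (x - prox_{gamma*g}(x)) / gamma. This is a simplified
    stand-in for the forward-backward envelope used in the paper.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    samples = []
    for _ in range(n_iter):
        grad_f = A.T @ (A @ x - b)                             # gradient of the smooth part
        grad_g = (x - soft_threshold(x, gamma * lam)) / gamma  # Moreau-envelope gradient
        noise = np.sqrt(2.0 * gamma) * rng.standard_normal(x.shape)
        x = x - gamma * (grad_f + grad_g) + noise              # one Langevin step
        samples.append(x.copy())
    return np.array(samples)
```

The returned array can then be treated as (approximate) posterior samples after discarding a burn-in portion; the step size gamma trades off discretization bias against mixing speed, as the non-asymptotic bounds in the paper quantify.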

