Journal
JOURNAL OF MACHINE LEARNING RESEARCH
Volume 22, Issue -, Pages -
Publisher
MICROTOME PUBL
Keywords
Bayesian inference; convolution; L-p space; noisy-or; max-convolution; loopy belief propagation; sum-product inference; max-product inference; Cartesian product
Funding
- National Science Foundation [1845465]
- Directorate for Biological Sciences
- Division of Biological Infrastructure [1845465] (Funding Source: National Science Foundation)
Various methods exist for computing marginals involving a linear Diophantine constraint on random variables, each with limitations. This study introduces a new approach, the trimmed p-convolution tree, which generalizes the applicability of existing methods and achieves a better runtime. Additionally, two different methods for approximating max-convolution are introduced using Cartesian product trees.
Multiple methods exist for computing marginals involving a linear Diophantine constraint on random variables. Each of these extant methods has some limitation on the dimension and support, or on the type of marginal computed (e.g., sum-product inference, max-product inference, maximum a posteriori, etc.). Here, we introduce the trimmed p-convolution tree, an approach that generalizes the applicability of the existing methods and achieves a runtime within a log-factor of, or better than, the best existing methods. We also introduce a second form of trimming, which we call underflow/overflow trimming, that aggregates events landing outside the support of a random variable into the nearest supported value. Trimmed p-convolution trees, with and without underflow/overflow trimming, are used in different protein inference models. Then two different methods of approximating max-convolution using Cartesian product trees are introduced.
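The paper's trimmed p-convolution tree is more involved, but the two core primitives the abstract refers to can be sketched simply. Below is a minimal illustration (the names `convolve_tree` and `max_convolve` are ours, not the paper's): an untrimmed convolution tree that combines the PMFs of independent discrete variables pairwise to get the distribution of their sum (sum-product inference over the Diophantine constraint X_1 + ... + X_n = k, using standard p=1 convolution), and the naive quadratic max-convolution that the Cartesian-product-tree methods approximate more efficiently.

```python
import numpy as np

def convolve_tree(pmfs):
    """Combine PMFs pairwise in a balanced binary tree so that the
    result is the PMF of the sum X_1 + ... + X_n. Each level halves
    the number of arrays via standard (p=1) convolution."""
    layer = [np.asarray(p, dtype=float) for p in pmfs]
    while len(layer) > 1:
        nxt = [np.convolve(layer[i], layer[i + 1])
               for i in range(0, len(layer) - 1, 2)]
        if len(layer) % 2:          # odd element carries up unchanged
            nxt.append(layer[-1])
        layer = nxt
    return layer[0]

def max_convolve(f, g):
    """Naive O(len(f)*len(g)) max-convolution:
    out[k] = max over i+j=k of f[i]*g[j] (max-product inference)."""
    out = np.zeros(len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] = max(out[i + j], fi * gj)
    return out

# Sum of three fair coins (support {0,1}) is Binomial(3, 0.5):
pmf = convolve_tree([[0.5, 0.5]] * 3)   # [0.125, 0.375, 0.375, 0.375? no -> 0.125]
mc = max_convolve([0.5, 0.5], [0.5, 0.5])
```

With n variables of support size m, the tree performs O(log n) levels of convolutions; the trimming in the paper further restricts each intermediate array to the entries that can actually contribute to the queried marginal.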