Journal
JOURNAL OF MACHINE LEARNING RESEARCH
Volume 22, Issue -, Pages -
Publisher
MICROTOME PUBL
Keywords
Ising Model; Chow-Liu Algorithm; Structure Learning; Predictive Learning; Distribution Estimation; Noisy Data; Hidden Markov Random Fields
This paper provides high-probability sample complexity guarantees for exact structure recovery and accurate predictive learning in tree-shaped graphical models. By quantifying how noise in the hidden model affects structure recovery and marginal distribution estimation, the study establishes upper and lower bounds on sample complexity that generalize prior work and recover the noiseless case.
We provide high-probability sample complexity guarantees for exact structure recovery and accurate predictive learning using noise-corrupted samples from an acyclic (tree-shaped) graphical model. The hidden variables follow a tree-structured Ising model distribution, whereas the observable variables are generated by a binary symmetric channel taking the hidden variables as its input (flipping each bit independently with some constant probability q ∈ [0, 1/2)). In the absence of noise, predictive learning on Ising models was recently studied by Bresler and Karzand (2020); this paper quantifies how noise in the hidden model impacts the tasks of structure recovery and marginal distribution estimation by proving upper and lower bounds on the sample complexity. Our results generalize state-of-the-art bounds reported in prior work, and they exactly recover the noiseless case (q = 0). In fact, for any tree with p vertices and probability of incorrect recovery δ > 0, the sufficient number of samples remains logarithmic as in the noiseless case, i.e., O(log(p/δ)), while the dependence on q is O(1/(1 − 2q)^4), for both aforementioned tasks. We also present a new equivalent of Isserlis' Theorem for sign-valued tree-structured distributions, yielding a new low-complexity algorithm for higher-order moment estimation.
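The pipeline the abstract describes can be sketched end to end: sample a zero-field tree-structured Ising model, pass the spins through a binary symmetric channel with flip probability q, and run a Chow-Liu-style algorithm on the noisy samples. This is a minimal illustration, not the paper's implementation: the function names, the path-graph topology, and the parameter values are assumptions, and the maximum spanning tree over absolute empirical correlations stands in for the mutual-information form of Chow-Liu (the two coincide for sign-valued, symmetric variables).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_tree_ising(edges, corrs, n, p):
    """Draw n sign-valued samples from a zero-field tree Ising model.
    Along each edge with correlation mu, a child copies its parent's
    spin with probability (1 + mu)/2 (so E[X_u X_v] = mu)."""
    X = np.empty((n, p), dtype=int)
    X[:, 0] = rng.choice([-1, 1], size=n)  # root spin is uniform
    for (u, v), mu in zip(edges, corrs):   # edges listed parent-first
        agree = rng.random(n) < (1 + mu) / 2
        X[:, v] = np.where(agree, X[:, u], -X[:, u])
    return X

def chow_liu_tree(X):
    """Maximum spanning tree on |empirical correlation| via Prim's
    algorithm; equivalent to Chow-Liu for sign-valued symmetric models."""
    n, p = X.shape
    C = np.abs(X.T @ X / n)                # |empirical pairwise correlations|
    in_tree, tree = {0}, []
    while len(in_tree) < p:
        best = max(((u, v) for u in in_tree
                    for v in range(p) if v not in in_tree),
                   key=lambda e: C[e[0], e[1]])
        tree.append(tuple(sorted(best)))
        in_tree.add(best[1])
    return sorted(tree)

p, n, q = 8, 20000, 0.1
edges = [(i, i + 1) for i in range(p - 1)]   # hidden tree: a path graph
corrs = [0.8] * (p - 1)
X = sample_tree_ising(edges, corrs, n, p)    # hidden spins
noise = rng.random(X.shape) < q              # binary symmetric channel
Y = np.where(noise, -X, X)                   # observed: each spin flipped w.p. q
print(chow_liu_tree(Y) == sorted(edges))
```

The channel attenuates every pairwise correlation by the factor (1 − 2q)², which shrinks the gap Chow-Liu relies on between edge and non-edge correlations; this is the mechanism behind the O(1/(1 − 2q)^4) dependence in the sample complexity bounds.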