Article

Relative Density-Ratio Estimation for Robust Distribution Comparison

Journal

NEURAL COMPUTATION
Volume 25, Issue 5, Pages 1324-1370

Publisher

MIT PRESS
DOI: 10.1162/NECO_a_00442


Funding

  1. JST PRESTO program
  2. MEXT KAKENHI [22700289]
  3. Aihara Project
  4. FIRST program from JSPS
  5. FIRST program
  6. SCAT
  7. AOARD
  8. CSTP
  9. [20700251]
  10. Grants-in-Aid for Scientific Research [22700289, 24500340] Funding Source: KAKEN

Abstract

Divergence estimators based on direct approximation of density ratios, without going through separate approximation of the numerator and denominator densities, have been successfully applied to machine learning tasks that involve distribution comparison, such as outlier detection, transfer learning, and two-sample homogeneity testing. However, since density-ratio functions often fluctuate strongly, divergence estimation is a challenging task in practice. In this letter, we use relative divergences for distribution comparison, which involve the approximation of relative density ratios. Since relative density ratios are always smoother than the corresponding ordinary density ratios, our proposed method is favorable in terms of nonparametric convergence speed. Furthermore, we show that the proposed divergence estimator has asymptotic variance independent of the model complexity under a parametric setup, implying that the proposed estimator hardly overfits even with complex models. Through experiments, we demonstrate the usefulness of the proposed approach.
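To make the idea concrete: the relative density ratio with mixing parameter α is r_α(x) = p(x) / (α p(x) + (1 − α) q(x)), which reduces to the ordinary ratio p(x)/q(x) at α = 0 and is bounded above by 1/α for α > 0, which is why it is smoother and easier to estimate. The sketch below fits r_α directly by least squares with a Gaussian-kernel linear model, in the spirit of the direct ratio estimators the abstract describes. This is an illustrative reconstruction, not the authors' reference implementation; the kernel width `sigma`, regularizer `lam`, and the choice of centers are assumptions (in practice they would be tuned, e.g. by cross-validation).

```python
import numpy as np

def fit_relative_ratio(x_p, x_q, alpha=0.5, sigma=1.0, lam=0.1):
    """Least-squares fit of the relative density ratio
    r_alpha(x) = p(x) / (alpha*p(x) + (1-alpha)*q(x))
    with a Gaussian-kernel linear model g(x) = sum_l theta_l K(x, c_l).
    Illustrative sketch; hyperparameters are assumed, not tuned.
    """
    def kernel(x, centers):
        # Gaussian kernel matrix, shape (n_samples, n_centers)
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    centers = x_p                      # use p-samples as kernel centers
    K_p = kernel(x_p, centers)
    K_q = kernel(x_q, centers)
    # Squared-loss expansion gives a linear system:
    #   (alpha*E_p[k k^T] + (1-alpha)*E_q[k k^T] + lam*I) theta = E_p[k]
    H = (alpha * K_p.T @ K_p / len(x_p)
         + (1.0 - alpha) * K_q.T @ K_q / len(x_q))
    h = K_p.mean(axis=0)
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: kernel(x, centers) @ theta

# Usage: two 1-D Gaussians with shifted means; the estimated ratio
# should be larger where p dominates than far out in q's direction.
rng = np.random.default_rng(0)
x_p = rng.normal(0.0, 1.0, size=(200, 1))
x_q = rng.normal(0.5, 1.0, size=(200, 1))
r = fit_relative_ratio(x_p, x_q, alpha=0.5)
vals = r(np.array([[0.0], [2.0]]))
```

Note the key point from the abstract: because r_α is bounded by 1/α, the least-squares problem above involves a smoother target than the plain ratio p/q, which is what improves the nonparametric convergence behavior.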
