Article

Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence

Journal

NEURAL COMPUTATION
Volume 30, Issue 7, Pages 1930-1960

Publisher

MIT PRESS
DOI: 10.1162/neco_a_01092

Keywords

-

Funding

  1. JST CREST [JPMJCR1403]
  2. KAKENHI [RAS 15H06823]
  3. BK21Plus [MITIP-10048320]
  4. U.S. NSF
  5. NIH
  6. ONR
  7. ARL
  8. AFOSR
  9. DOT
  10. DARPA
  11. [NRF/MSIT-2017R1E1A1A03070945]

Abstract

Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
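For context, the estimators the letter starts from are of the classic k-nearest-neighbor plug-in form (Wang, Kulkarni, and Verdu, 2009). The sketch below illustrates only that baseline estimator under the standard Euclidean metric, not the bias-reduced, metric-learned variant proposed in the paper; the function name `knn_kl_divergence` and the example distributions are illustrative assumptions.

```python
# Minimal sketch of a standard k-NN Kullback-Leibler divergence estimator.
# This is the baseline family the letter builds on, not its proposed method.
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """Estimate D_KL(P || Q) from samples x ~ P (shape (n, d)) and y ~ Q (shape (m, d))."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, d = x.shape
    m = y.shape[0]

    # rho: distance from each x_i to its k-th nearest neighbor among the other
    # samples of P (query k+1 points because the nearest one is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu: distance from each x_i to its k-th nearest neighbor among samples of Q.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # Asymptotically unbiased plug-in estimate; with few samples it can carry a
    # large bias, which is what locally adapting the metric aims to reduce.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p = rng.normal(0.0, 1.0, size=(2000, 2))  # samples from P = N(0, I)
    q = rng.normal(1.0, 1.0, size=(2000, 2))  # samples from Q = N([1, 1], I)
    print(knn_kl_divergence(p, q, k=5))       # true divergence is 1.0
```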

