Journal
NEURAL COMPUTATION
Volume 30, Issue 7, Pages 1930-1960
Publisher
MIT PRESS
DOI: 10.1162/neco_a_01092
Keywords
-
Funding
- JST CREST [JPMJCR1403]
- KAKENHI [RAS 15H06823]
- BK21Plus [MITIP-10048320]
- U.S. NSF
- NIH
- ONR
- ARL
- AFOSR
- DOT
- DARPA
- NRF/MSIT [2017R1E1A1A03070945]
Abstract
Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
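For context, the sketch below shows the classical k-nearest-neighbor KL divergence estimator that the abstract refers to, with an optional global Mahalanobis metric to illustrate how a modified distance function enters the estimate. This is a minimal illustration, not the paper's locally adaptive metric-learning method; the metric matrix `A` is a hypothetical placeholder that a user would supply (the paper learns it locally from parametric generative models).

```python
# Minimal sketch of a k-NN KL divergence estimator with an optional
# Mahalanobis metric. Assumes only numpy and scipy; the matrix `A` is a
# placeholder, not the locally learned metric proposed in the paper.
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1, A=None):
    """Estimate D_KL(P || Q) from samples x ~ P (n x d) and y ~ Q (m x d).

    If A (d x d, positive definite) is given, distances are measured in the
    Mahalanobis metric d(u, v) = sqrt((u - v)^T A (u - v)), implemented by
    linearly transforming the samples and using Euclidean distances.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    if A is not None:
        # With A = L L^T (Cholesky), the Euclidean distance between z @ L
        # vectors equals the Mahalanobis distance under A.
        L = np.linalg.cholesky(A)
        x = x @ L
        y = y @ L

    # rho: distance from each x_i to its k-th nearest neighbor among the
    # other samples of P (query k+1 neighbors and drop the point itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu: distance from each x_i to its k-th nearest neighbor among the
    # samples of Q.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # Asymptotically unbiased nearest-neighbor estimate of D_KL(P || Q).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

As a quick usage check, calling `knn_kl_divergence(x, y)` on samples drawn from two Gaussians with different means should return a value close to the analytic KL divergence for moderately large sample sizes; with few samples, the estimate is biased, which is the regime the paper's metric-learning approach targets.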