Article

Bayesian Citation-KNN with distance weighting

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s13042-013-0152-x

Keywords

Multi-instance learning; KNN; Bayesian-KNN; Citation-KNN; Bayesian Citation-KNN; Distance weighting

Funding

  1. National Natural Science Foundation of China [61203287]
  2. Program for New Century Excellent Talents in University [NCET-12-0953]
  3. Provincial Natural Science Foundation of Hubei [2011CDA103]
  4. Fundamental Research Funds for the Central Universities

Abstract

Multi-instance (MI) learning is receiving growing attention in the machine learning research field; in MI learning, examples are represented by a bag of instances instead of a single instance. K-nearest-neighbor (KNN) is a simple and effective classification model in traditional supervised learning. Two of its variants, Bayesian-KNN (BKNN) and Citation-KNN (CKNN), have been proposed and are widely used for solving multi-instance classification problems. However, CKNN still applies the simplest majority-vote approach among the references and citers to classify unseen bags. In this paper, we propose an improved algorithm called Bayesian Citation-KNN (BCKNN). For each unseen bag, BCKNN first finds its k references and q citers; a Bayesian approach is then applied to the k references, and a distance-weighted majority vote is applied to the q citers. The experimental results on several benchmark datasets show that our BCKNN is generally better than the previous BKNN and CKNN. Moreover, BCKNN maintains almost the same order of computational overhead as CKNN.
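
The abstract describes the classification rule only at a high level: a Bayesian vote over the k references and a distance-weighted vote over the q citers, combined to label the unseen bag. The sketch below is one illustrative reading of that description, not the authors' implementation: the minimal Hausdorff bag distance, the Laplace-smoothed class posterior for the references, the inverse-distance weights for the citers, the equal combination of the two normalized scores, and the helper names min_hausdorff and bcknn_predict are all assumptions made here for clarity.

import numpy as np

def min_hausdorff(bag_a, bag_b):
    # Minimal Hausdorff distance between two bags (2-D arrays of instances);
    # this bag-level metric is the one commonly paired with Citation-KNN (assumed here).
    diffs = bag_a[:, None, :] - bag_b[None, :, :]
    return np.sqrt((diffs ** 2).sum(-1)).min()

def bcknn_predict(query, bags, labels, k=3, q=5):
    # Illustrative BCKNN-style prediction for one unseen bag (a sketch, not the paper's code).
    labels = np.asarray(labels)
    classes = np.unique(labels)

    # Distances from the query bag to every training bag; the k closest are its references.
    d_query = np.array([min_hausdorff(query, b) for b in bags])
    refs = np.argsort(d_query)[:k]

    # Citers: training bags that would rank the query among their own q nearest bags.
    citers = []
    for i, b in enumerate(bags):
        d_i = [min_hausdorff(b, other) for j, other in enumerate(bags) if j != i]
        if d_query[i] <= np.sort(d_i)[min(q, len(d_i)) - 1]:
            citers.append(i)

    # Bayesian vote over the reference labels: class prior times a
    # Laplace-smoothed fraction of references carrying that label (simplified assumption).
    prior = {c: (labels == c).mean() for c in classes}
    post = {c: prior[c] * ((labels[refs] == c).sum() + 1) / (k + len(classes))
            for c in classes}

    # Distance-weighted majority vote over the citers (inverse-distance weights assumed).
    weight = {c: sum(1.0 / (d_query[i] + 1e-12) for i in citers if labels[i] == c)
              for c in classes}

    # Combine the two normalized scores with equal weight (an assumption).
    def norm(s):
        total = sum(s.values()) or 1.0
        return {c: v / total for c, v in s.items()}
    post, weight = norm(post), norm(weight)
    return max(classes, key=lambda c: post[c] + weight[c])

In this naive form the citer search recomputes all pairwise bag distances for every query; caching that distance matrix once over the training set is the natural optimization, and it is consistent with the abstract's remark that BCKNN keeps roughly the same order of computational overhead as CKNN.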
