Article

Semi-Supervised Self-Training Method Based on an Optimum-Path Forest

Journal

IEEE ACCESS
Volume 7, Pages 36388-36399

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2903839

Keywords

Self-training method; semi-supervised classification; optimum-path forest; semi-supervised learning

Funding

  1. Natural Science Foundation of China [61272194, 61502060]
  2. Chongqing Science and Technology Project [KJZH17104, cstc2017rgzn-zdyfx0040]


A semi-supervised self-training method can train an effective classifier by exploiting both labeled and unlabeled samples. Recently, a self-training method based on density peaks of data (STDP) was proposed; however, it still has shortcomings to be addressed. For example, STDP depends on the cut-off distance d_c, so selecting an optimal value of this parameter for each data set is difficult. Furthermore, because of d_c, STDP performs poorly on data sets with variations in density. To solve these problems, we present a new self-training method that connects unlabeled and labeled samples as vertices of an optimum-path forest to discover the underlying structure of the feature space, and this structure is then used to guide the self-training of a classifier. Compared with STDP, our algorithm is parameter-free and works better on data sets with variations in density. Moreover, we were surprised to find that our algorithm also has some advantages on overlapping data sets. Experimental results on real data sets clearly demonstrate that our algorithm outperforms some previous works in improving the performance of the base classifiers k-nearest neighbor, support vector machine, and CART.
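The core idea of connecting labeled and unlabeled samples through an optimum-path forest can be sketched as follows. This is a minimal illustration, not the authors' implementation: labeled samples compete as roots, and each unlabeled sample is reached by the root whose path to it has the smallest possible maximum edge length (the f_max path cost commonly used in optimum-path forests), receiving that root's label. The function name `opf_propagate` and the brute-force all-pairs neighborhood are assumptions made for brevity.

```python
import heapq
import math

def opf_propagate(points, labels):
    """Assign a label to every sample by growing an optimum-path forest
    from the labeled samples (entries of `labels` that are not None),
    minimising the maximum edge length along each path (f_max cost)."""
    n = len(points)
    cost = [math.inf] * n      # best path cost found so far per sample
    out = list(labels)         # propagated labels (filled in during search)
    done = [False] * n
    heap = []
    # Labeled samples are roots: their path cost is zero.
    for i, lab in enumerate(labels):
        if lab is not None:
            cost[i] = 0.0
            heapq.heappush(heap, (0.0, i))
    # Prim/Dijkstra-style growth with the f_max path cost.
    while heap:
        c, u = heapq.heappop(heap)
        if done[u]:
            continue           # stale heap entry
        done[u] = True
        for v in range(n):     # complete graph over all samples
            if done[v]:
                continue
            d = math.dist(points[u], points[v])
            new_cost = max(c, d)   # f_max: cost of a path is its longest edge
            if new_cost < cost[v]:
                cost[v] = new_cost
                out[v] = out[u]    # inherit the conquering root's label
                heapq.heappush(heap, (new_cost, v))
    return out
```

With two tight clusters and one labeled point per cluster, e.g. `opf_propagate([(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)], [0, None, 1, None])`, each unlabeled point inherits the label of its nearby root, illustrating why the approach needs no density parameter such as d_c: only relative edge lengths along competing paths matter.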
