Journal
IEEE TRANSACTIONS ON CYBERNETICS
Volume 52, Issue 6, Pages 5380-5393
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCYB.2020.3031610
Keywords
Decision tree; fuzzy k-nearest-neighbor method (FKNN); nearest neighbors; sparse representation/reconstruction
Funding
- National Natural Science Foundation of China [61772198, 61972181]
- Natural Science Foundation of Jiangsu Province [BK20191331]
- National First-Class Discipline Program of Light Industry and Engineering [LITE2018]
- University of Macao Talent Program
This study proposes A-FKNN, a novel FKNN-based classification method that learns an optimal k value for each testing sample, along with a faster variant, FA-FKNN. Experimental results show that both A-FKNN and FA-FKNN outperform the compared methods in classification accuracy, with FA-FKNN also achieving a shorter running time.
Due to its strong performance in handling uncertain and ambiguous data, the fuzzy k-nearest-neighbor method (FKNN) has achieved substantial success in a wide variety of applications. However, its classification performance deteriorates heavily if the number k of nearest neighbors is fixed unsuitably for each testing sample. This study questions the feasibility of using a single fixed k value for FKNN across all testing samples. A novel FKNN-based classification method, namely, the fuzzy KNN method with adaptive nearest neighbors (A-FKNN), is devised to learn a distinct optimal k value for each testing sample. In the training stage, after applying a sparse representation method to all training samples for reconstruction, A-FKNN learns the optimal k value for each training sample and builds a decision tree (namely, the A-FKNN tree) from all training samples with new labels (the learned optimal k values instead of the original labels), in which each leaf node stores the corresponding optimal k value. In the testing stage, A-FKNN identifies the optimal k value for each testing sample by searching the A-FKNN tree and then runs FKNN with that optimal k value. Moreover, a fast version of A-FKNN, namely, FA-FKNN, is designed by building the FA-FKNN decision tree, which stores the optimal k value together with only a subset of training samples in each leaf node. Experimental results on 32 UCI datasets demonstrate that both A-FKNN and FA-FKNN outperform the compared methods in terms of classification accuracy, and FA-FKNN has a shorter running time.
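To make the adaptive-k idea concrete, the following is a minimal sketch, not the authors' implementation: it implements a standard fuzzy KNN vote (Keller-style inverse-distance weighting), learns a per-training-sample optimal k via a leave-one-out search (taking the smallest k that classifies the sample correctly), and at test time looks up the optimal k of the nearest training sample. The paper instead builds a decision tree (the A-FKNN tree) over the learned k labels and omits the sparse-representation reconstruction step; the nearest-neighbor lookup here is only a stand-in for that tree search. All function names and the `k_max` parameter are illustrative assumptions.

```python
import numpy as np

def fknn_predict(X_train, y_train, x, k, m=2.0, exclude=None):
    """Fuzzy KNN vote for one query point x, with fuzzifier m."""
    d = np.linalg.norm(X_train - x, axis=1)
    if exclude is not None:
        d[exclude] = np.inf            # leave-one-out: ignore the sample itself
    idx = np.argsort(d)[:k]            # k nearest neighbors
    # Inverse-distance fuzzy weights; small epsilon guards against d == 0.
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + 1e-12)
    classes = np.unique(y_train)
    scores = np.array([w[y_train[idx] == c].sum() for c in classes])
    return classes[np.argmax(scores)]

def learn_optimal_k(X_train, y_train, k_max=9):
    """For each training sample, take the smallest k that classifies it
    correctly in a leave-one-out test (fallback: k = 1)."""
    ks = np.ones(len(X_train), dtype=int)
    for i in range(len(X_train)):
        for k in range(1, k_max + 1):
            pred = fknn_predict(X_train, y_train, X_train[i], k, exclude=i)
            if pred == y_train[i]:
                ks[i] = k
                break
    return ks

def adaptive_fknn_predict(X_train, y_train, ks, x):
    """Look up the optimal k of the nearest training sample, then run FKNN.
    (The paper searches the A-FKNN decision tree here instead.)"""
    nearest = int(np.argmin(np.linalg.norm(X_train - x, axis=1)))
    return fknn_predict(X_train, y_train, x, int(ks[nearest]))
```

FA-FKNN would additionally restrict the FKNN vote in the last step to the subset of training samples stored in the matched leaf node, which is what shortens its running time.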