Article

Fuzzy KNN Method With Adaptive Nearest Neighbors

Journal

IEEE TRANSACTIONS ON CYBERNETICS
Volume 52, Issue 6, Pages 5380-5393

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCYB.2020.3031610

Keywords

Decision tree; fuzzy k-nearest-neighbor method (FKNN); nearest neighbors; sparse representation/reconstruction

Funding

  1. National Natural Science Foundation of China [61772198, 61972181]
  2. Natural Science Foundation of Jiangsu Province [BK20191331]
  3. National First-Class Discipline Program of Light Industry and Engineering [LITE2018]
  4. University of Macao Talent Program

Abstract

This study proposes a novel classification method based on FKNN called A-FKNN that learns the optimal k value for each testing sample, and a faster version called FA-FKNN is designed. Experimental results show that both A-FKNN and FA-FKNN outperform other methods in terms of classification accuracy, with FA-FKNN having a shorter running time.
Owing to its strong performance on uncertain and ambiguous data, the fuzzy k-nearest-neighbor method (FKNN) has achieved substantial success in a wide variety of applications. However, its classification performance can deteriorate heavily if the number k of nearest neighbors is fixed unsuitably for the testing samples. This study questions the feasibility of using a single fixed k value for all testing samples in FKNN. A novel FKNN-based classification method, namely, the fuzzy KNN method with adaptive nearest neighbors (A-FKNN), is devised to learn a distinct optimal k value for each testing sample. In the training stage, after applying a sparse representation method to all training samples for reconstruction, A-FKNN learns the optimal k value for each training sample and builds a decision tree (the A-FKNN tree) from all training samples with new labels (the learned optimal k values instead of the original class labels), in which each leaf node stores the corresponding optimal k value. In the testing stage, A-FKNN identifies the optimal k value for each testing sample by searching the A-FKNN tree and then runs FKNN with that k value. Moreover, a fast version of A-FKNN, namely, FA-FKNN, is designed by building the FA-FKNN decision tree, which stores the optimal k value together with only a subset of the training samples in each leaf node. Experimental results on 32 UCI datasets demonstrate that both A-FKNN and FA-FKNN outperform the compared methods in terms of classification accuracy, and FA-FKNN has a shorter running time.
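The adaptive-k pipeline described above (learn a per-sample optimal k, relabel the training set with those k values, fit a decision tree over them, and classify each test sample with its predicted k) can be sketched as follows. This is a minimal illustration assuming scikit-learn, not the authors' implementation: it substitutes crisp KNN for fuzzy memberships and a leave-one-out correctness check for the paper's sparse-representation-based k selection; all names and parameters are illustrative.

```python
# Sketch of the adaptive-k idea behind A-FKNN (illustrative, not the paper's
# exact algorithm). Assumes scikit-learn; uses crisp KNN instead of fuzzy
# memberships, and leave-one-out correctness instead of the authors'
# sparse-representation step for choosing each training sample's optimal k.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

candidate_ks = [1, 3, 5, 7, 9]

# Training stage, step 1: for each training sample, take the smallest k that
# classifies it correctly when it is held out (stand-in for the paper's
# sparse-representation-based selection of the optimal k).
optimal_k = np.empty(len(X_tr), dtype=int)
for i in range(len(X_tr)):
    mask = np.arange(len(X_tr)) != i
    best = candidate_ks[-1]
    for k in candidate_ks:
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr[mask], y_tr[mask])
        if knn.predict(X_tr[i:i + 1])[0] == y_tr[i]:
            best = k
            break
    optimal_k[i] = best

# Training stage, step 2: build a decision tree over the training samples
# relabeled with their optimal k values (the role of the "A-FKNN tree").
k_tree = DecisionTreeClassifier(random_state=0).fit(X_tr, optimal_k)

# Testing stage: look up the predicted k for each test sample and run KNN
# with that k value.
preds = []
for x in X_te:
    k = int(k_tree.predict(x.reshape(1, -1))[0])
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    preds.append(knn.predict(x.reshape(1, -1))[0])

accuracy = float(np.mean(np.array(preds) == y_te))
print(f"adaptive-k accuracy: {accuracy:.3f}")
```

The FA-FKNN variant would additionally restrict each leaf of the k-tree to a subset of training samples, so the final KNN search at test time runs over fewer points.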

