Article

Classifying With Adaptive Hyper-Spheres: An Incremental Classifier Based on Competitive Learning

Journal

IEEE Transactions on Systems, Man, and Cybernetics: Systems
Volume 50, Issue 4, Pages 1218-1229

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TSMC.2017.2761360

Keywords

Kernel; Heuristic algorithms; Training; Data models; Adaptation models; Subspace constraints; Neurons; Adaptive algorithms; Nyström method; pattern clustering; self-organizing feature maps (SOFMs)

Funding

  1. National Natural Science Foundation of China [71325001, 71771037]

Abstract

Datasets today are often dynamic, and the patterns in them change over time. Instances with different labels are intertwined and often linearly inseparable, which brings new challenges to traditional learning algorithms. This paper proposes the adaptive hyper-sphere classifier (AdaHS), an adaptive incremental classifier, and its kernelized version, Nys-AdaHS. The classifier incorporates competitive training with a border zone. With an adaptive hidden layer and tunable hyper-sphere radii, AdaHS has the strong local-learning capability of instance-based algorithms while avoiding their slow search speed and excessive memory consumption. Experiments showed that AdaHS is robust, adaptive, and highly accurate. It is especially well suited to dynamic data in which patterns change, decision boundaries are complicated, and instances with the same label can be clustered spherically.
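
The abstract only sketches the mechanism, so the block below gives a minimal, hypothetical Python sketch of an incremental hyper-sphere classifier trained by competitive learning, in the spirit of AdaHS. It is not the algorithm from the paper: the class name SphereClassifierSketch, the update rules, the learning rate eta, the initial radius r0, and the border-zone width margin are all assumptions introduced for illustration.

import numpy as np

# Hypothetical sketch only: parameter names and update rules are assumptions,
# not the AdaHS algorithm as published.
class SphereClassifierSketch:
    def __init__(self, r0=1.0, eta=0.1, margin=0.2):
        self.r0 = r0          # initial radius of a newly created sphere (assumed)
        self.eta = eta        # learning rate for moving a winning center (assumed)
        self.margin = margin  # width of the border zone around a sphere (assumed)
        self.centers, self.radii, self.labels = [], [], []

    def _nearest(self, x):
        # Competitive step: index of the sphere whose surface is closest to x.
        gaps = [np.linalg.norm(x - c) - r for c, r in zip(self.centers, self.radii)]
        i = int(np.argmin(gaps))
        return i, gaps[i]

    def partial_fit(self, x, y):
        x = np.asarray(x, dtype=float)
        if not self.centers:
            self._spawn(x, y)
            return self
        i, gap = self._nearest(x)
        if self.labels[i] == y and gap <= self.margin:
            # Correct winner inside (or near) its border zone:
            # pull the center toward x and grow the radius if x falls outside.
            self.centers[i] += self.eta * (x - self.centers[i])
            self.radii[i] = max(self.radii[i], np.linalg.norm(x - self.centers[i]))
        elif self.labels[i] != y and gap <= 0.0:
            # A wrong-label sphere contains x: shrink it and spawn a sphere for x.
            self.radii[i] = max(0.0, np.linalg.norm(x - self.centers[i]) - self.margin)
            self._spawn(x, y)
        else:
            # No suitable sphere nearby: grow the model incrementally.
            self._spawn(x, y)
        return self

    def predict(self, x):
        i, _ = self._nearest(np.asarray(x, dtype=float))
        return self.labels[i]

    def _spawn(self, x, y):
        self.centers.append(x.copy())
        self.radii.append(self.r0)
        self.labels.append(y)

One plausible (again assumed) way to mirror the kernelized Nys-AdaHS variant is to map inputs through a Nyström feature approximation and run the same sphere classifier in that feature space; scikit-learn's Nystroem transformer is used here purely for illustration and may not match the paper's construction.

from sklearn.kernel_approximation import Nystroem

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = (X[:, 0] > 0).astype(int)        # toy labels for the demo

phi = Nystroem(kernel="rbf", gamma=0.5, n_components=10, random_state=0)
Z = phi.fit_transform(X)             # approximate RBF feature map
clf = SphereClassifierSketch()
for zi, yi in zip(Z, y):
    clf.partial_fit(zi, yi)
print(clf.predict(phi.transform([[1.5, 0.0]])[0]))   # classify a new point in the same space

Storing only sphere centers, radii, and labels, rather than every training instance, is what lets this family of methods keep the local flexibility of instance-based learning while avoiding the search cost and memory footprint the abstract mentions.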
