Journal
PATTERN RECOGNITION
Volume 136, Issue -, Pages -
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2022.109183
Keywords
Nearest neighbors; Classification; Hashing; Windowing operation
Abstract
The K-Nearest Neighbor (KNN) algorithm plays a significant role in many fields of data science and machine learning. Most variants of the KNN algorithm involve distance computations and a parameter (K) that specifies the required number of neighbors. Recent research on distance computations and on finding the optimal value of K has made neighborhood extraction a slow process. This paper presents a fast geometrical approach for neighborhood extraction from multi-dimensional data. Instead of computing distances, the proposed algorithm creates a geometrical shape based on the number of features in the data; this shape encompasses the reference data point and its neighboring points. The proposed algorithm's time efficiency, classification performance, and hashing performance are evaluated and compared with existing state-of-the-art algorithms.
(c) 2022 Elsevier Ltd. All rights reserved.
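The abstract describes replacing per-pair distance computations with a geometrical shape (a "windowing operation", per the keywords) that encloses the reference point and its neighbors. A minimal sketch of one plausible reading of this idea is an axis-aligned hypercube window that is widened until it contains at least K points, so membership is decided by per-axis comparisons rather than a distance metric. The function name, initial width, and growth schedule below are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def hypercube_neighbors(data, ref, k, half_width=0.1, grow=2.0):
    """Return indices of at least k points of `data` lying inside an
    axis-aligned hypercube window centered on `ref`, widening the
    window until enough points fall in. Hypothetical sketch only."""
    r = half_width
    while True:
        # A point lies inside the hypercube iff every coordinate
        # differs from ref by at most r -- per-axis comparisons,
        # no distance computation or ranking involved.
        inside = np.all(np.abs(data - ref) <= r, axis=1)
        idx = np.flatnonzero(inside)
        if len(idx) >= k:
            # Returns the first k window members (by index order),
            # not a distance-sorted list.
            return idx[:k]
        r *= grow  # widen the window and retry

rng = np.random.default_rng(0)
X = rng.random((200, 3))          # 200 points, 3 features
neigh = hypercube_neighbors(X, X[0], k=5)
```

Because the window test is a vectorized comparison per feature, its cost is linear in the number of points and features, with no square roots or pairwise distance matrix; the trade-off is that the returned set is a neighborhood, not an exact distance ranking.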