Journal
PATTERN RECOGNITION
Volume 136, Issue -, Pages -
Publisher: ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2022.109183
Keywords
Nearest neighbors; Classification; Hashing; Windowing operation
Abstract
The K-Nearest Neighbor (KNN) algorithm plays a significant role in many areas of data science and machine learning. Most KNN variants involve distance computations and a parameter K that specifies the required number of neighbors. Recent research on distance computations and on finding the optimal value of K has made neighborhood extraction a slow process. This work presents a fast geometrical approach for neighborhood extraction from multi-dimensional data. Instead of computing distances, the proposed algorithm constructs a geometrical shape, determined by the number of data features, that encloses the reference data point and its neighboring points. The proposed algorithm's runtime, classification, and hashing performance are evaluated and compared with existing state-of-the-art algorithms. (c) 2022 Elsevier Ltd. All rights reserved.
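The abstract's core idea (a "windowing operation" that replaces per-point distance computations with a geometrical shape around the reference point) can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's actual method: it uses an axis-aligned hypercube as the geometrical shape, with a hypothetical half-width parameter `r` that the paper does not specify.

```python
# Illustrative sketch only: neighborhood extraction via a geometrical window
# instead of explicit distance computations. The shape here is an axis-aligned
# hypercube (one interval per feature) centered on the query point; the paper's
# actual shape and parameters may differ.

def window_neighbors(data, query, r):
    """Return indices of points whose every coordinate lies within r of query.

    A point is a neighbor iff it falls inside the hypercube
    [q_j - r, q_j + r] for every feature j -- no distances are computed.
    """
    neighbors = []
    for i, point in enumerate(data):
        if all(abs(p - q) <= r for p, q in zip(point, query)):
            neighbors.append(i)
    return neighbors

# Small 2-D example: the window is a square of side 2*r around the query.
points = [(0.0, 0.0), (0.2, -0.1), (1.5, 0.3), (0.4, 0.4)]
print(window_neighbors(points, (0.0, 0.0), 0.5))  # -> [0, 1, 3]
```

The membership test is a per-feature comparison rather than a distance evaluation, which is the kind of cheap geometric check the abstract contrasts with conventional distance-based KNN variants.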