4.7 Review

Conceptual and empirical comparison of dimensionality reduction algorithms (PCA, KPCA, LDA, MDS, SVD, LLE, ISOMAP, LE, ICA, t-SNE)

Journal

COMPUTER SCIENCE REVIEW
Volume 40, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.cosrev.2021.100378

Keywords

Dimension reduction; Optimal set of features; Data quality; High-dimensional datasets; Correlation metrics; Classification accuracy; Run-time

Abstract

Feature extraction algorithms (FEAs) aim to mitigate the curse of dimensionality, which can render machine learning algorithms ineffective. Our study conceptually and empirically explores the most representative FEAs. First, we review the theoretical background of FEAs from different categories (linear vs. nonlinear, supervised vs. unsupervised, random-projection-based vs. manifold-based), present their algorithms, and compare the methods conceptually. Second, on three challenging binary and multi-class datasets, we determine the optimal sets of new features and assess the quality of the various transformed feature spaces in terms of statistical significance and power analysis, and the efficacy of each FEA in terms of classification accuracy and run-time. © 2021 Elsevier Inc. All rights reserved.
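As a rough illustration of this kind of comparison, the sketch below runs scikit-learn counterparts of the ten surveyed algorithms and reports a simple accuracy and run-time comparison. It is not the authors' experimental protocol: the dataset (a subset of scikit-learn's digits), the classifier (k-nearest neighbours), the target dimensionality (2), and the embed-before-split shortcut are all assumptions made only for illustration.

import time

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, KernelPCA, TruncatedSVD, FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.manifold import MDS, LocallyLinearEmbedding, Isomap, SpectralEmbedding, TSNE
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

# Illustrative data only: a 500-sample subset of the scikit-learn digits dataset,
# standardized so that distance-based methods are not dominated by feature scale.
X, y = load_digits(return_X_y=True)
X, y = X[:500], y[:500]
X = StandardScaler().fit_transform(X)

n_components = 2  # assumed target dimensionality, chosen only for this sketch

# scikit-learn counterparts of the ten surveyed algorithms. t-SNE, MDS, and LE
# provide no out-of-sample transform, so for simplicity every method embeds the
# full set before the train/test split (fine for a rough comparison, not for a
# rigorous evaluation).
extractors = {
    "PCA": PCA(n_components=n_components),
    "KPCA": KernelPCA(n_components=n_components, kernel="rbf"),
    "LDA": LinearDiscriminantAnalysis(n_components=n_components),
    "MDS": MDS(n_components=n_components),
    "SVD": TruncatedSVD(n_components=n_components),
    "LLE": LocallyLinearEmbedding(n_components=n_components),
    "ISOMAP": Isomap(n_components=n_components),
    "LE": SpectralEmbedding(n_components=n_components),
    "ICA": FastICA(n_components=n_components, max_iter=1000),
    "t-SNE": TSNE(n_components=n_components),
}

for name, extractor in extractors.items():
    start = time.perf_counter()
    if name == "LDA":  # LDA is supervised and requires the class labels
        Z = extractor.fit_transform(X, y)
    else:
        Z = extractor.fit_transform(X)
    elapsed = time.perf_counter() - start

    # Efficacy proxy: accuracy of a k-NN classifier on the 2-D embedding.
    Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, test_size=0.3, random_state=0)
    acc = KNeighborsClassifier().fit(Z_tr, y_tr).score(Z_te, y_te)
    print(f"{name:7s} accuracy={acc:.3f}  fit+transform={elapsed:.2f}s")

A proper replication would instead fit each extractor on training data only, tune the number of retained components per method, and add the statistical significance and power analysis described in the abstract.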

Ratings

Primary rating: 4.7 (insufficient number of ratings)
