4.5 Article

Reductive and effective discriminative information-based nonparallel support vector machine

Journal

APPLIED INTELLIGENCE
Volume 52, Issue 7, Pages 8259-8278

Publisher

SPRINGER
DOI: 10.1007/s10489-021-02874-6

Keywords

Twin support vector machine; Discriminative information; K-nearest neighbors; Support vector; Least squares

Funding

  1. National Natural Science Foundation of China [12071475]
  2. Fundamental Research Funds for the Central Universities [BLX201928]

Abstract

In this paper, to improve the performance of the discriminative information-based nonparallel support vector machine (DINPSVM), we propose a novel algorithm called the reductive and effective discriminative information-based nonparallel support vector machine (REDINPSVM). First, we introduce a regularization term to implement the structural risk minimization principle, which embodies the essence of statistical learning theory and thus enhances the generalization ability of the classifier. Second, we apply the k-nearest neighbor method to eliminate redundant constraints, which reduces the time complexity. Finally, to accelerate the computation, we use the least squares technique, so that training reduces to solving two systems of linear equations. Comprehensive experimental results on twenty-three UCI benchmark datasets and six image datasets demonstrate the validity of the proposed method.
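The abstract compresses three technical steps: a regularization term for structural risk minimization, k-nearest-neighbor pruning of redundant constraints, and least-squares solutions of two linear systems. The sketch below illustrates these ideas in a generic least-squares twin-SVM setting only; it is not the authors' REDINPSVM implementation, and the function names, the pruning heuristic, and the parameters c1, c2, and reg are illustrative assumptions.

# Minimal sketch (not the authors' code) of a least-squares twin-SVM-style
# classifier with a regularization term and KNN-based constraint pruning.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_prune(X_own, X_other, k=5):
    # Keep only the opposite-class points that appear among the k nearest
    # neighbors of some own-class point; the rest are treated as redundant
    # constraints and dropped (a simplified stand-in for the paper's step).
    nn = NearestNeighbors(n_neighbors=min(k, len(X_other))).fit(X_other)
    _, idx = nn.kneighbors(X_own)
    return X_other[np.unique(idx.ravel())]

def fit_twin_planes(A, B, c1=1.0, c2=1.0, reg=1e-4):
    # Solve two regularized least-squares systems for the nonparallel
    # hyperplanes w1'x + b1 = 0 (close to class A) and w2'x + b2 = 0
    # (close to class B), following the standard LSTSVM closed forms.
    E = np.hstack([A, np.ones((A.shape[0], 1))])
    F = np.hstack([B, np.ones((B.shape[0], 1))])
    n = E.shape[1]
    R = reg * np.eye(n)                      # regularization term
    z1 = -np.linalg.solve(E.T @ E / c1 + F.T @ F + R, F.T @ np.ones(F.shape[0]))
    z2 =  np.linalg.solve(F.T @ F / c2 + E.T @ E + R, E.T @ np.ones(E.shape[0]))
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def predict(X, plane1, plane2):
    # Assign each sample to the class whose hyperplane is nearer.
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

# Example usage on synthetic two-class data.
rng = np.random.default_rng(0)
A = rng.normal(loc=+1.0, size=(60, 2))       # class +1 samples
B = rng.normal(loc=-1.0, size=(60, 2))       # class -1 samples
p1, p2 = fit_twin_planes(A, knn_prune(A, B, k=5))
print(predict(A, p1, p2).mean(), predict(B, p1, p2).mean())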

