Article

Multiview Learning With Generalized Eigenvalue Proximal Support Vector Machines

Journal

IEEE Transactions on Cybernetics
Volume 49, Issue 2, Pages 688-697

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TCYB.2017.2786719

Keywords

Co-regularization; eigenvalue problem; generalized eigenvalue problem; generalized eigenvalue proximal support vector machines (GEPSVMs); multiview learning (MVL)

Funding

  1. National Natural Science Foundation of China [61673179, 61370175]
  2. Natural Science Foundation of Zhejiang Province [LQ18F020001]

Abstract

Generalized eigenvalue proximal support vector machines (GEPSVMs) are a simple and effective binary classification method in which each of two nonparallel hyperplanes is closest to one of the two classes and as far as possible from the other; the hyperplanes are obtained by solving a pair of generalized eigenvalue problems. Multiview learning exploits multiple feature sets (views) of the same data to improve learning performance. In this paper, we propose multiview GEPSVMs (MvGSVMs), which effectively combine two views by introducing a multiview co-regularization term that maximizes the consensus between the views, and transform the resulting complicated optimization problem into a simple generalized eigenvalue problem. We also propose multiview improved GEPSVMs (MvIGSVMs), which replace the ratio used in MvGSVMs with a difference when measuring the gap between the two class-to-hyperplane distances, leading to a simpler standard eigenvalue problem. Linear MvGSVMs and MvIGSVMs are extended to the nonlinear case via the kernel trick. Experimental results on multiple data sets demonstrate the effectiveness of the proposed approaches.
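For context, the single-view GEPSVM baseline that the paper extends can be sketched as follows. This is a minimal illustration, not the paper's multiview method: the function names (`gepsvm_planes`, `predict`) and the Tikhonov ridge `delta` are assumptions for the sketch, and the ridge is also added to the second matrix purely for numerical stability of the generalized eigensolver.

```python
import numpy as np
from scipy.linalg import eigh

def gepsvm_planes(A, B, delta=1e-3):
    """Fit the two nonparallel GEPSVM hyperplanes for classes A and B.

    Each hyperplane w.x + b = 0 is encoded as z = [w; b]; it minimizes the
    (regularized) ratio of squared distances to its own class over squared
    distances to the other class, which is a generalized eigenvalue problem.
    """
    # Augment each class matrix with a bias column of ones.
    E = np.hstack([A, np.ones((A.shape[0], 1))])
    F = np.hstack([B, np.ones((B.shape[0], 1))])
    # Tikhonov-regularized scatter matrices; the ridge on H is an assumption
    # of this sketch, keeping the eigensolver's second matrix positive definite.
    G = E.T @ E + delta * np.eye(E.shape[1])
    H = F.T @ F + delta * np.eye(F.shape[1])
    # Plane for class A: eigenvector of G z = lambda H z with smallest eigenvalue
    # (eigh returns eigenvalues in ascending order).
    _, vecs = eigh(G, H)
    z1 = vecs[:, 0]
    # Plane for class B: swap the roles of the two classes.
    _, vecs = eigh(H, G)
    z2 = vecs[:, 0]
    return z1, z2

def predict(X, z1, z2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    Xe = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xe @ z1) / np.linalg.norm(z1[:-1])
    d2 = np.abs(Xe @ z2) / np.linalg.norm(z2[:-1])
    return np.where(d1 <= d2, 0, 1)
```

The multiview extensions in the paper add a co-regularization term coupling the two views' solutions, but still reduce to eigenvalue problems of the same flavor.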

