Article

Generalized Variant Support Vector Machine

Journal

IEEE Transactions on Systems, Man, and Cybernetics: Systems
Volume 51, Issue 5, Pages 2798-2809

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TSMC.2019.2917019

Keywords

Convex programming; exponential convergence; generalized VSVM (GVSVM); recurrent neural network (RNN); support vector machine (SVM)


This paper introduces the generalized VSVM (GVSVM), whose optimal solution approaches that of the standard SVM as the parameter t in its objective function grows. Additionally, an efficient neural network is proposed to solve the dual problem of the GVSVM, with fast convergence and low complexity.
With the advancement of information technology, datasets with enormous amounts of data have become available. The classification task on these datasets becomes more time- and memory-consuming as the number of data points increases. The support vector machine (SVM), arguably the most popular classification technique, performs disappointingly on large datasets due to its constrained optimization problem. To address this challenge, the variant SVM (VSVM), which includes the term (1/2)b^2 in its primal objective function (where b is the bias of the desired hyperplane), has been utilized. The VSVM has been solved with different optimization techniques in a more time- and memory-efficient fashion. However, there is no guarantee that its optimal solution coincides with that of the standard SVM. In this paper, we introduce the generalized VSVM (GVSVM), which includes the term (1/(2t))b^2 in its primal objective function, for a fixed positive scalar t. Further, we present thorough theoretical insights indicating that the optimal solution of the GVSVM tends to the optimal solution of the standard SVM as t -> infinity. One vital corollary is a closed-form formula for the bias term of the standard SVM. Such a formula obviates the need to approximate it, which has been the modus operandi to date. An efficient neural network is then proposed to solve the GVSVM dual problem; it is asymptotically stable in the sense of Lyapunov and converges globally and exponentially to the exact solution of the GVSVM. The proposed neural network has a simpler architecture and requires fewer computations per iteration than existing neural solutions. Experiments confirm the efficacy of the proposed recurrent neural network and the proximity of the GVSVM and standard SVM solutions for larger values of t.
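To make the core idea concrete, the sketch below trains a soft-margin GVSVM primal objective, 0.5*||w||^2 + (1/(2t))*b^2 + C*sum of hinge losses, by plain subgradient descent on a toy dataset. This is a hypothetical minimal illustration, not the authors' method: the paper solves the GVSVM *dual* with a recurrent neural network, and the function name, toy data, and hyperparameters here are all assumptions for demonstration. The only GVSVM-specific detail is the b/t term in the bias gradient, which vanishes as t grows, recovering the standard SVM objective.

```python
import numpy as np

def train_gvsvm(X, y, t=100.0, C=1.0, lr=0.01, epochs=2000):
    """Subgradient descent on the GVSVM primal:
    0.5*||w||^2 + (1/(2t))*b^2 + C * sum_i max(0, 1 - y_i (w.x_i + b)).
    (Illustrative sketch; the paper solves the dual via an RNN.)"""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1  # points violating the margin contribute hinge subgradients
        grad_w = w - C * (y[active, None] * X[active]).sum(axis=0)
        grad_b = b / t - C * y[active].sum()  # b/t term is the GVSVM modification
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy linearly separable data (assumed for illustration)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = train_gvsvm(X, y)
print(np.sign(X @ w + b))  # predictions on the training points
```

Increasing t shrinks the (1/(2t))*b^2 penalty toward zero, so the learned hyperplane approaches the standard soft-margin SVM solution, matching the paper's limiting result.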


