Journal
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Volume 32, Issue 3, Pages 1400-1406
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2020.2980559
Keywords
Support vector machines; Optimization; Kernel; Mixture models; Learning systems; Euclidean distance; Data mining; Classification; kernel; machine learning; neural networks
Funding
- CNPq, Brazil [150254/2016-4]
This brief presents a geometrical approach for obtaining large margin classifiers. The method explores the geometrical properties of the data set through the structure of a Gabriel graph, which represents pattern relations according to a given distance metric, such as the Euclidean distance. Once the graph is generated, geometrical support vectors (SVs), analogous to the SVs of support vector machines (SVMs), are obtained in order to yield the final large margin solution from a Gaussian mixture model. Experiments with 20 data sets show that the solutions obtained with the proposed method are statistically equivalent to those obtained with SVMs. However, the present method does not require explicit optimization and can also be extended to large data sets using the cascade SVM concept.
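The Gabriel graph underlying the method has a simple definition: two points are neighbors if and only if no third point lies strictly inside the ball whose diameter is the segment joining them. As a minimal illustration (not the authors' implementation; the function name and the brute-force O(n^3) test are assumptions for clarity), the edge test can be sketched as:

```python
import numpy as np

def gabriel_edges(X):
    """Return the Gabriel-graph edges of the points in X (shape (n, d)).

    Points i and j are Gabriel neighbors iff no other point falls strictly
    inside the ball whose diameter is the segment (X[i], X[j]).
    """
    n = len(X)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            mid = (X[i] + X[j]) / 2.0                  # center of the diametral ball
            r2 = np.sum((X[i] - X[j]) ** 2) / 4.0      # squared radius
            # Edge exists unless some third point lies strictly inside the ball.
            blocked = any(
                np.sum((X[k] - mid) ** 2) < r2
                for k in range(n) if k != i and k != j
            )
            if not blocked:
                edges.append((i, j))
    return edges
```

In the approach described above, edges of this graph that connect points of opposite classes mark the class border, and their endpoints play the role of the geometrical SVs.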