Journal
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Volume 26, Issue 10, Pages 2357-2369
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2014.2382123
Keywords
Additive kernels; large-scale classification; linear regression; Nyström approximation; SVM
Funding
- National Natural Science Foundation of China [61422203]
- Fundamental Research Funds for the Central Universities [20620140498]
- Collaborative Innovation Center of Novel Software Technology and Industrialization
Abstract
For large-scale classification tasks, especially image classification, additive kernels have achieved state-of-the-art accuracy. However, even with the recent development of fast algorithms, learning speed and the ability to handle large-scale tasks remain open problems. This paper proposes algorithms for large-scale support vector machine (SVM) classification and other tasks using additive kernels. First, a linear regression SVM framework for general nonlinear kernels is proposed, which uses linear regression to approximate gradient computations in the learning process. Second, we propose a power mean SVM (PmSVM) algorithm for all additive kernels using nonsymmetric explanatory variable functions. This nonsymmetric kernel approximation has advantages over existing methods: 1) it does not require closed-form Fourier transforms and 2) it does not require extra training for the approximation. Compared on benchmark large-scale classification data sets with millions of examples or millions of dense feature dimensions, PmSVM achieved the highest learning speed and highest accuracy among recent algorithms in most cases.
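The abstract does not reproduce any formulas, but the additive kernel family that PmSVM targets can be sketched as follows. An additive kernel decomposes as a sum of per-dimension terms, K(x, y) = Σ_i k(x_i, y_i), and the power mean kernel takes k(x_i, y_i) to be the generalized (power) mean of the two coordinates. The function name and example values below are illustrative, not from the paper; the special cases shown (p = -1 giving the additive χ² kernel, p → -∞ approaching the histogram intersection kernel) are standard properties of power means on non-negative features.

```python
import numpy as np

def power_mean_kernel(x, y, p):
    """Additive power mean kernel: K(x, y) = sum_i M_p(x_i, y_i),
    where M_p is the power mean of the two coordinates.
    Features are assumed non-negative (e.g. histogram features)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if p == 0:  # limit case p -> 0: the geometric mean
        return float(np.sum(np.sqrt(x * y)))
    return float(np.sum(((x ** p + y ** p) / 2.0) ** (1.0 / p)))

x = np.array([0.2, 0.5, 0.3])
y = np.array([0.4, 0.1, 0.5])

# p = -1: the harmonic mean 2*x_i*y_i/(x_i + y_i),
# i.e. the additive chi-square kernel.
chi2 = np.sum(2 * x * y / (x + y))
assert abs(power_mean_kernel(x, y, -1) - chi2) < 1e-12

# p -> -inf: the power mean tends to min(x_i, y_i),
# i.e. the histogram intersection kernel. A large negative p
# already gets close (up to a small 2**(-1/p) factor).
hik = np.sum(np.minimum(x, y))
assert abs(power_mean_kernel(x, y, -100) - hik) < 1e-2
```

Because the kernel is a sum over dimensions, each coordinate can be approximated independently; this per-dimension structure is what makes the nonsymmetric explanatory-variable approximation described in the abstract possible without closed-form Fourier transforms.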