Article

Large-Scale Minimal Complexity Machines Using Explicit Feature Maps

Journal

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TSMC.2017.2694321

Keywords

Fastfood; minimal complexity machines (MCMs); random Fourier features; stochastic gradient descent; Vapnik-Chervonenkis (VC) dimension

Funding

  1. Indian Institute of Technology Delhi (IIT Delhi) through the Microsoft Chair Professor Project [MI01158]

Abstract

Minimal complexity machines (MCMs) are a class of hyperplane classifiers that minimize a tight bound on the Vapnik-Chervonenkis (VC) dimension. MCMs can be used both in the input space and in a higher-dimensional feature space via the kernel trick. MCMs tend to produce very sparse solutions in comparison to support vector machines, often using three to ten times fewer support vectors. However, large datasets present significant challenges in terms of storage and operations on the kernel matrix. In this paper, we present a stochastic subgradient descent solver for large-scale machine learning with the MCM. The proposed approach uses an explicit feature-map-based approximation of the kernel to improve the scalability of the algorithm.
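The approach in the abstract, replacing the kernel with an explicit feature map and training by stochastic subgradient descent, can be sketched as follows. This is an illustrative sketch only: it uses random Fourier features (one of the maps named in the keywords) to approximate an RBF kernel, and an L2-regularized hinge loss as a stand-in training objective, since the actual MCM objective (a bound on the VC dimension) is not given here. All function names and parameter values are assumptions for the demo.

```python
import numpy as np

def rff_map(X, W, b):
    """Explicit random Fourier feature map approximating an RBF kernel:
    z(x) = sqrt(2/D) * cos(W x + b), so z(x).z(x') ~ k(x, x')."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def train_sgd_hinge(Z, y, epochs=20, lr=0.1, lam=1e-3, seed=0):
    """Stochastic subgradient descent on an L2-regularized hinge loss.
    NOTE: stand-in objective; the MCM instead minimizes a bound on the
    VC dimension, but the per-sample update structure is analogous."""
    rng = np.random.default_rng(seed)
    n, d = Z.shape
    w = np.zeros(d)
    bias = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (Z[i] @ w + bias)
            grad_w = lam * w            # subgradient of the regularizer
            if margin < 1:              # hinge is active: add loss subgradient
                grad_w = grad_w - y[i] * Z[i]
                bias += lr * y[i]
            w -= lr * grad_w
    return w, bias

# Synthetic demo: an XOR-like pattern that is not linearly separable
# in the input space but is separable in the random feature space.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)

D = 100                                  # number of random features
gamma = 1.0                              # assumed RBF bandwidth parameter
W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, 2))
b = rng.uniform(0.0, 2 * np.pi, size=D)

Z = rff_map(X, W, b)                     # explicit D-dimensional feature map
w, bias = train_sgd_hinge(Z, y)
acc = np.mean(np.sign(Z @ w + bias) == y)
```

Because the feature map is explicit, training touches only the n-by-D matrix Z rather than the full n-by-n kernel matrix, which is what makes the stochastic solver scale to large datasets; the Fastfood transform mentioned in the keywords further reduces the cost of computing Z itself.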

