Journal
ANNALS OF STATISTICS
Volume 49, Issue 6, Pages 3153-3180
Publisher
INST MATHEMATICAL STATISTICS-IMS
DOI: 10.1214/21-AOS2078
Keywords
Curse of dimensionality; support vector machines; learning rates; regression; classification
Funding
- International Max Planck Research School for Intelligent Systems (IMPRS-IS)
- German Research Foundation under DFG [STE 1074/4-1]
Abstract
We derive improved regression and classification rates for support vector machines using Gaussian kernels under the assumption that the data has some low-dimensional intrinsic structure that is described by the box-counting dimension. Under some standard regularity assumptions for regression and classification, we prove learning rates in which the dimension of the ambient space is replaced by the box-counting dimension of the support of the data generating distribution. In the regression case, our rates are in some cases minimax optimal up to logarithmic factors, whereas in the classification case our rates are minimax optimal up to logarithmic factors in a certain range of our assumptions and otherwise of the form of the best known rates. Furthermore, we show that a training validation approach for choosing the hyperparameters of an SVM in a data-dependent way achieves the same rates adaptively, that is, without any knowledge of the data generating distribution.