Article

Deterministic error bounds for kernel-based learning techniques under bounded noise

Journal

AUTOMATICA
Volume 134, Issue -, Pages -

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.automatica.2021.109896

Keywords

Deterministic error bounds; Generalization error; Kernel ridge regression; Support vector machines

Funding

  1. Swiss National Science Foundation under the RISK project (Risk Aware Data-Driven Demand Response) [200021 175627]
  2. CSEM's Data Program

Abstract
We consider the problem of reconstructing a function from a finite set of noise-corrupted samples. Two kernel algorithms are analyzed, namely kernel ridge regression and epsilon-support vector regression. By assuming that the ground-truth function belongs to the reproducing kernel Hilbert space of the chosen kernel and that the measurement noise affecting the dataset is bounded, we adopt an approximation-theory viewpoint to establish deterministic, finite-sample error bounds for the two models. Finally, we discuss their connection with Gaussian processes and provide two numerical examples. In establishing our inequalities, we hope to help bring the fields of non-parametric kernel learning and system identification for robust control closer to each other.
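
The following is a minimal sketch, not taken from the paper, of the two estimators the abstract analyzes: kernel ridge regression and epsilon-support vector regression, fitted to samples of a ground-truth function corrupted by bounded noise. It uses scikit-learn's KernelRidge and SVR; the RBF kernel, the hyperparameter values, the test function f, and the noise bound delta_bar are illustrative placeholders. The script reports an empirical worst-case error on a test grid; it does not evaluate the paper's deterministic error bounds.

```python
# Illustrative sketch (assumptions: scikit-learn available; kernel,
# hyperparameters, and ground-truth function chosen arbitrarily).
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Ground truth, assumed to lie in the RKHS of the chosen (RBF) kernel.
f = lambda x: np.sin(3 * x) * np.exp(-x**2)

# Bounded measurement noise: |delta_i| <= delta_bar, matching the
# bounded-noise assumption described in the abstract.
delta_bar = 0.1
X = rng.uniform(-2, 2, size=(30, 1))
y = f(X).ravel() + rng.uniform(-delta_bar, delta_bar, size=30)

# Kernel ridge regression with an RBF kernel (alpha is the regularizer).
krr = KernelRidge(kernel="rbf", gamma=2.0, alpha=1e-3).fit(X, y)

# epsilon-SVR, with epsilon set to the noise bound so the insensitive
# tube absorbs the bounded noise.
svr = SVR(kernel="rbf", gamma=2.0, C=100.0, epsilon=delta_bar).fit(X, y)

# Empirical sup-norm error of each model on a dense test grid.
Xt = np.linspace(-2, 2, 200).reshape(-1, 1)
print("KRR max abs error:", np.max(np.abs(krr.predict(Xt) - f(Xt).ravel())))
print("SVR max abs error:", np.max(np.abs(svr.predict(Xt) - f(Xt).ravel())))
```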
