Journal
KNOWLEDGE AND INFORMATION SYSTEMS
Volume 63, Issue 5, Pages 1149-1172
Publisher
SPRINGER LONDON LTD
DOI: 10.1007/s10115-021-01554-8
Keywords
Robust regression; Support vector machine; Extreme learning machine; Iteratively reweighted least squares
Funding
- National Natural Science Foundation of China [11471010, 11271367]
This paper proposes a novel kernel-based regression method with improved noise robustness, replacing the traditional l_2-loss with an l_s-loss function. Theoretical analysis and experimental results confirm the good performance of this method.
Least squares kernel-based methods have been widely used in regression problems due to their simple implementation and good generalization performance. Among them, least squares support vector regression (LS-SVR) and the extreme learning machine (ELM) are popular techniques. However, noise sensitivity is a major bottleneck. To address this issue, a generalized loss function, called the l_s-loss, is proposed in this paper. With the support of this novel loss function, two kernel-based regressors are constructed by replacing the l_2-loss in LS-SVR and ELM with the proposed l_s-loss for better noise robustness. Important properties of the l_s-loss, including robustness, asymmetry, and asymptotic approximation behavior, are verified theoretically. Moreover, iteratively reweighted least squares (IRLS) is utilized to optimize the proposed methods and to interpret them from a weighted viewpoint. The convergence of the proposal is proved, and a detailed robustness analysis is given. Experiments on both artificial and benchmark datasets confirm the validity of the proposed methods.
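The IRLS strategy described in the abstract can be sketched in a few lines: each iteration solves a weighted least-squares kernel problem, with sample weights computed from the current residuals through a bounded robust weight function, so that large-residual (likely noisy) samples are progressively downweighted. The abstract does not give the closed form of the l_s-loss, so the sketch below substitutes a Welsch-type weight `exp(-(r/s)^2)` as a stand-in; the kernel, regularization parameter `lam`, and scale `s` are likewise illustrative assumptions, not the paper's settings.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample matrices X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def irls_kernel_regression(X, y, gamma=1.0, lam=1e-2, s=1.0, n_iter=20):
    """Robust kernel ridge regression via iteratively reweighted LS.

    Each pass solves the weighted regularized normal equations
    (W K + lam I) alpha = W y, then refreshes the sample weights W
    from the residuals with a bounded (Welsch-type) weight function --
    an assumed stand-in for the paper's l_s-loss.
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    w = np.ones(n)                      # start from ordinary least squares
    alpha = np.zeros(n)
    for _ in range(n_iter):
        W = np.diag(w)
        alpha = np.linalg.solve(W @ K + lam * np.eye(n), W @ y)
        r = y - K @ alpha               # residuals under the current model
        w = np.exp(-(r / s) ** 2)       # downweight large residuals
    return alpha

# Noisy sine data with a few gross outliers injected
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 80)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
y[::15] += 3.0                          # gross outliers
alpha = irls_kernel_regression(X, y)
pred = rbf_kernel(X, X) @ alpha
```

Because the outliers end up with weights near zero, the fitted curve tracks the underlying sine on the clean samples, which is precisely the weighted-viewpoint interpretation the abstract mentions.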