Journal
INFORMATION SCIENCES
Volume 547, Issue -, Pages 244-254
Publisher: ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.08.039
Keywords
Dimension reduction; Kernel method; Nonparametric quantile regression; Random projection
Funding
- Fundamental Research Funds for the Central Universities of China [JBK2001001, JBK1806002, JBK140507]
- National Social Science Fund of China [17BTJ025]
- Hong Kong RGC general research fund [11301718, 11300519]
- NSFC [11871411]
- City University of Hong Kong Shenzhen Research Institute
Abstract
Nonparametric quantile regression is a commonly used nonlinear quantile model. One general and popular approach is based on the use of kernels within a reproducing kernel Hilbert space (RKHS) framework, with smoothing spline estimation as a special case. However, when the sample size n is large, the computational burden is heavy. Motivated by recent advances in random projection for kernel (mean) ridge regression (KRR), we consider an m-dimensional random projection approach for kernel quantile regression (KQR) with m << n. We establish a theoretical result showing that the sketched KQR still achieves the minimax convergence rate when m is at least as large as the effective statistical dimension of the problem. Some Monte Carlo studies are carried out for illustration purposes. (C) 2020 Elsevier Inc. All rights reserved.