Journal
COMPUTATIONAL STATISTICS & DATA ANALYSIS
Volume 152, Issue -, Pages -
Publisher
ELSEVIER
DOI: 10.1016/j.csda.2020.107039
Keywords
Varying coefficient models; Sparsity; Structure learning; High dimensions; Reproducing kernel Hilbert space (RKHS)
Funding
- National Natural Science Foundation of China [11871277, 11829101]
- JSPS Kakenhi [18H03201, 18K19793, 20H00576]
- Japan Digital Design
- JST-CREST
Abstract
The partially varying coefficient model (PVCM) provides a useful class of tools for modeling complex data by combining constant and time-varying covariate effects. A natural question is how to decide which covariates correspond to constant coefficients and which to time-dependent coefficient functions. To handle this two-type structure selection problem in PVCM, existing methods rely either on a finite truncation of the coefficient functions or on a two-phase procedure that estimates the constant and functional parts separately. This paper attempts to provide a complete theoretical characterization of the estimation and structure selection issues of PVCM by proposing two new penalized methods for PVCM within a reproducing kernel Hilbert space (RKHS). The proposed strategy is partially motivated by the so-called Non-Constant Theorem for radial kernels, which ensures a unique and unified representation of each candidate component in the hypothesis space. Within a high-dimensional framework, minimax convergence rates for the prediction risk of the first method are established when each unknown time-dependent coefficient can be well approximated within a specified RKHS. On the other hand, under certain regularity conditions, the second proposed estimator is shown to identify the underlying structure correctly with high probability. Several simulated experiments examine the finite-sample performance of the proposed methods. (C) 2020 Elsevier B.V. All rights reserved.
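For readers unfamiliar with the model class, a PVCM writes the response as a sum of constant and index-dependent covariate effects, y_i = Σ_j α_j x_ij + Σ_k β_k(t_i) x_ik + ε_i. The following is a minimal simulation sketch of such a structure; the dimensions, coefficient values, and function choices are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5                      # sample size and number of covariates (illustrative)

t = rng.uniform(0.0, 1.0, size=n)  # index variable, e.g. observation time
X = rng.normal(size=(n, p))        # covariate matrix

# Assumed two-type structure: the first 2 covariates have constant
# coefficients, the remaining 3 have coefficients that vary with t.
const_coefs = np.array([1.5, -2.0])

def varying_coefs(t):
    """Stack the time-dependent coefficient functions beta_k(t) columnwise."""
    return np.column_stack([np.sin(2 * np.pi * t),
                            np.cos(2 * np.pi * t),
                            t ** 2])

# Response: constant part + time-varying part + Gaussian noise.
y = (X[:, :2] @ const_coefs
     + np.sum(X[:, 2:] * varying_coefs(t), axis=1)
     + 0.1 * rng.normal(size=n))

print(y.shape)  # → (200,)
```

The structure selection problem the abstract describes amounts to recovering, from (y, X, t) alone, which columns of X belong to the constant block and which to the varying block.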