4.1 Article

Sparse and debiased lasso estimation and inference for high-dimensional composite quantile regression with distributed data

Related references

Note: Only a subset of the references is listed.
Article Statistics & Probability

Communication-Efficient Accurate Statistical Estimation

Jianqing Fan et al.

Summary: This article presents two communication-efficient, statistically accurate estimators implemented through iterative algorithms for distributed optimization. The algorithms adapt to the similarity among the loss functions on the node machines and converge rapidly when each node machine has a large enough sample size. In typical statistical applications, the proposed methods achieve full statistical efficiency within a finite number of communication rounds.

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION (2023)

Article Statistics & Probability

Communication-efficient sparse composite quantile regression for distributed data

Yaohong Yang et al.

Summary: The composite quantile regression (CQR) estimator is a robust and efficient alternative to least squares in linear models and can be extended to sparse estimation through penalization. By constructing a penalized communication-efficient surrogate loss function, the worker machines only need to compute local gradients, while the central machine solves a regular penalized estimation problem (a schematic sketch of this communication pattern follows this entry). The performance of the proposed method is validated through simulations and an application to a real data set.

METRIKA (2023)
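
A minimal sketch of the communication pattern described in the summary above: each worker sends its local gradient at a pilot estimate, and the central machine minimizes its own loss shifted by the global-minus-local gradient difference, plus an l(1) penalty. The squared-error loss, the proximal-gradient solver, the function names, and all tuning values below are illustrative assumptions standing in for the paper's penalized smoothed CQR surrogate, not its exact construction.

import numpy as np

def local_gradient(X, y, beta):
    # Gradient of the local squared-error loss (1 / 2n) * ||y - X beta||^2,
    # used here as a smooth stand-in for a smoothed composite quantile loss.
    return X.T @ (X @ beta - y) / X.shape[0]

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def one_round(machines, beta_pilot, lam, step=0.1, n_iter=300):
    # One communication round: workers send only their local gradients at the
    # pilot estimate (p numbers each); the central machine (machines[0]) then
    # solves a regular l1-penalized problem by proximal gradient descent on
    # its own loss plus the gradient-correction term.
    X1, y1 = machines[0]
    grads = [local_gradient(X, y, beta_pilot) for X, y in machines]
    grad_shift = np.mean(grads, axis=0) - grads[0]
    beta = beta_pilot.copy()
    for _ in range(n_iter):
        g = local_gradient(X1, y1, beta) + grad_shift
        beta = soft_threshold(beta - step * g, step * lam)
    return beta

# Toy usage: data split across 5 machines; beta_pilot would normally be a
# local penalized estimate rather than zero.
rng = np.random.default_rng(0)
p, n_per, m = 20, 200, 5
beta_true = np.zeros(p)
beta_true[:3] = 1.0
machines = []
for _ in range(m):
    X = rng.standard_normal((n_per, p))
    machines.append((X, X @ beta_true + rng.standard_normal(n_per)))
beta_hat = one_round(machines, beta_pilot=np.zeros(p), lam=0.05)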

Article Statistics & Probability

Multi-round smoothed composite quantile regression for distributed data

Fengrui Di et al.

Summary: This paper focuses on distributed estimation and inference for composite quantile regression, proposing a multi-round smoothed estimator and showing its efficiency through extensive numerical experiments on simulated and real data.

ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS (2022)

Article Statistics & Probability

High-dimensional quantile regression: Convolution smoothing and concave regularization

Kean Ming Tan et al.

Summary: This paper introduces and studies a convolution-type smoothed QR method with iteratively reweighted l(1) regularization, which achieves the optimal rate of convergence after a few iterations, as well as the oracle rate and the strong oracle property under an almost necessary and sufficient minimum signal strength condition (a brief sketch of the convolution-smoothed check loss follows this entry).

JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY (2022)
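
For concreteness, convolution-type smoothing replaces the non-differentiable check loss by its convolution with a kernel of bandwidth h, which makes first- and second-order optimization methods applicable. The sketch below uses a Gaussian kernel, for which the smoothed loss has a simple closed form, and a plain proximal-gradient fit; the kernel choice, bandwidth, and solver are illustrative assumptions rather than the iteratively reweighted l(1) procedure analyzed in the paper.

import numpy as np
from scipy.stats import norm

def check_loss(u, tau):
    # Standard quantile check loss: rho_tau(u) = u * (tau - 1{u < 0}).
    return u * (tau - (u < 0))

def smoothed_check_loss(u, tau, h):
    # Convolution of rho_tau with a Gaussian kernel of bandwidth h has the
    # closed form l_h(u) = h * phi(u / h) + u * (tau - Phi(-u / h)),
    # which recovers rho_tau(u) as h -> 0.
    return h * norm.pdf(u / h) + u * (tau - norm.cdf(-u / h))

def smoothed_check_grad(u, tau, h):
    # Derivative of l_h in u: tau - Phi(-u / h); smooth everywhere.
    return tau - norm.cdf(-u / h)

def fit_smoothed_qr(X, y, tau=0.5, h=0.3, lam=0.1, step=0.1, n_iter=500):
    # Proximal gradient descent on the smoothed, l1-penalized quantile
    # objective (1 / n) * sum_i l_h(y_i - x_i' beta) + lam * ||beta||_1.
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        g = -X.T @ smoothed_check_grad(r, tau, h) / n
        z = beta - step * g
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta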

Article Economics

Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors

Dongxiao Han et al.

Summary: We propose a robust post-selection inference method based on the Huber loss for regression coefficients in high-dimensional linear models with heavy-tailed and asymmetric error distributions. The method demonstrates desirable properties and is applicable to scenarios with heteroscedasticity. Simulation studies and an application to genomic data validate its performance.

JOURNAL OF ECONOMETRICS (2022)

Article Mathematical & Computational Biology

Communication-efficient estimation and inference for high-dimensional quantile regression based on smoothed decorrelated score

Fengrui Di et al.

Summary: Distributed estimation based on different sources of observations has attracted attention in modern statistical learning. This article focuses on distributed estimation and inference for a preconceived low-dimensional parameter vector in the high-dimensional quantile regression model with small local sample size. Two communication-efficient estimators are proposed by generalizing the decorrelated score approach and adopting smoothing techniques based on multiround algorithms, with risk bounds and limiting distributions provided. The performance of the proposed estimators is studied through simulations and an application to a gene expression dataset is presented.

STATISTICS IN MEDICINE (2022)
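
The decorrelated-score idea referenced above can be stated in one display. For a low-dimensional parameter of interest theta and a high-dimensional nuisance vector gamma with loss ell, the ordinary score for theta is orthogonalized against the nuisance scores, so that first-order estimation error in a sparse plug-in estimate of gamma does not bias inference on theta. The notation is schematic, following the generic decorrelated-score construction rather than this paper's exact smoothed variant:

S(\theta, \gamma) \;=\; \nabla_{\theta}\,\ell(\theta, \gamma) \;-\; w^{\top} \nabla_{\gamma}\,\ell(\theta, \gamma),
\qquad
w \;=\; \bigl(\mathbb{E}\bigl[\nabla_{\gamma}\ell \,\nabla_{\gamma}\ell^{\top}\bigr]\bigr)^{-1} \mathbb{E}\bigl[\nabla_{\gamma}\ell \,\nabla_{\theta}\ell\bigr].

Tests and confidence intervals are then based on S evaluated at a regularized estimate of gamma, with w replaced by a lasso-type estimate of the projection vector.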

Article Economics

Smoothing Quantile Regressions

Marcelo Fernandes et al.

Summary: In linear quantile regression, smoothing the objective function leads to better performance in terms of mean squared error and accuracy, as well as the ability to estimate quantile density without being affected by the curse of dimensionality. Additionally, a rule of thumb for choosing the smoothing bandwidth is proposed to approximate the optimal bandwidth effectively. Simulation results confirm the effectiveness of the smoothed quantile regression estimator in finite samples.

JOURNAL OF BUSINESS & ECONOMIC STATISTICS (2021)

Article Computer Science, Interdisciplinary Applications

Robust communication-efficient distributed composite quantile regression and variable selection for massive data

Kangning Wang et al.

Summary: The paper introduces distributed composite quantile regression (CQR) for statistical analysis of massive data. By approximating the global CQR loss function on the first machine, communication costs are reduced and computational burdens are lowered. The new methods exhibit good robustness and efficiency in real data analysis.

COMPUTATIONAL STATISTICS & DATA ANALYSIS (2021)
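
For reference, the global CQR loss mentioned above is the composite quantile objective of Zou and Yuan (2008), cited at the end of this list: K check losses at equally spaced quantile levels share one slope vector but carry separate intercepts,

\sum_{k=1}^{K} \sum_{i=1}^{N} \rho_{\tau_k}\!\bigl(y_i - b_k - x_i^{\top}\beta\bigr),
\qquad
\rho_{\tau}(u) = u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\quad
\tau_k = \frac{k}{K + 1},

minimized jointly over the intercepts b_1, ..., b_K and the common slope beta. With distributed data the sum over i decomposes across machines, which is what the communication-efficient approximations above exploit.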

Article Computer Science, Artificial Intelligence

Smoothing quantile regression for a distributed system

Rong Jiang et al.

Summary: Quantile regression is a popular alternative to least squares regression, providing a comprehensive description of the response distribution and robustness against heavy-tailed error distributions. The proposed distributed estimators are both computationally and communication efficient, and theoretically shown to be as efficient as global estimators after a certain number of iterations without restrictions on the number of machines.

NEUROCOMPUTING (2021)

Article Mathematics, Applied

Communication-efficient estimation of high-dimensional quantile regression

Lei Wang et al.

ANALYSIS AND APPLICATIONS (2020)

Article Computer Science, Information Systems

Sparse Composite Quantile Regression in Ultrahigh Dimensions With Tuning Parameter Calibration

Yuwen Gu et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2020)

Article Statistics & Probability

Communication-Efficient Distributed Statistical Inference

Michael I. Jordan et al.

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION (2019)

Article Statistics & Probability

Distributed inference for quantile regression processes

Stanislav Volgushev et al.

ANNALS OF STATISTICS (2019)

Article Statistics & Probability

Quantile regression under memory constraint

Xi Chen et al.

ANNALS OF STATISTICS (2019)

Article Statistics & Probability

Distributed testing and estimation under sparse high dimensional models

Heather Battey et al.

ANNALS OF STATISTICS (2018)

Article Statistics & Probability

Composite quantile regression for massive datasets

Rong Jiang et al.

STATISTICS (2018)

Article Statistics & Probability

A general theory of hypothesis tests and confidence regions for sparse high dimensional models

Yang Ning et al.

ANNALS OF STATISTICS (2017)

Article Statistics & Probability

On asymptotically optimal confidence regions and tests for high-dimensional models

Sara Van de Geer et al.

ANNALS OF STATISTICS (2014)

Article Statistics & Probability

Confidence intervals for low dimensional parameters in high dimensional linear models

Cun-Hui Zhang et al.

JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY (2014)

Article Statistics & Probability

l1-penalized quantile regression in high-dimensional sparse models

Alexandre Belloni et al.

ANNALS OF STATISTICS (2011)

Article Statistics & Probability

Composite quantile regression and the oracle model selection theory

Hui Zou et al.

ANNALS OF STATISTICS (2008)