Article

Divide and conquer for accelerated failure time model with massive time-to-event data

Publisher

WILEY
DOI: 10.1002/cjs.11725

Keywords

Accelerated failure time model; adaptive LASSO; divide and conquer; oracle property; survival data


Abstract

Big data present new theoretical and computational challenges as well as tremendous opportunities in many fields. In health care research, we develop a novel divide-and-conquer (DAC) approach for massive right-censored data under the accelerated failure time model, where the sample size is extraordinarily large and the dimension of predictors is large but smaller than the sample size. Specifically, we construct a penalized loss function that approximates the weighted least squares loss function by combining unpenalized estimation results from all subsets. The resulting adaptive LASSO penalized DAC estimator enjoys the oracle property. Simulation studies demonstrate that the proposed DAC procedure performs well, reducing computation time while achieving performance comparable to estimation using the full data. Our proposed DAC approach is applied to a massive dataset from the Chinese Longitudinal Healthy Longevity Survey.
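To make the general procedure concrete, the sketch below illustrates one way such a DAC scheme can be organized: each subset yields an unpenalized weighted least squares fit of a log-linear AFT model, the subset fits are combined through a quadratic approximation of the loss, and an adaptive LASSO penalty is then applied to the combined objective. This is not the authors' implementation; the Kaplan-Meier inverse-probability-of-censoring weights, the combination rule, the coordinate-descent solver, and all tuning constants are assumptions chosen for illustration, and in practice the penalty level would be selected by a data-driven criterion (e.g., BIC), which the sketch omits.

import numpy as np

rng = np.random.default_rng(0)

def km_censoring_survival(y, delta):
    # Kaplan-Meier estimate of the censoring survival function G(t),
    # evaluated at each observed time y_i (censoring treated as the event).
    order = np.argsort(y)
    cens_sorted = 1.0 - delta[order]
    n = len(y)
    at_risk = n - np.arange(n)
    G_sorted = np.cumprod(1.0 - cens_sorted / at_risk)
    G = np.empty(n)
    G[order] = np.maximum(G_sorted, 1e-4)   # floor to avoid exploding weights
    return G

def subset_wls(X, logy, delta):
    # Unpenalized inverse-probability-of-censoring weighted least squares
    # on one subset; returns the information-like matrix and the estimate.
    w = delta / km_censoring_survival(logy, delta)
    A = X.T @ (w[:, None] * X)
    beta = np.linalg.solve(A, X.T @ (w * logy))
    return A, beta

def dac_adaptive_lasso(subsets, lam=5.0, gamma=1.0, n_iter=200):
    # Combine subset fits into a single quadratic approximation of the loss,
    # then minimize it plus an adaptive LASSO penalty by coordinate descent.
    A = sum(Ak for Ak, _ in subsets)
    beta_bar = np.linalg.solve(A, sum(Ak @ bk for Ak, bk in subsets))
    w = 1.0 / (np.abs(beta_bar) ** gamma + 1e-8)      # adaptive weights
    Am = A @ beta_bar
    beta = beta_bar.copy()
    for _ in range(n_iter):
        for j in range(len(beta)):
            cj = Am[j] - A[j] @ beta + A[j, j] * beta[j]
            beta[j] = np.sign(cj) * max(abs(cj) - lam * w[j] / 2.0, 0.0) / A[j, j]
    return beta_bar, beta

# Synthetic right-censored data from a log-linear AFT model.
n, p, K = 20000, 10, 10
beta_true = np.array([1.0, -0.8, 0.5] + [0.0] * (p - 3))
X = rng.standard_normal((n, p))
log_t = X @ beta_true + 0.5 * rng.standard_normal(n)
log_c = rng.normal(1.0, 1.0, n)                   # independent censoring times
logy = np.minimum(log_t, log_c)
delta = (log_t <= log_c).astype(float)

blocks = np.array_split(rng.permutation(n), K)    # divide step
subsets = [subset_wls(X[idx], logy[idx], delta[idx]) for idx in blocks]
beta_bar, beta_dac = dac_adaptive_lasso(subsets)  # conquer step
print("combined unpenalized estimate:", np.round(beta_bar, 2))
print("adaptive LASSO DAC estimate  :", np.round(beta_dac, 2))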

