Journal: Concurrency and Computation: Practice and Experience
Volume 35, Issue 4
Publisher: Wiley
DOI: 10.1002/cpe.7533
Keywords: KL estimator; mean squared error; M-estimator; multicollinearity; outlier; ridge regression
This article proposes a robust version of the Kibria-Lukman estimator (KLE) to address the problems of multicollinearity and outliers in regression models. The performance of the proposed method is evaluated through a Monte Carlo simulation and real-life data, demonstrating its superior performance in terms of mean squared error.
Abstract
To circumvent the problem of multicollinearity in regression models, a ridge-type estimator, named the Kibria-Lukman estimator (KLE), was recently proposed in the literature. The KLE has better properties than the conventional ridge regression estimator. However, the presence of outliers in the data set may adversely affect the KLE. To address this issue, the present article proposes a robust version of the KLE based on the M-estimator, along with several robust methods to estimate the shrinkage parameter k. A Monte Carlo simulation study and a real-life data set are used to gauge the performance of the proposed methods, with the mean squared error as the evaluation criterion. The numerical results demonstrate the superiority of the proposed estimator in the presence of outliers.
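The idea behind the estimators discussed above can be sketched in a few lines of Python. The KL closed form, beta_KL = (X'X + kI)^{-1}(X'X - kI)beta_hat, follows Kibria and Lukman (2020); the Huber weights, the MAD scale estimate, and the ridge-type rule for k used below are illustrative assumptions, not the article's proposed robust estimators of k.

```python
import numpy as np

# Sketch of the Kibria-Lukman (KL) estimator and a robust variant built
# on Huber M-estimation, under multicollinearity with outliers in y.
rng = np.random.default_rng(0)
n, p = 100, 4

# Strongly correlated predictors induce multicollinearity.
z = rng.standard_normal((n, p + 1))
rho = 0.99
X = np.sqrt(1 - rho**2) * z[:, :p] + rho * z[:, [p]]
beta_true = np.ones(p)
y = X @ beta_true + rng.standard_normal(n)
y[:5] += 10.0  # inject a few outliers in the response

def kl_estimator(XtX, beta_base, k):
    """KL estimator: (X'X + kI)^{-1} (X'X - kI) beta_base."""
    I = np.eye(XtX.shape[0])
    return np.linalg.solve(XtX + k * I, (XtX - k * I) @ beta_base)

def huber_m_estimator(X, y, c=1.345, n_iter=50):
    """M-estimate of beta via IRLS with Huber weights."""
    beta = np.linalg.solve(X.T @ X, X.T @ y)  # OLS start
    w = np.ones(len(y))
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale
        u = r / (c * s)
        w = np.where(np.abs(u) <= 1.0, 1.0, 1.0 / np.abs(u))
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta, w

XtX = X.T @ X
beta_ols = np.linalg.solve(XtX, X.T @ y)

# A common ridge-type shrinkage rule (Hoerl-Kennard-Baldwin style):
# k = p * sigma_hat^2 / (beta_hat' beta_hat).
sigma2 = np.sum((y - X @ beta_ols) ** 2) / (n - p)
k = p * sigma2 / (beta_ols @ beta_ols)
beta_kl = kl_estimator(XtX, beta_ols, k)

# Robust KL sketch: plug the M-estimate and Huber-weighted moments
# into the same KL closed form.
beta_m, w = huber_m_estimator(X, y)
XtWX = X.T @ (w[:, None] * X)
sigma2_m = np.sum(w * (y - X @ beta_m) ** 2) / (n - p)
k_m = p * sigma2_m / (beta_m @ beta_m)
beta_rkl = kl_estimator(XtWX, beta_m, k_m)

print("OLS      :", np.round(beta_ols, 3))
print("KL       :", np.round(beta_kl, 3))
print("robust KL:", np.round(beta_rkl, 3))
```

Because the KL operator has eigenvalues (lambda - k)/(lambda + k) with magnitude below one, the KL estimate is always shorter than the OLS vector, and the Huber weights downweight the injected outliers, which is the mechanism the robust version exploits.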