Article

Partial least trimmed squares regression

Publisher

ELSEVIER
DOI: 10.1016/j.chemolab.2021.104486

Keywords

Partial least squares; Least trimmed squares; SIMPLS; Robust PLS

Funding

  1. Natural Science Foundation of Zhejiang [LY21C200001]
  2. National Natural Science Foundation of China [62071386, 61805180]

Abstract

This paper proposes a robust PLS method based on the idea of least trimmed squares (LTS), preserving PLS's ability to handle high-dimensional regressors. By reformulating the LTS problem as a concave maximization problem, solving it becomes computationally tractable. Results on both simulated and real data sets demonstrate the effectiveness and robustness of the proposed approach.
Partial least squares (PLS) regression is a linear regression technique that plays an important role in dealing with high-dimensional regressors. Unfortunately, PLS is sensitive to outliers in the data and consequently produces a corrupted model. In this paper, we propose a robust method for PLS based on the idea of least trimmed squares (LTS), in which the objective is to minimize the sum of the h smallest squared residuals. However, solving an LTS problem is generally NP-hard. Inspired by the complementary idea of Sim and Hartley, we instead solve the inverse of the LTS problem and formulate it as a concave maximization problem, which is a convex optimization problem and can be solved in polynomial time. Classic PLS as well as two of the most efficient robust PLS methods, Partial Robust M (PRM) regression and RSIMPLS, are compared in this study. Results on both simulated and real data sets show the effectiveness and robustness of our approach.
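To make the trimming idea in the abstract concrete, the sketch below evaluates the LTS objective, the sum of the h smallest squared residuals, for a candidate coefficient vector on contaminated toy data. This is only an illustration of the criterion, not the concave-maximization algorithm or the PLS machinery proposed in the paper; the function name, the choice of h, and the toy data are assumptions made for this example.

```python
# Illustrative sketch (not the paper's algorithm): the least trimmed squares
# (LTS) objective for a candidate coefficient vector beta is the sum of the
# h smallest squared residuals, so up to n - h gross outliers are ignored.
import numpy as np

def lts_objective(X, y, beta, h):
    """Sum of the h smallest squared residuals for the fit y ~ X @ beta."""
    residuals = y - X @ beta
    smallest_h = np.sort(residuals ** 2)[:h]   # keep only the h best-fitted points
    return smallest_h.sum()

# Toy data: a clean linear trend with a few gross outliers added.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), np.linspace(0.0, 1.0, 50)])
y = X @ np.array([1.0, 2.0]) + 0.05 * rng.standard_normal(50)
y[:5] += 10.0                                  # contaminate 5 observations

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
h = 40                                         # trim the 10 worst residuals
print("full sum of squared residuals:", np.sum((y - X @ beta_ols) ** 2))
print("trimmed (LTS) objective      :", lts_objective(X, y, beta_ols, h))
```

Because the trimmed objective discards the largest residuals, a coefficient vector that fits only the clean majority of the data scores well even when a minority of observations are grossly corrupted, which is the robustness property the paper exploits.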
