Article

SPMoE: a novel subspace-projected mixture of experts model for multi-target regression problems

Journal

SOFT COMPUTING
Volume 20, Issue 5, Pages 2047-2065

Publisher

SPRINGER
DOI: 10.1007/s00500-015-1623-7

Keywords

Mixture of experts; Boosting; Subspace projection; Negative correlation learning; Multi-target regression

In this paper, we focus on modeling multi-target regression problems with high-dimensional feature spaces and a small number of instances, which are common in many real-life predictive modeling problems. With the aim of designing an accurate prediction tool, we present a novel mixture of experts (MoE) model called the subspace-projected MoE (SPMoE). The experts of the SPMoE are trained in a boosting-like manner by combining ideas from subspace projection and the negative correlation learning (NCL) algorithm. Instead of using the whole original input space for training the experts, we develop a new cluster-based subspace projection method that, at each step of the boosting procedure, obtains projected subspaces focused on the difficult instances, so that diverse experts are trained. The experts of the SPMoE are trained on the obtained subspaces using a new NCL algorithm called sequential NCL. The SPMoE is compared with other ensemble models on three real cases of high-dimensional multi-target regression: electrical discharge machining, energy efficiency, and an important problem in the field of operations strategy called the practice-performance problem. The experimental results show that the prediction accuracy of the SPMoE is significantly better than that of the other ensemble and single models, and that the SPMoE can be considered a promising alternative for modeling high-dimensional multi-target regression problems.
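The training loop described in the abstract (boosting-style focus on difficult instances, subspace projection per expert, and a negative-correlation penalty applied sequentially) can be sketched roughly as follows. This is an illustrative sketch under stated assumptions, not the paper's actual algorithm: plain PCA stands in for the cluster-based subspace projection, the experts are simple linear models trained by gradient descent, and all names (`fit_spmoe_sketch`, `predict_spmoe`) and hyperparameter values are hypothetical.

```python
import numpy as np

def fit_spmoe_sketch(X, Y, n_experts=3, subspace_dim=3, lam=0.3,
                     lr=0.05, epochs=300, seed=0):
    """Illustrative sketch only. Sequentially trains linear experts on
    PCA subspaces of the currently hardest instances, with a simple
    negative-correlation penalty against the already-trained experts.
    Plain PCA stands in for the paper's cluster-based projection."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = Y.shape[1]
    weights = np.full(n, 1.0 / n)      # boosting-style instance weights
    experts, preds = [], []
    for _ in range(n_experts):
        # project onto principal directions of the hardest instances
        idx = np.argsort(weights)[-max(n // 2, subspace_dim):]
        Xh = X[idx] - X[idx].mean(axis=0)
        _, _, Vt = np.linalg.svd(Xh, full_matrices=False)
        P = Vt[:subspace_dim].T        # d x subspace_dim projection
        Z = X @ P
        W = rng.normal(scale=0.1, size=(subspace_dim, m))
        for _ in range(epochs):
            f = Z @ W
            grad = 2.0 * (f - Y)       # gradient of squared error
            if preds:
                # sequential NCL: penalize correlation with the
                # ensemble formed by the earlier experts plus this one
                f_bar = (np.sum(preds, axis=0) + f) / (len(preds) + 1)
                grad -= 2.0 * lam * (f - f_bar)
            W -= lr * Z.T @ (weights[:, None] * grad)
        preds.append(Z @ W)
        experts.append((P, W))
        # reweight toward instances the current ensemble predicts poorly
        err = np.mean((np.mean(preds, axis=0) - Y) ** 2, axis=1)
        weights = err / err.sum()
    return experts

def predict_spmoe(experts, X):
    # uniform ensemble average over the experts
    return np.mean([X @ P @ W for P, W in experts], axis=0)
```

The NCL penalty term uses the standard regression form, in which subtracting `lam * (f - f_bar)` from the error gradient discourages each expert from agreeing too closely with the ensemble mean, trading a little individual accuracy for diversity.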
