Article

Mixture of feature specified experts

Journal

INFORMATION FUSION
Volume 20, Pages 242-251

Publisher

ELSEVIER
DOI: 10.1016/j.inffus.2014.02.006

Keywords

Combining classifiers; Diversity; Mixture of experts; BCI; EEG


Mixture of Experts is one of the most popular ensemble methods in pattern recognition systems. Although diversity among the experts is a necessary condition for the success of combining methods, ensemble systems based on Mixture of Experts often lack sufficient diversity among the experts because of unfavorable initial parameters. In the conventional Mixture of Experts, each expert receives the whole feature space. To increase diversity among the experts, resolve structural issues of Mixture of Experts such as the zero-coefficient problem, and improve the efficiency of the system, we propose a model, entitled Mixture of Feature Specified Experts, in which each expert receives a different subset of the original feature set. To this end, we first select a set of feature subsets that lead to a set of diverse and efficient classifiers. Then the initial parameters are instilled into the system by training classifiers on the selected feature subsets. Finally, we train the expert and gating networks using the learning rule of the classical Mixture of Experts, which organizes collaboration among the members of the system and aids the gating network in finding the best partitioning of the problem space. To evaluate the proposed method, we use six datasets from the UCI repository. In addition, the generalization capability of the proposed method is examined on a real-world database from an EEG-based Brain-Computer Interface. The performance of the method is evaluated with various appraisal criteria, and a significant improvement in recognition rate is observed in all practical tests. (C) 2014 Elsevier B.V. All rights reserved.
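As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below builds a mixture of experts in which each expert is a softmax classifier restricted to its own feature subset, while the gating network sees the full feature vector. Training follows the classical Mixture of Experts posterior-responsibility rule with plain gradient updates; the class name, learning rate, and initialization scheme are assumptions made for this example, and the feature subsets are taken as given rather than selected by the paper's diversity-based procedure.

```python
# Hypothetical sketch of a Mixture of Feature Specified Experts:
# each expert sees only its own feature subset; the gate sees all features.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class FeatureSpecifiedMoE:
    def __init__(self, subsets, n_features, n_classes, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.subsets = subsets  # list of feature-index arrays, one per expert
        # Per-expert linear softmax classifiers (+1 row for the bias term).
        self.experts = [rng.normal(0, 0.01, (len(s) + 1, n_classes))
                        for s in subsets]
        # Gating network operates on the full feature vector.
        self.gate = rng.normal(0, 0.01, (n_features + 1, len(subsets)))
        self.lr = lr

    @staticmethod
    def _bias(X):
        return np.hstack([X, np.ones((X.shape[0], 1))])

    def _forward(self, X):
        g = softmax(self._bias(X) @ self.gate)  # (n, m) gating weights
        p = np.stack([softmax(self._bias(X[:, s]) @ W)
                      for s, W in zip(self.subsets, self.experts)],
                     axis=1)                    # (n, m, c) expert outputs
        return g, p

    def fit(self, X, Y, epochs=200):
        # Y: one-hot labels. Classical MoE updates driven by the posterior
        # responsibility h_m = g_m * P_m(y|x) / sum_k g_k * P_k(y|x).
        for _ in range(epochs):
            g, p = self._forward(X)
            joint = g * np.einsum('nmc,nc->nm', p, Y)     # g_m * P_m(y|x)
            h = joint / joint.sum(axis=1, keepdims=True)  # responsibilities
            for m, (s, W) in enumerate(zip(self.subsets, self.experts)):
                # Responsibility-weighted softmax-regression gradient step.
                Xm = self._bias(X[:, s])
                W += self.lr * Xm.T @ (h[:, m:m+1] * (Y - p[:, m])) / len(X)
            # Gate update: move gating outputs toward the posteriors.
            self.gate += self.lr * self._bias(X).T @ (h - g) / len(X)

    def predict(self, X):
        g, p = self._forward(X)
        return np.einsum('nm,nmc->nc', g, p).argmax(axis=1)
```

In this sketch, initializing each expert by pre-training it on its own subset (the paper's second step) would amount to fitting each softmax classifier separately before calling fit; the joint phase then lets the gate learn which expert, and hence which feature view, to trust in each region of the input space.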
