Article

Evolutionary bagging for ensemble learning

Journal

NEUROCOMPUTING
Volume 510, Pages 1-14

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2022.08.055

Keywords

Ensemble learning; Bagging; Random forest; Evolutionary algorithms

Abstract

Ensemble learning has gained success in machine learning, with major advantages over other learning methods. Bagging is a prominent ensemble learning method that creates subgroups of data, known as bags, each of which is trained by an individual machine learning method such as a decision tree. Random forest is a prominent example of bagging with additional features in the learning process. Evolutionary algorithms are prominent for optimisation problems and have also been used in machine learning. They are gradient-free methods that work with a population of candidate solutions, maintaining diversity in order to create new solutions. In conventional bagged ensemble learning, the bags are created once, and their content, in terms of training examples, is fixed over the learning process. In our paper, we propose evolutionary bagged ensemble learning, in which we use evolutionary algorithms to evolve the content of the bags, iteratively enhancing the ensemble by providing diversity in the bags. The results show that our evolutionary bagging method outperforms conventional ensemble methods (bagging and random forests) on several benchmark datasets under certain constraints. We find that evolutionary bagging can inherently sustain a diverse set of bags without a reduction in accuracy. (C) 2022 Elsevier B.V. All rights reserved.
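The core idea in the abstract can be illustrated with a minimal, hypothetical sketch: bootstrap bags are trained by simple base learners, and the contents of the bags are then mutated, with a mutation kept only when it does not degrade ensemble accuracy. The decision stumps, toy dataset, mutation rate, and greedy acceptance rule below are illustrative assumptions only, not the paper's actual algorithm (which uses decision trees and a full evolutionary algorithm over the bags).

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy separable dataset (illustrative only; the paper evaluates on benchmark datasets)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)

def fit_stump(Xb, yb):
    """Best single-feature threshold classifier -- a minimal stand-in for a decision tree."""
    best = (0.0, 0, 0.0, 1)  # (accuracy, feature, threshold, polarity)
    for f in range(Xb.shape[1]):
        for t in np.quantile(Xb[:, f], np.linspace(0.1, 0.9, 9)):
            for pol in (1, -1):
                pred = (Xb[:, f] > t).astype(int)
                if pol == -1:
                    pred = 1 - pred
                acc = (pred == yb).mean()
                if acc > best[0]:
                    best = (acc, f, t, pol)
    return best[1:]

def predict_stump(model, X):
    f, t, pol = model
    pred = (X[:, f] > t).astype(int)
    return pred if pol == 1 else 1 - pred

def ensemble_accuracy(bags, X, y):
    """Train one stump per bag and majority-vote over the full dataset."""
    votes = np.mean([predict_stump(fit_stump(X[b], y[b]), X) for b in bags], axis=0)
    return ((votes >= 0.5).astype(int) == y).mean()

n, n_bags, bag_size = len(X), 5, 100
bags = [rng.integers(0, n, bag_size) for _ in range(n_bags)]  # conventional bootstrap bags
acc = ensemble_accuracy(bags, X, y)

for generation in range(20):
    i = int(rng.integers(n_bags))                 # pick one bag to mutate
    trial = bags[i].copy()
    swap = rng.integers(0, bag_size, bag_size // 5)
    trial[swap] = rng.integers(0, n, len(swap))   # replace ~20% of its training examples
    candidate = bags[:i] + [trial] + bags[i + 1:]
    cand_acc = ensemble_accuracy(candidate, X, y)
    if cand_acc >= acc:                           # greedy selection: keep the mutation if not worse
        bags, acc = candidate, cand_acc
```

The greedy accept-if-not-worse rule here loosely mirrors the selection pressure of an evolutionary algorithm while keeping the sketch short; a full evolutionary treatment, as in the paper, would maintain a population and recombine bag contents rather than mutating one bag at a time.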

