Article

OHDA: An Opposition based High Dimensional optimization Algorithm

Journal

APPLIED SOFT COMPUTING
Volume 91

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2020.106185

Keywords

High dimensional optimization; Evolutionary algorithms; Opposition-based learning; Constraint optimization

Abstract

Dealing with high dimensional data is one of the most challenging problems to date, and it is becoming more severe as data gathering tools progress. This paper proposes an opposition-based optimization algorithm suited to high dimensions. Its novelty is an angular movement applied along a few selected dimensions, which makes the search effective in high dimensional spaces. Accordingly, the Opposition-based High Dimensional optimization Algorithm (OHDA) is proposed. Its performance is studied on functions from CEC2005 in high dimensional settings, including 1000D and 2000D. In addition, the algorithm is tested on CEC2014, which is more complicated than CEC2005 and CEC2013, and its performance is also examined on the CEC2017 constrained optimization test suite. The comparison algorithms are CA, ICA, AAA, ABC, KH, MVO, WOA, RW-GWO, B-BBO, LX-BBO and LSHADE44-IEpsilon. The results verify that the proposed algorithm outperforms some conventional optimization algorithms in terms of accuracy. The efficiency of employing opposite points in optimization is also validated in this paper. (C) 2020 Elsevier B.V. All rights reserved.
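The abstract does not give OHDA's update equations, but the opposition-based learning principle it builds on is well established: for a candidate x inside a box [a, b], the opposite point is a + b - x, and evaluating both the candidate and its opposite lets the search keep the better of the two. The Python sketch below illustrates only this opposition step inside a toy random search; the population handling, the sphere fitness function (a stand-in for a CEC benchmark), and the omission of OHDA's angular movement along selected dimensions are assumptions for illustration, not the authors' implementation.

import numpy as np

def opposite(x, lower, upper):
    # Opposition-based learning: reflect x about the centre of [lower, upper].
    return lower + upper - x

def sphere(x):
    # Simple minimisation benchmark; stands in for a CEC test function.
    return np.sum(x ** 2)

def obl_search(dim=1000, pop_size=30, iters=200, lower=-100.0, upper=100.0, seed=0):
    # Toy opposition-based random search (illustrative only, not OHDA itself).
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))

    for _ in range(iters):
        # Perturb each individual, then also evaluate its opposite point.
        trial = np.clip(pop + rng.normal(scale=1.0, size=pop.shape), lower, upper)
        opp = opposite(trial, lower, upper)

        for cand in (trial, opp):
            # Greedy selection: replace an individual only if the candidate is better.
            better = np.array([sphere(c) < sphere(p) for c, p in zip(cand, pop)])
            pop[better] = cand[better]

    best = min(pop, key=sphere)
    return best, sphere(best)

if __name__ == "__main__":
    _, best_fit = obl_search(dim=100, iters=50)
    print(f"best fitness found: {best_fit:.3e}")

Evaluating the opposite of every trial point roughly doubles the chance of landing near the optimum early on, which is the efficiency argument the abstract refers to; OHDA additionally restricts its angular moves to a few selected dimensions to keep the search tractable at 1000D and beyond.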

