Journal
APPLIED SOFT COMPUTING
Volume 97, Issue -, Pages -
Publisher
ELSEVIER
DOI: 10.1016/j.asoc.2019.105510
Keywords
Hyper-heuristics; Meta-heuristics; Deep belief networks; Hyper-parameters; Optimisation
Deep Belief Networks (DBN) have become powerful tools for a wide range of applications. On complex tasks such as image reconstruction, DBN performance is highly sensitive to parameter settings. Manually trying out different parameters is tedious and time-consuming, yet it is often required in practice because few better options exist. This work proposes an evolutionary hyper-heuristic framework for the automatic parameter optimisation of DBN. The hyper-heuristic framework introduced here is the first of its kind in this domain. It comprises a high-level strategy and a pool of evolutionary operators, such as crossover and mutation, that generate DBN parameter settings by perturbing or modifying the current setting of a DBN. Providing a large set of operators could help form a more effective high-level strategy, but at the same time it enlarges the search space and thus makes a good strategy more difficult to form. To address this issue, a non-parametric statistical test is introduced to identify a subset of effective operators for different phases of the hyper-heuristic search. Three well-known image reconstruction datasets were used to evaluate the performance of the proposed framework. The results reveal that the proposed hyper-heuristic framework is very competitive compared to state-of-the-art methods. (C) 2019 Elsevier B.V. All rights reserved.
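The operator-pool idea described in the abstract can be sketched as follows: a DBN parameter setting is represented as a dictionary, low-level operators perturb it, and a high-level strategy gives more credit to operators that improve a fitness measure. This is a minimal illustrative sketch only; the operator names, the roulette-wheel credit strategy, and the toy fitness function are assumptions standing in for the paper's actual method (which trains a DBN and uses a non-parametric statistical test to select operators).

```python
import random

def mutate_lr(setting):
    # Perturb the learning rate by halving or doubling it (illustrative operator).
    t = dict(setting)
    t["learning_rate"] *= random.choice([0.5, 2.0])
    return t

def mutate_units(setting):
    # Perturb the number of hidden units by a fixed step (illustrative operator).
    t = dict(setting)
    t["hidden_units"] = max(8, t["hidden_units"] + random.choice([-16, 16]))
    return t

def toy_fitness(setting):
    # Stand-in for the reconstruction error of a trained DBN (lower is better);
    # the real framework would train and evaluate a DBN here.
    return abs(setting["learning_rate"] - 0.05) + abs(setting["hidden_units"] - 128) / 128

def hyper_heuristic(seed_setting, operators, iters=200):
    random.seed(0)  # deterministic for the sketch
    best, best_f = seed_setting, toy_fitness(seed_setting)
    credit = {op.__name__: 1.0 for op in operators}  # credit per operator
    for _ in range(iters):
        # High-level strategy: roulette-wheel selection over operator credit.
        total = sum(credit.values())
        r, acc = random.uniform(0, total), 0.0
        for op in operators:
            acc += credit[op.__name__]
            if r <= acc:
                break
        candidate = op(best)
        f = toy_fitness(candidate)
        if f < best_f:
            best, best_f = candidate, f
            credit[op.__name__] += 1.0  # reward operators that improve fitness
    return best, best_f

best, err = hyper_heuristic({"learning_rate": 0.4, "hidden_units": 64},
                            [mutate_lr, mutate_units])
print(best, err)
```

A crossover operator would additionally require a population of settings rather than a single incumbent, and the paper's operator-subset selection would periodically prune this pool using a non-parametric test over each operator's observed improvements.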