Journal
INFORMATION SCIENCES
Volume: 547, Issue: -, Pages: 887-909
Publisher: ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.08.046
Keywords
General fuzzy min-max neural network; Novel hyperbox selection; Online learning; Agglomerative learning; Accelerated learning algorithms
Funding
- UTS-FEIT
This paper proposes a method to accelerate the training process of the general fuzzy min-max neural network. The aim is to reduce the number of unsuitable hyperboxes selected as potential candidates, either for the expansion step of existing hyperboxes to cover a new input pattern in the online learning algorithms, or for the hyperbox aggregation process in the agglomerative learning algorithms. Our approach is based on mathematical formulas that form a new solution for removing hyperboxes which are certain not to satisfy the expansion or aggregation conditions, and in turn decrease the training time of the learning algorithms. The efficiency of the proposed method is assessed on a number of widely used data sets. The experimental results indicate a significant decrease in the training time of the proposed approach for both online and agglomerative learning algorithms. Notably, the training time of the online learning algorithms is reduced by a factor of 1.2 to 12 when using the proposed method, while the agglomerative learning algorithms are accelerated by a factor of 7 to 37 on average. (C) 2020 Elsevier Inc. All rights reserved.
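To illustrate the candidate-filtering idea described in the abstract, the sketch below shows one way to prune hyperboxes that are certain to fail the standard GFMM expansion test (each dimension of the expanded box must not exceed a size limit theta) before any costlier membership computation. This is a minimal illustration only; the function name, array layout, and the specific pruning formulas are assumptions and do not reproduce the paper's actual derivation.

```python
import numpy as np

def filter_expansion_candidates(V, W, x, theta):
    """Keep only hyperboxes that could satisfy the GFMM expansion
    condition after absorbing the input pattern x.

    V, W  : (n_boxes, n_dims) arrays of hyperbox min/max points.
    x     : (n_dims,) input pattern in [0, 1]^n_dims.
    theta : maximum allowed hyperbox size per dimension.

    A hyperbox B_i can expand to cover x only if, for every
    dimension j: max(w_ij, x_j) - min(v_ij, x_j) <= theta.
    Boxes violating this in any dimension are certain to be
    rejected, so they are dropped from the candidate set.
    """
    new_w = np.maximum(W, x)   # max points after hypothetical expansion
    new_v = np.minimum(V, x)   # min points after hypothetical expansion
    ok = np.all(new_w - new_v <= theta, axis=1)
    return np.nonzero(ok)[0]   # indices of viable candidate hyperboxes

# Example: only the first hyperbox can expand to cover x within theta.
V = np.array([[0.1, 0.1], [0.6, 0.6]])
W = np.array([[0.2, 0.2], [0.9, 0.9]])
x = np.array([0.15, 0.25])
print(filter_expansion_candidates(V, W, x, theta=0.3))  # [0]
```

Filtering of this kind can be vectorized over all hyperboxes at once, which is one reason removing certain-to-fail candidates early reduces the per-pattern cost of online learning.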