Journal
IEEE TRANSACTIONS ON FUZZY SYSTEMS
Volume 28, Issue 12, Pages 3065-3075
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TFUZZ.2020.2967282
Keywords
Training; Fuzzy systems; Neural networks; Convergence; Optimization; Evolutionary computation; Standards; Batch normalization (BN); minibatch gradient descent; Takagi-Sugeno-Kang (TSK) fuzzy classifier; uniform regularization (UR)
Funding
- National Natural Science Foundation of China [61873321]
- Technology Innovation Project of Hubei Province of China [2019AEA171]
Abstract
Takagi-Sugeno-Kang (TSK) fuzzy systems are flexible and interpretable machine learning models; however, they are difficult to optimize when the data size is large and/or the data dimensionality is high. This article proposes a minibatch gradient descent (MBGD) based algorithm to efficiently and effectively train TSK fuzzy classifiers. It integrates two novel techniques: first, uniform regularization (UR), which forces the rules to have similar average contributions to the output, thereby improving the generalization performance of the TSK classifier; and second, batch normalization (BN), which extends BN from deep neural networks to TSK fuzzy classifiers to expedite convergence and improve generalization performance. Experiments on 12 UCI datasets from various application domains, with varying size and dimensionality, demonstrated that UR and BN are effective individually, and that integrating them further improves classification performance.
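The abstract's two techniques can be illustrated with a minimal NumPy sketch of a first-order TSK model. This is not the authors' implementation: the placement of BN on the rule consequents and the exact form of the UR penalty (pushing each rule's average normalized firing level toward 1/R) are assumptions made for illustration; all function and variable names are hypothetical.

```python
import numpy as np

def tsk_forward(X, centers, sigmas, W, b, eps=1e-12):
    """Forward pass of a first-order TSK model with Gaussian membership functions.
    X: (N, D) inputs; centers, sigmas: (R, D) antecedent parameters;
    W: (R, D), b: (R,) consequent parameters.
    Returns outputs (N,) and normalized firing levels (N, R)."""
    # Firing level of each rule: product of per-dimension Gaussian memberships
    diff = X[:, None, :] - centers[None, :, :]                 # (N, R, D)
    f = np.exp(-0.5 * np.sum((diff / sigmas) ** 2, axis=2))    # (N, R)
    f_bar = f / (f.sum(axis=1, keepdims=True) + eps)           # normalized firing
    y_r = X @ W.T + b                                          # rule consequents (N, R)
    # Batch normalization of the rule consequents over the minibatch
    # (illustrative placement, an assumption of this sketch)
    y_r = (y_r - y_r.mean(axis=0)) / (y_r.std(axis=0) + eps)
    y = np.sum(f_bar * y_r, axis=1)                            # weighted rule outputs
    return y, f_bar

def uniform_regularization(f_bar):
    """UR penalty sketch: penalize deviation of each rule's average
    contribution from the uniform value 1/R."""
    R = f_bar.shape[1]
    avg = f_bar.mean(axis=0)                                   # (R,)
    return float(np.sum((avg - 1.0 / R) ** 2))
```

In training, the UR term would be added to the cross-entropy loss with a tunable weight, and all parameters updated by MBGD; the penalty is zero exactly when every rule contributes 1/R on average over the minibatch.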