Article

Optimize TSK Fuzzy Systems for Classification Problems: Minibatch Gradient Descent With Uniform Regularization and Batch Normalization

Journal

IEEE TRANSACTIONS ON FUZZY SYSTEMS
Volume 28, Issue 12, Pages 3065-3075

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TFUZZ.2020.2967282

Keywords

Training; Fuzzy systems; Neural networks; Convergence; Optimization; Evolutionary computation; Standards; Batch normalization (BN); minibatch gradient descent; Takagi-Sugeno-Kang (TSK) fuzzy classifier; uniform regularization (UR)

Funding

  1. National Natural Science Foundation of China [61873321]
  2. Technology Innovation Project of Hubei Province of China [2019AEA171]

Abstract

Takagi-Sugeno-Kang (TSK) fuzzy systems are flexible and interpretable machine learning models; however, they may not be easily optimized when the data size is large, and/or the data dimensionality is high. This article proposes a minibatch gradient descent (MBGD) based algorithm to efficiently and effectively train TSK fuzzy classifiers. It integrates two novel techniques: First, uniform regularization (UR), which forces the rules to have similar average contributions to the output, and hence to increase the generalization performance of the TSK classifier; and, second, batch normalization (BN), which extends BN from deep neural networks to TSK fuzzy classifiers to expedite the convergence and improve the generalization performance. Experiments on 12 UCI datasets from various application domains, with varying size and dimensionality, demonstrated that UR and BN are effective individually, and integrating them can further improve the classification performance.
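The uniform regularization idea described in the abstract — penalizing rules whose average contribution to the output deviates from the others — can be sketched in a few lines. The code below is a minimal illustrative implementation, not the authors' exact formulation: the function names (`tsk_forward`, `uniform_regularization`) and the choice of a squared-deviation penalty toward the uniform level 1/R are assumptions made for the example.

```python
import numpy as np

def tsk_forward(X, centers, sigmas, W, b):
    """Forward pass of a first-order TSK fuzzy system.

    X: (n, d) inputs; centers, sigmas: (R, d) Gaussian membership params;
    W: (R, d), b: (R,) linear consequent parameters.
    Returns (outputs (n,), normalized firing levels (n, R)).
    """
    # Gaussian memberships combined across dimensions by the product t-norm,
    # computed in the log domain for numerical stability.
    diff = X[:, None, :] - centers[None, :, :]            # (n, R, d)
    log_fire = -0.5 * np.sum((diff / sigmas) ** 2, axis=2)  # (n, R)
    fire = np.exp(log_fire - log_fire.max(axis=1, keepdims=True))
    fire_norm = fire / fire.sum(axis=1, keepdims=True)    # rows sum to 1
    rule_out = X @ W.T + b                                # (n, R) rule consequents
    return np.sum(fire_norm * rule_out, axis=1), fire_norm

def uniform_regularization(fire_norm):
    """UR-style penalty: push each rule's average normalized firing level
    on the minibatch toward 1/R, so all rules contribute similarly."""
    R = fire_norm.shape[1]
    mean_fire = fire_norm.mean(axis=0)                    # avg contribution per rule
    return np.sum((mean_fire - 1.0 / R) ** 2)
```

In MBGD training, this penalty would be added (with a weight) to the cross-entropy loss on each minibatch; a rule that dominates every sample raises the penalty, which is what drives the generalization benefit the abstract reports.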
