3.8 Proceedings Paper

Enhancing the Learning of Interval Type-2 Fuzzy Classifiers with Knowledge Distillation

Publisher

IEEE
DOI: 10.1109/FUZZ45933.2021.9494471

Keywords

Fuzzy classification; interval type-2 fuzzy sets; fuzzy logic systems; deep learning; knowledge distillation

Funding

  1. Scientific and Technological Research Council of Turkey (TUBITAK) [118E807]


This study introduces a DL-based approach with knowledge distillation that transfers the generalizability of deep models into IT2-FLSs for enhanced learning performance. By combining DL and IT2-FLSs, the approach addresses the challenges posed by large input sizes and high rule counts, improving the learning of fuzzy logic systems.
Fuzzy Logic Systems (FLSs), especially Interval Type-2 (IT2) ones, are proven to achieve good results in various tasks, including classification problems. However, IT2-FLSs suffer from the curse of dimensionality, just like their Type-1 (T1) counterparts, as well as from training complexity, since IT2-FLSs have a larger number of learnable parameters than T1-FLSs. Deep learning (DL) architectures, on the other hand, can handle large learnable parameter sets with good generalizability but have their own disadvantages. In this study, we present a DL-based approach with knowledge distillation for IT2-FLSs that transfers the generalizability of deep models into the IT2-FLS and significantly improves its learning performance by eliminating the problems that may arise from large input sizes and high rule counts. We present the proposed approach in detail, together with parameterization tricks, so that the training of the IT2-FLS can be accomplished straightforwardly within widely employed DL frameworks without violating the definitions of IT2-FSs. We provide a comparative analysis to show the benefits of including knowledge distillation in the learning of IT2-FLSs with respect to rule number and input dimension size.
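The abstract refers to knowledge distillation and to parameterization tricks that keep the IT2-FS definitions valid during gradient-based training, without giving formulas. A minimal NumPy sketch of the two ideas follows; the distillation loss is the standard Hinton-style formulation, and the reparameterization shown is one illustrative assumption (not necessarily the authors' exact construction) for keeping the lower membership-function spread below the upper one:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T yields softer class probabilities."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style KD loss: alpha * hard-label CE + (1 - alpha) * T^2 * KL(teacher || student)."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return np.mean(alpha * hard + (1.0 - alpha) * (T ** 2) * kl)

def it2_sigmas(theta, phi):
    """Map unconstrained params to (sigma_lower, sigma_upper) with 0 < lower <= upper,
    so unconstrained gradient updates cannot violate the IT2-FS definition."""
    upper = np.log1p(np.exp(theta))        # softplus keeps sigma_upper positive
    lower = upper / (1.0 + np.exp(-phi))   # sigmoid scaling keeps sigma_lower <= sigma_upper
    return lower, upper
```

In this reading, the teacher is the pretrained deep model and the student is the IT2-FLS classifier; the reparameterization lets both be optimized with ordinary unconstrained gradient descent inside a DL framework.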

