Article

A storage-efficient ensemble classification using filter sharing on binarized convolutional neural networks

Journal

PeerJ Computer Science
Volume 8

Publisher

PeerJ Inc.
DOI: 10.7717/peerj-cs.924

Keywords

Binarized neural network; Convolutional neural network; Image classification; Ensemble-based system

Funding

  1. IC Design Education Center (IDEC), Korea
  2. Ministry of Science and ICT
  3. NIPA
  4. National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) [2021R1F1A1048054]

Abstract

This paper proposes a storage-efficient ensemble classification method to improve the classification accuracy of binary neural networks by sharing filters from a trained convolutional neural network model. Experimental results show that the method demonstrates high scalability and effectiveness on CIFAR datasets.
This paper proposes a storage-efficient ensemble classification scheme to overcome the low inference accuracy of binary neural networks (BNNs). When sufficient external power is available in a dynamically powered system, classification results can be enhanced by aggregating the outputs of multiple BNN classifiers. However, the memory required to store multiple classifiers is a significant burden in a lightweight system. Instead of adopting fully independent classifiers, the proposed scheme shares the filters of a trained convolutional neural network (CNN) model among the binarized CNNs to reduce their storage requirements. While these filters are shared and kept frozen, the proposed method trains only the unfrozen learnable parameters in the retraining step. We compare and analyze the performance of the proposed ensemble-based systems across various ensemble types and BNN structures on the CIFAR datasets. Our experiments conclude that the proposed filter-sharing method scales with the number of classifiers and effectively enhances classification accuracy. With binarized ResNet-20 and ReActNet-10 on the CIFAR-100 dataset, the proposed scheme achieves 56.74% and 70.29% Top-1 accuracy with 10 BNN classifiers, an improvement of 7.6% and 3.6% over a single BNN classifier.
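The sketch below illustrates the core idea described in the abstract: an ensemble of binarized classifiers that reuse one set of trained convolutional filters, keep only the remaining parameters trainable for the retraining step, and aggregate their outputs at inference. It is a minimal PyTorch sketch, not the authors' implementation; the sign-based binarization with a straight-through estimator, the toy backbone standing in for binarized ResNet-20 or ReActNet-10, the choice of batch-norm and classifier layers as the unfrozen parameters, and softmax averaging as the aggregation rule are all illustrative assumptions.

# Minimal PyTorch sketch of the filter-sharing ensemble idea; the binarization
# rule, the toy backbone, and the choice of trainable parameters are
# illustrative assumptions, not the authors' exact configuration.
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through gradient estimator."""

    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        # Pass gradients only where |w| <= 1 (standard STE clipping).
        return grad_out * (w.abs() <= 1).float()


class BinaryConv2d(nn.Conv2d):
    """Convolution whose weights are binarized on the forward pass."""

    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        return F.conv2d(x, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)


class TinyBinaryCNN(nn.Module):
    """Toy binarized backbone standing in for binarized ResNet-20 / ReActNet-10."""

    def __init__(self, num_classes=100):
        super().__init__()
        self.conv1 = BinaryConv2d(3, 64, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.conv2 = BinaryConv2d(64, 128, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(128)
        self.fc = nn.Linear(128, num_classes)

    def forward(self, x):
        x = F.relu(self.bn1(self.conv1(x)))
        x = F.max_pool2d(x, 2)
        x = F.relu(self.bn2(self.conv2(x)))
        x = F.adaptive_avg_pool2d(x, 1).flatten(1)
        return self.fc(x)


def build_filter_sharing_ensemble(trained_model, num_classifiers):
    """Build ensemble members that reuse the trained model's conv filters.

    Only one copy of the shared (frozen) filters needs to be stored; each
    member keeps its own batch-norm and classifier parameters, which are
    the unfrozen parameters retrained afterwards.
    """
    members = nn.ModuleList()
    for _ in range(num_classifiers):
        member = copy.deepcopy(trained_model)
        for name, module in member.named_modules():
            if isinstance(module, BinaryConv2d):
                # Point at the shared filter tensor instead of a private copy.
                module.weight = trained_model.get_submodule(name).weight
        members.append(member)
    # Freeze the shared filters; everything else stays trainable.
    for module in trained_model.modules():
        if isinstance(module, BinaryConv2d):
            module.weight.requires_grad_(False)
    return members


def ensemble_predict(members, x):
    """Aggregate member outputs; averaged softmax probabilities are assumed."""
    probs = torch.stack([F.softmax(m(x), dim=1) for m in members])
    return probs.mean(dim=0).argmax(dim=1)


# Hypothetical usage:
#   base = TinyBinaryCNN(num_classes=100)  # assume this has been trained
#   members = build_filter_sharing_ensemble(base, num_classifiers=10)
#   ... retrain only parameters with requires_grad=True in each member ...
#   preds = ensemble_predict(members, torch.randn(8, 3, 32, 32))

Under this kind of sharing, the ensemble stores roughly one copy of the convolutional filters plus a small set of per-member parameters rather than ten full models, which is where the storage saving described in the abstract would come from; the exact savings depend on which layers the authors actually share, a detail not specified here.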

