Article

Deep Neural Network Compression for Plant Disease Recognition

Journal

SYMMETRY-BASEL
Volume 13, Issue 10

Publisher

MDPI
DOI: 10.3390/sym13101769

Keywords

deep neural networks; plant disease recognition; network pruning; knowledge distillation; model quantization

Funding

  1. Key Research and Development Project of Anhui Province [1804a07020108, 201904a06020056, 202104a06020012]
  2. Independent Project of Anhui Key Laboratory of Smart Agricultural Technology and Equipment [APKLSATE2019X001]
  3. Ministry of Agriculture Agricultural Internet of Things Technology Integration and Application Key Laboratory Open Fund [2016KL05]
  4. Major Science and Technology Special Plan of Anhui Province [17030701049]
  5. Major Project of Natural Science Research in Universities of Anhui Province [KJ2019ZD20]


This paper proposes a DNN-based compression method that reduces the computational burden and model size through lightweight fully connected layers, pruning, knowledge distillation, and quantization. Experiments show that the compressed model can be reduced to 0.04 MB while retaining 97.09% accuracy, demonstrating the effectiveness of knowledge distillation and the efficiency of compressed models over prevalent lightweight models.
Deep neural networks (DNNs) have become the de facto standard for image recognition tasks, and their applications to plant diseases have also obtained remarkable results. However, the large number of parameters and high computational complexity of these network models make them difficult to deploy on farms in remote areas. In this paper, focusing on the problems of resource constraints and plant diseases, we propose a DNN-based compression method. To reduce the computational burden, this method uses lightweight fully connected layers to accelerate inference, pruning to remove redundant parameters and reduce multiply-accumulate operations, knowledge distillation instead of retraining to restore the lost accuracy, and quantization to compress the model size further. After compressing the mainstream VGGNet and AlexNet models, the compressed versions are applied to the Plant Village dataset of plant disease images, and a performance comparison of the models before and after compression is used to verify the proposed method. The results show that the model can be compressed to 0.04 MB with an accuracy of 97.09%. The experiment also proves the effectiveness of knowledge distillation during the pruning process, and the compressed models are more efficient than prevalent lightweight models.
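The three compression steps named in the abstract (magnitude pruning, knowledge distillation, and quantization) can be sketched in NumPy as below. This is a minimal illustration of the general techniques, not the paper's actual implementation; the sparsity ratio, bit width, and distillation temperature are illustrative assumptions.

```python
import numpy as np

def prune_weights(w, sparsity=0.5):
    """Magnitude pruning: zero the smallest-magnitude fraction of weights."""
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    threshold = np.sort(np.abs(w), axis=None)[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def quantize_weights(w, bits=8):
    """Uniform symmetric quantization to signed integers; returns ints + scale."""
    qmax = 2 ** (bits - 1) - 1
    max_abs = float(np.max(np.abs(w)))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def softmax(z, T=1.0):
    """Temperature-scaled softmax used to produce soft targets."""
    z = np.asarray(z, dtype=np.float64) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Cross-entropy between the teacher's and student's softened outputs."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return float(-np.sum(p_teacher * np.log(p_student + 1e-12)) * T * T)

# Toy layer: prune half the weights, then quantize the survivors to 8 bits.
w = np.array([[0.9, -0.05], [0.02, -0.7]], dtype=np.float32)
pruned = prune_weights(w, sparsity=0.5)
q, scale = quantize_weights(pruned, bits=8)
restored = q.astype(np.float32) * scale
```

In this pipeline, distillation replaces retraining after pruning: the pruned student is trained against the soft outputs of the original (teacher) network, which carries more information per example than hard labels and so recovers the lost accuracy faster.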

