Article

Deep Neural Network Compression for Plant Disease Recognition

Journal

SYMMETRY-BASEL
Volume 13, Issue 10, Pages -

Publisher

MDPI
DOI: 10.3390/sym13101769

Keywords

deep neural networks; plant disease recognition; network pruning; knowledge distillation; model quantization

Funding

  1. Key Research and Development Project of Anhui Province [1804a07020108, 201904a06020056, 202104a06020012]
  2. Independent Project of Anhui Key Laboratory of Smart Agricultural Technology and Equipment [APKLSATE2019X001]
  3. Ministry of Agriculture Agricultural Internet of Things Technology Integration and Application Key Laboratory Open Fund [2016KL05]
  4. Major Science and Technology Special Plan of Anhui Province [17030701049]
  5. Major Project of Natural Science Research in Universities of Anhui Province [KJ2019ZD20]


The paper proposes a DNN-based compression method that reduces computational burden and model size through lightweight fully connected layers, pruning, knowledge distillation, and quantization. Experiments demonstrate that the compressed model can be reduced to 0.04 Mb with an accuracy of 97.09%, proving the effectiveness of knowledge distillation and the efficiency of the compressed models over prevalent lightweight models.
Deep neural networks (DNNs) have become the de facto standard for image recognition tasks, and their applications with respect to plant diseases have also obtained remarkable results. However, the large number of parameters and high computational complexity of these network models make them difficult to deploy on farms in remote areas. In this paper, focusing on the problems of resource constraints and plant diseases, we propose a DNN-based compression method. In order to reduce computational burden, this method uses lightweight fully connected layers to accelerate inference, pruning to remove redundant parameters and reduce multiply-accumulate operations, knowledge distillation instead of retraining to restore the lost accuracy, and then quantization to compress the size of the model further. After compressing the mainstream VGGNet and AlexNet models, the compressed versions are applied to the Plant Village dataset of plant disease images, and a performance comparison of the models before and after compression is obtained to verify the proposed method. The results show that the model can be compressed to 0.04 Mb with an accuracy of 97.09%. This experiment also proves the effectiveness of knowledge distillation during the pruning process, and compressed models are more efficient than prevalent lightweight models.
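Two of the compression steps named in the abstract, magnitude-based pruning and quantization, can be illustrated with a minimal NumPy sketch. This is a generic illustration, not the authors' implementation: the 90% sparsity level and uniform 8-bit quantization are illustrative assumptions, and the paper's actual pruning criterion and bit-width may differ.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (unstructured pruning)."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def quantize_uint8(weights):
    """Uniform affine quantization of float weights to 8-bit integers."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0
    if scale == 0.0:
        scale = 1.0
    q = np.round((weights - w_min) / scale).astype(np.uint8)
    return q, scale, w_min

def dequantize(q, scale, w_min):
    """Recover approximate float weights from the quantized representation."""
    return q.astype(np.float32) * scale + w_min

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)
w_pruned = magnitude_prune(w, sparsity=0.9)    # keep only the largest 10% of weights
q, scale, zero = quantize_uint8(w_pruned)      # uint8 storage: 4x smaller than float32
w_restored = dequantize(q, scale, zero)
print("sparsity:", float(np.mean(w_pruned == 0)))
print("max quantization error:", float(np.abs(w_restored - w_pruned).max()))
```

In practice the pruned model is then fine-tuned (here, via knowledge distillation per the abstract) before quantization, since pruning alone costs accuracy.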
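The abstract's use of knowledge distillation in place of retraining follows the standard teacher–student setup, where the small model is trained to match the large model's temperature-softened outputs. A hypothetical NumPy sketch of the commonly used distillation loss (soft-target KL blended with hard-label cross-entropy); the temperature, weighting, and 38-class output size (matching the usual PlantVillage split) are illustrative assumptions, not the paper's reported settings:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """alpha * KL(teacher || student at temperature T) + (1 - alpha) * hard CE."""
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    # Soft term scaled by T^2 to keep gradient magnitudes comparable across temperatures
    soft = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean()
    soft *= temperature ** 2
    # Hard term: ordinary cross-entropy against the ground-truth labels
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft + (1 - alpha) * hard

rng = np.random.default_rng(1)
teacher_logits = rng.normal(size=(8, 38))          # 38 classes, as in PlantVillage
student_logits = rng.normal(size=(8, 38))
labels = rng.integers(0, 38, size=8)
print("loss:", distillation_loss(student_logits, teacher_logits, labels))
```

The soft term vanishes when student and teacher agree exactly, which is what lets the pruned student recover the teacher's behavior without a full retraining run.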
