Article

Knowledge distillation in plant disease recognition

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 34, Issue 17, Pages 14287-14296

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-021-06882-y

Keywords

Plant disease recognition; Deep convolutional neural network; Knowledge distillation; Tiny mobileNet


This paper proposes a deep learning approach for recognizing plant diseases during their critical period. A client-server system is designed to consider both performance and accuracy, with a novel knowledge distillation technique improving the accuracy of the small client-side model. The experimental results show a significant improvement in the classification rate of the state-of-the-art tiny model by using this teacher-student idea.
Recognizing plant diseases and pests in their golden time is a highly critical problem to be addressed, since the herbalist can apply treatments within this period and save the agricultural product. In this paper, a deep learning approach to recognizing diseases from plant leaves is pursued. A client-server system is proposed in which the server-side model can leverage huge deep CNN architectures to classify the diseases, whereas the client-side model is chosen among small deep CNN architectures with a low number of parameters so that it can easily be deployed on end-user mobile devices with limited processing power. A novel knowledge distillation technique is leveraged that significantly improves the accuracy of the small client-side model: it distills the perceptual knowledge of the large model's classifier and transfers this knowledge to the small model so that it attains a similar prediction capability. By applying this idea to the PlantVillage dataset, we achieve 97.58% accuracy with a small MobileNet architecture, which is very close to the 99.73% accuracy of a large Xception model on the server. Through this teacher-student idea, we improve the classification rate of the state-of-the-art tiny model by 2.12%.
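The teacher-student idea described above can be sketched with a standard Hinton-style distillation loss: the student is trained to match the teacher's temperature-softened output distribution in addition to the hard ground-truth label. The abstract does not state the paper's exact loss formulation or hyperparameters, so the temperature and weighting below are illustrative assumptions, shown here in plain Python for clarity.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax over raw logits; a temperature > 1 softens the distribution,
    exposing the teacher's 'dark knowledge' about similar classes."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.7):
    """Weighted sum of (a) KL divergence between the teacher's and the
    student's softened distributions, scaled by T^2, and (b) ordinary
    cross-entropy with the hard label. alpha and temperature are
    illustrative values, not the paper's."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student))
    hard_probs = softmax(student_logits)          # T = 1 for the hard-label term
    ce = -math.log(hard_probs[true_label])
    return alpha * temperature ** 2 * kl + (1 - alpha) * ce
```

In practice the teacher here would be the large server-side Xception model and the student the small MobileNet; the soft-target term is what lets the tiny model approach the large model's accuracy.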
