4.7 Article

An effective automatic system deployed in agricultural Internet of Things using Multi-Context Fusion Network towards crop disease recognition in the wild

Journal

APPLIED SOFT COMPUTING
Volume 89

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2020.106128

Keywords

Multiclass crop disease recognition; Convolutional Neural Network; Internet of Things; Multi-Context Fusion Network; ContextNet

Funding

  1. National Key Technology R&D Program of China [2018YFD0200300]
  2. National Natural Science Foundation of China [31401293, 31671586, 61773360]
  3. Major Special Science and Technology Project of Anhui Province, China [201903a06020006]


Automatic crop disease recognition in the wild is a challenging topic in modern intelligent agriculture due to the appearance variance among crop diseases and the cluttered backgrounds of field images. To overcome these obstacles, popular methods design a Convolutional Neural Network (CNN) model that extracts visual features and identifies crop disease images based on those features. These methods work well in laboratory environments with simple backgrounds but achieve low accuracy and poor robustness when processing raw images captured in practical fields, which contain inevitable noise. In this context, the Internet of Things (IoT) is attracting increasing attention, offering many ways to collect high-level contextual information that helps modern recognition systems effectively identify crop diseases in the wild. Motivated by the usefulness of agricultural IoT, a deep learning system using a novel approach named Multi-Context Fusion Network (MCFN) is developed for deployment in agricultural IoT towards practical crop disease recognition in the wild. Our MCFN first adopts a standard CNN backbone to extract highly discriminative and robust visual features from over 50,000 in-field crop disease samples. Next, the presented ContextNet exploits contextual features collected from image acquisition sensors as prior information to assist crop disease classification and reduce false positives. Finally, a deep fully connected network is designed to fuse the visual and contextual features and output the crop disease prediction. Experimental results on 77 common crop diseases captured in our newly built domain-specific dataset show that MCFN with the deep fusion model outperforms state-of-the-art methods in wild crop disease recognition, achieving an identification accuracy of 97.5%. (C) 2020 Elsevier B.V. All rights reserved.
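The fusion stage described in the abstract (CNN visual features and ContextNet sensor features concatenated and passed through a deep fully connected classifier over 77 classes) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the feature dimensions (512-d visual, 16-d contextual), the single hidden layer of 256 units, and the random weights standing in for trained parameters are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical dimensions: 512-d visual features from the CNN backbone,
# 16-d contextual features from IoT sensors; 77 disease classes per the paper.
VIS_DIM, CTX_DIM, HIDDEN, N_CLASSES = 512, 16, 256, 77

# Randomly initialised weights stand in for trained fusion-network parameters.
W1 = rng.normal(0.0, 0.01, (VIS_DIM + CTX_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.01, (HIDDEN, N_CLASSES))
b2 = np.zeros(N_CLASSES)

def fusion_head(visual_feat, context_feat):
    """Concatenate visual and contextual features, then classify."""
    fused = np.concatenate([visual_feat, context_feat], axis=-1)
    h = relu(fused @ W1 + b1)            # deep fully connected layer
    return softmax(h @ W2 + b2)          # probabilities over 77 diseases

probs = fusion_head(rng.normal(size=VIS_DIM), rng.normal(size=CTX_DIM))
print(probs.shape)  # (77,)
```

The design choice illustrated is late fusion: each modality is encoded separately and the classifier sees the concatenated representation, so contextual cues can re-weight visually ambiguous predictions.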
