Article

Deep learning versus Object-based Image Analysis (OBIA) in weed mapping of UAV imagery

Journal

INTERNATIONAL JOURNAL OF REMOTE SENSING
Volume 41, Issue 9, Pages 3446-3479

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/01431161.2019.1706112

Keywords

-

Funding

  1. Guangdong Provincial Innovation Team for General Key Technologies in Modern Agricultural Industry [2019KJ133]
  2. Science and Technology Planning Project of Guangdong Province, China [2017A020208046]
  3. Leading Talents of Guangdong Province Program [2016LJ06G689]
  4. Science and Technology Planning Project of Guangdong Province [2019B020214003, 2018A050506073, 2016A020210100, 2017B010117010]
  5. 111 Project [D18019]
  6. Key Area Research and Development Planning Project of Guangdong Province [2019B020221001]
  7. National Key Research and Development Plan, China [2016YFD0200700]
  8. Science and Technology Planning Project of Guangzhou city, China [201707010047]

Abstract

Rice is one of the most important food crops in the world, so ensuring the quality and quantity of rice production is essential. During cultivation, weeds are a key factor affecting rice yields. In recent years, chemical control has become the most widely used means of controlling weed infestation because of its effectiveness and efficiency. However, excessive use of herbicides harms both rice quality and the environment. An accurate weed cover map can provide supporting information for site-specific weed management (SSWM), which can address the shortcomings of traditional chemical control. In this work, unmanned aerial vehicle (UAV) imagery was captured on four different dates over two different rice fields, and object-based image analysis (OBIA) and deep learning approaches were applied to weed mapping of the UAV imagery. For the OBIA methods, multiresolution segmentation and an improved k-means method were applied to segment the imagery into objects; colour and texture features were extracted and concatenated into a feature vector; and a back-propagation (BP) neural network, a support vector machine (SVM), and a random forest were used for classification. After careful hyperparameter optimization and model selection, the OBIA method achieved a mean intersection over union (MIU) of 66.6% on the testing set, with an inference time of 2343.5 ms per image sample. For the deep learning approach, a fully convolutional network (FCN) was applied to the pixel-wise classification task; transfer learning was used, and four pretrained convolutional neural networks (AlexNet, VGGNet, GoogLeNet, and ResNet) were transferred to our dataset by fine-tuning. The traditional skip architecture and a fully connected conditional random field (CRF) were used to improve the spatial detail of the FCN output; in addition, this work proposed a partially connected CRF as post-processing, which can significantly accelerate inference compared with the fully connected CRF. Beyond single improvement methods, hybrid improvement methods were also applied and tested. Experimental results showed that the VGGNet-based FCN achieved the highest accuracy; among the improvement methods, the skip architecture and the newly proposed partially connected CRF both improved accuracy effectively, and the hybrid method (skip architecture plus partially connected CRF) further improved performance, achieving 80.2% MIU on the testing set with an inference time of 326.8 ms per image sample. These results demonstrate that UAV remote sensing combined with deep learning can provide reliable supporting information for SSWM applications in rice fields.
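
As a concrete illustration of the reported metric, the sketch below computes mean intersection over union (MIU) over pixel-wise class maps. It is a minimal example, not the authors' evaluation code; the three-class label set (background, rice, weed) and the NumPy array inputs are assumptions made for illustration.

```python
import numpy as np

def mean_iou(pred, target, num_classes=3):
    """Mean intersection over union (MIU) for pixel-wise class maps.

    pred, target: integer arrays of identical shape with values in [0, num_classes).
    Classes absent from both maps are skipped so they do not distort the mean.
    """
    ious = []
    for c in range(num_classes):
        p = (pred == c)
        t = (target == c)
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue
        inter = np.logical_and(p, t).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x3 label maps (0 = background, 1 = rice, 2 = weed; assumed labels).
pred = np.array([[0, 1, 1], [2, 2, 0]])
gt   = np.array([[0, 1, 2], [2, 2, 0]])
print(mean_iou(pred, gt))  # ~0.722
```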

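The transfer-learning step described in the abstract (fine-tuning a pretrained backbone as an FCN for pixel-wise classification) could be sketched along the following lines. The snippet uses torchvision's FCN-ResNet50 as a stand-in for the paper's VGGNet-based FCN; the three output classes, the frozen backbone, and the optimizer settings are illustrative assumptions, not the authors' configuration.

```python
import torch
import torchvision

# Load an FCN with a ResNet-50 backbone pretrained on COCO (a stand-in for the
# VGGNet-based FCN used in the paper; requires torchvision >= 0.13).
model = torchvision.models.segmentation.fcn_resnet50(weights="DEFAULT")

# Replace the final 1x1 classifier convolution for 3 classes
# (e.g. background / rice / weed -- an assumed label set).
num_classes = 3
model.classifier[4] = torch.nn.Conv2d(512, num_classes, kernel_size=1)
if model.aux_classifier is not None:
    model.aux_classifier[4] = torch.nn.Conv2d(256, num_classes, kernel_size=1)

# Freeze the pretrained backbone and fine-tune only the new heads
# (one possible fine-tuning strategy, not necessarily the paper's).
for p in model.backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad],
    lr=1e-3, momentum=0.9,
)
criterion = torch.nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 256x256 RGB tiles.
images = torch.randn(2, 3, 256, 256)
labels = torch.randint(0, num_classes, (2, 256, 256))
out = model(images)["out"]            # (N, num_classes, H, W) logits
loss = criterion(out, labels)
loss.backward()
optimizer.step()
```

In practice the dummy batch would be replaced by tiles cut from the UAV orthoimagery with matching pixel-level annotations; the skip architecture and CRF post-processing discussed in the abstract would be applied on top of the raw FCN logits.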