Article

Root anatomy based on root cross-section image analysis with deep learning

Journal

COMPUTERS AND ELECTRONICS IN AGRICULTURE
Volume 175

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.compag.2020.105549

Keywords

Image analysis; Deep learning; Object detection; Faster R-CNN; Root anatomy

Abstract

Aboveground plant efficiency has improved significantly in recent years, and this improvement has led to a steady increase in global food production. Improving belowground plant efficiency has the potential to further increase food production. However, plant roots are harder to study, owing to the inherent challenges of root phenotyping. Several tools for identifying root anatomical features in root cross-section images have been proposed. However, existing tools are not fully automated and require significant human effort to produce accurate results. To address this limitation, we use a fully automated approach, specifically the Faster Region-based Convolutional Neural Network (Faster R-CNN), to identify anatomical traits in root cross-section images. By training Faster R-CNN models on root cross-section images, we can detect objects such as root, stele and late metaxylem, and predict rectangular bounding boxes around these objects. The bounding boxes can subsequently be used to estimate the root diameter, stele diameter, late metaxylem number and average late metaxylem diameter. Experimental evaluation using standard object detection metrics, such as intersection-over-union and mean average precision, has shown that Faster R-CNN models trained on rice root cross-section images can accurately detect root, stele and late metaxylem objects. Furthermore, the results have shown that measurements estimated from the predicted bounding boxes have a small root mean square error when compared with the corresponding ground-truth values, suggesting that Faster R-CNN can be used to accurately detect anatomical features. Finally, a comparison with Mask R-CNN, an instance segmentation approach, has shown that Faster R-CNN produces overall better results given a small training set. A web server for performing root anatomy analysis with the Faster R-CNN models trained on rice images, and a link to a GitHub repository containing a copy of the Faster R-CNN code, are made available to the research community. The labeled images used for training and evaluating the Faster R-CNN models are also available from the GitHub repository.
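As a rough illustration of how detected bounding boxes can be turned into the anatomical measurements described above, the sketch below post-processes a list of Faster R-CNN detections for a single cross-section image. It is not the authors' implementation: the class names, the confidence threshold, the pixels-per-millimetre scale and the choice to approximate an object's diameter as the mean of its box width and height are all assumptions made for this example.

```python
# Illustrative sketch (not the paper's code): converting object-detection
# output for one root cross-section image into anatomical measurements.
from statistics import mean

PIXELS_PER_MM = 250.0   # hypothetical image scale; calibrate per dataset
SCORE_THRESHOLD = 0.5   # hypothetical confidence cut-off

def box_diameter_px(box):
    """Approximate an object's diameter as the mean of its bounding-box
    width and height; boxes are (x_min, y_min, x_max, y_max) in pixels."""
    x_min, y_min, x_max, y_max = box
    return ((x_max - x_min) + (y_max - y_min)) / 2.0

def estimate_root_anatomy(detections):
    """detections: list of dicts with 'label', 'score' and 'box' keys,
    e.g. the post-processed output of a Faster R-CNN model."""
    kept = [d for d in detections if d["score"] >= SCORE_THRESHOLD]
    of_class = lambda name: [d for d in kept if d["label"] == name]

    # Keep the highest-scoring root and stele box; keep all late metaxylem boxes.
    root = max(of_class("root"), key=lambda d: d["score"], default=None)
    stele = max(of_class("stele"), key=lambda d: d["score"], default=None)
    lmx = of_class("late_metaxylem")

    to_mm = lambda px: px / PIXELS_PER_MM
    return {
        "root_diameter_mm": to_mm(box_diameter_px(root["box"])) if root else None,
        "stele_diameter_mm": to_mm(box_diameter_px(stele["box"])) if stele else None,
        "late_metaxylem_count": len(lmx),
        "avg_late_metaxylem_diameter_mm":
            to_mm(mean(box_diameter_px(d["box"]) for d in lmx)) if lmx else None,
    }

if __name__ == "__main__":
    # Made-up detections for demonstration only.
    fake_detections = [
        {"label": "root", "score": 0.98, "box": (10, 12, 510, 508)},
        {"label": "stele", "score": 0.95, "box": (180, 185, 330, 332)},
        {"label": "late_metaxylem", "score": 0.91, "box": (230, 240, 260, 268)},
        {"label": "late_metaxylem", "score": 0.88, "box": (265, 238, 293, 266)},
    ]
    print(estimate_root_anatomy(fake_detections))
```

Averaging the box width and height is used here only because the root, stele and late metaxylem are roughly circular in cross-section; the paper's exact estimation rules may differ.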
