4.7 Article

Modeling of winter wheat fAPAR by integrating Unmanned Aircraft Vehicle-based optical, structural and thermal measurement

Publisher

ELSEVIER
DOI: 10.1016/j.jag.2021.102407

Keywords

Winter wheat; fAPAR; Saturation problem; Multi-source remote sensing fusion; Machine learning; Smart agriculture; UAV

Funding

  1. Fundamental Research Funds for the Central Universities [2662019PY057, 2662017QD038]
  2. HZAU research startup fund [11041810340, 11041810341]
  3. National Natural Science Foundation of China [31601251]

Abstract

The study investigates the effectiveness of optical, structural, and thermal features in modeling winter wheat fAPAR using multi-source remote sensing data fusion. Results show that multi-source data fusion integrating optical, structural, and thermal features provides the best model for winter wheat fAPAR mapping, with RGB imagery-derived optical and structural features playing important roles in modeling fAPAR. The thermal feature alone performs the worst among all experiments, but it can complement other types of remote sensing features within the framework of data fusion to improve modeling accuracy.
The fraction of absorbed photosynthetically active radiation (fAPAR) is a critical biophysical parameter for crop growth monitoring and yield estimation. Remote sensing provides an efficient way to measure fAPAR over large areas, compared with time-consuming and labor-intensive field measurements. However, optical remote sensing signals usually saturate over dense vegetation (e.g., Leaf Area Index (LAI) > 5 or fAPAR > 0.7), limiting the performance of optical remote sensing in modeling fAPAR. Multi-source remote sensing data fusion has proven to be a feasible method to overcome the saturation problem of optical remote sensing in vegetation monitoring, but little is known about the performance of fusing optical, structural, and thermal features for modeling winter wheat fAPAR. Also, the modeling powers of optical, structural, and thermal features for fAPAR estimation have seldom been compared. To fill these knowledge gaps, very high spatial resolution RGB optical and thermal imagery collected by an Unmanned Aircraft Vehicle (UAV) was used in this study to quantify the powers of RGB-derived vegetation indices (VIs), Structural Indices (SIs, crop height/canopy cover), and Canopy Temperature (CT), and their combinations, in modeling winter wheat fAPAR. The modeling powers of the different remote sensing features were compared with the commonly used hyperspectral vegetation indices (HVIs) from field spectrometer measurements. Results showed that (1) multi-source data fusion integrating optical, structural, and thermal features provided the best model for winter wheat fAPAR mapping (R2 = 0.907 and RMSE = 0.041); (2) the RGB imagery-derived optical features (i.e., RGB VIs) and structural features (i.e., RGB SIs) were important predictors for winter wheat fAPAR modeling, and their combination steadily improved the modeling accuracy (~2% improvement in R2 compared to the optical-only model); (3) the thermal feature alone performed the worst among all experiments, but it can still complement other types of remote sensing features (i.e., RGB VIs & SIs) and further improve the modeling accuracy within the framework of data fusion (~3% improvement in R2 compared to the optical-only model). In general, this study indicates that the framework of multi-source remote sensing data fusion can provide more accurate and efficient measurements of winter wheat fAPAR for crop management in precision agriculture, which can help improve resource utilization efficiency (e.g., determining where and when to apply nitrogen fertilizer) and ensure food security in the face of climate change.
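As a rough illustration of the data fusion framework described in the abstract, the sketch below trains a generic machine-learning regressor on a combined feature set of RGB vegetation indices, structural indices (crop height, canopy cover), and canopy temperature to predict fAPAR, and evaluates it with R2 and RMSE. This is a minimal sketch, not the authors' implementation: the file name uav_features.csv, the column names, and the choice of RandomForestRegressor are assumptions made here for demonstration only.

```python
# Minimal sketch of multi-source feature fusion for fAPAR modeling.
# NOTE: file name, column names, and the regressor choice are assumptions,
# not taken from the paper.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical per-plot feature table: RGB VIs, SIs, CT, and field-measured fAPAR.
df = pd.read_csv("uav_features.csv")

X = df[["exg", "ngrdi", "crop_height", "canopy_cover", "canopy_temp"]]  # fused features
y = df["fapar"]                                                          # field reference

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Generic machine-learning regressor standing in for the fusion model.
model = RandomForestRegressor(n_estimators=500, random_state=42)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"R2   = {r2_score(y_test, pred):.3f}")
print(f"RMSE = {np.sqrt(mean_squared_error(y_test, pred)):.3f}")
```

Dropping or adding columns in the feature list (e.g., optical-only vs. optical + structural + thermal) reproduces the kind of feature-combination comparison the study reports.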
