Field-scale yield prediction of winter wheat under different irrigation regimes based on dynamic fusion of multimodal UAV imagery

Document type: Foreign journal article

First author: Ma, Juncheng

Authors: Ma, Juncheng; Ji, Lin; Zhu, Zhicheng; Wu, Yongfeng; Liu, Binhui; Jiao, Weihua

Author affiliations:

Keywords: Yield prediction; Winter wheat; Multimodal UAV imagery; Dynamic fusion; Deep learning

Journal: INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION (impact factor: 7.5; 5-year impact factor: 7.2)

ISSN: 1569-8432

Year/Volume: 2023, Vol. 118

Pages:

Indexed in: SCI

Abstract: Field-scale crop yield prediction is critical to site-specific field management and has been facilitated by recent studies fusing unmanned aerial vehicle (UAV) based multimodal data. However, these studies equivalently stacked multimodal data and underused canopy spatial information. In this study, a multimodal imagery fusion (MIF) attention mechanism was proposed to dynamically fuse UAV-based RGB, hyperspectral near-infrared (HNIR), and thermal imagery. Based on the MIF attention, a novel model termed MultimodalNet was proposed for field-scale yield prediction of winter wheat. To compare multimodal imagery-based and multimodal feature-based methods, a stacking-based ensemble learning model was built using UAV-based canopy spectral, thermal, and texture features. The results showed that MultimodalNet achieved accurate predictions at the reproductive stage and outperformed any single modality in the fusion. MultimodalNet performed best at the flowering stage, with a coefficient of determination of 0.7411 and a mean absolute percentage error of 6.05%. The HNIR and thermal imagery were essential for yield prediction of winter wheat at the reproductive stage. Compared with equivalent stacking fusion, dynamic fusion through adaptively adjusted modality attention improved model accuracy and adaptability across winter wheat cultivars and water treatments. Equivalently stacking more modalities did not necessarily perform better than dynamically fusing fewer modalities. Methods using multimodal UAV imagery with rich spatial information were more applicable to field-scale yield prediction than methods using multimodal features. This study indicates that MultimodalNet is a powerful tool for field-scale yield prediction of winter wheat.
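The abstract describes the MIF attention only at a high level. The following is a minimal, hypothetical sketch of attention-weighted fusion of RGB, HNIR, and thermal feature maps, written in PyTorch for illustration; the class name, pooling/MLP design, and layer sizes are assumptions and do not reproduce the authors' MultimodalNet.

    import torch
    import torch.nn as nn

    class ModalityAttentionFusion(nn.Module):
        # Illustrative only: pool each modality's feature map, score all modalities
        # jointly with a small MLP, then fuse the maps with softmax-normalized,
        # input-dependent weights (one scalar weight per modality per sample).
        def __init__(self, channels: int, n_modalities: int = 3):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)
            self.score = nn.Sequential(
                nn.Linear(channels * n_modalities, channels),
                nn.ReLU(inplace=True),
                nn.Linear(channels, n_modalities),
            )

        def forward(self, feats):
            # feats: list of n_modalities tensors, each of shape (B, C, H, W)
            pooled = torch.cat([self.pool(f).flatten(1) for f in feats], dim=1)  # (B, M*C)
            weights = torch.softmax(self.score(pooled), dim=1)                   # (B, M)
            stacked = torch.stack(feats, dim=1)                                  # (B, M, C, H, W)
            fused = (weights[:, :, None, None, None] * stacked).sum(dim=1)       # (B, C, H, W)
            return fused, weights

    # Example: fuse hypothetical RGB, HNIR, and thermal feature maps of shape (B, 64, 32, 32).
    fusion = ModalityAttentionFusion(channels=64, n_modalities=3)
    rgb, hnir, thermal = (torch.randn(2, 64, 32, 32) for _ in range(3))
    fused_map, modality_weights = fusion([rgb, hnir, thermal])

In contrast to stacking the modalities with fixed, equal weights, the learned weights here vary per sample, which mirrors the abstract's distinction between equivalent stacking and dynamic fusion.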

Classification:
