Estimation of maize leaf area index using UAV multimodal data and deep neural network

Authors: Shuaibing Liu*,
Topics: Remote Sensing, Agricultural Geography, UAS / UAV
Keywords: UAV, Maize, LAI, Stepwise regression analysis, Deep neural network
Session Type: Virtual Paper
Day: 4/7/2021
Start / End Time: 8:00 AM / 10:20 AM
Room: Virtual 29
Presentation File: No File Uploaded

The leaf area index (LAI) is an important crop phenotypic parameter and is essential for evaluating crop growth. Although LAI estimation models have been designed to use multi-source data from unmanned aerial vehicles (UAVs), few groups have studied the use of multimodal data to estimate the LAI of maize canopies. The goals of this research are thus to (i) determine how multimodal data combining RGB, multispectral, and thermal infrared imagery contribute to LAI estimation and propose a framework for estimating the LAI, (ii) evaluate the robustness and adaptability of an LAI estimation model that uses multimodal data and deep neural networks (DNNs) in single and whole growth stages, and (iii) explore how the soil background and maize tasseling affect the LAI estimation model. RGB, multispectral, and thermal infrared images were collected by drones to construct multimodal data sets. Next, a deep learning model was developed to estimate the LAI of maize. The DNN model provides the best estimate for a single growth period (R2 = 0.89, rRMSE = 12.92%), and the PLSR model provides the best estimate for the entire growth period (R2 = 0.70, rRMSE = 12.78%). Tassels in the maize canopy reduce the accuracy of the LAI estimation model, whereas the soil background provides additional image feature information that improves the estimation accuracy. These results indicate that multimodal UAV data within a DNN framework can produce accurate and reliable LAI estimates of crops, which is valuable for high-throughput phenotyping and for developing high-spatial-precision farmland management.
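The abstract does not specify the network architecture. As a minimal sketch of the kind of DNN regression it describes, the example below trains a one-hidden-layer network to map a multimodal feature vector to LAI. The three input features (an RGB-derived index, a multispectral NDVI, and a thermal canopy temperature proxy), the synthetic LAI function, and all hyperparameters are hypothetical illustrations, not the authors' actual data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multimodal features per plot: RGB index, NDVI, canopy temperature
# proxy (all synthetic, scaled to [0, 1]).
n = 200
X = rng.uniform(0.0, 1.0, size=(n, 3))
# Synthetic "true" LAI: a smooth nonlinear function of the features plus noise.
y = 2.5 * X[:, 0] + 1.5 * np.sin(3 * X[:, 1]) - 1.0 * X[:, 2] + rng.normal(0, 0.05, n)

# One-hidden-layer MLP regressor trained with full-batch gradient descent.
h = 16
W1 = rng.normal(0, 0.5, (3, h)); b1 = np.zeros(h)
W2 = rng.normal(0, 0.5, (h, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    Z = np.tanh(X @ W1 + b1)          # hidden activations
    pred = (Z @ W2 + b2).ravel()      # predicted LAI
    err = pred - y
    # Backpropagate the mean-squared-error loss.
    gW2 = Z.T @ err[:, None] / n
    gb2 = err.mean(keepdims=True)
    dZ = (err[:, None] @ W2.T) * (1 - Z**2)
    gW1 = X.T @ dZ / n
    gb1 = dZ.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Report the same accuracy metrics used in the abstract: R2 and relative RMSE.
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rrmse = np.sqrt(np.mean((y - pred) ** 2)) / y.mean() * 100
print(f"R2={r2:.2f}, rRMSE={rrmse:.1f}%")
```

In practice each modality would contribute many image features (vegetation indices, texture, temperature statistics) concatenated into one input vector, which is what "multimodal" fusion means here; the single-hidden-layer network stands in for the deeper architecture the study used.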
