Regions with excessive cloud cover limit the feasibility of using optical images to monitor crop growth. In this article, we built an upsampling moving window network for regional crop growth monitoring (UMRCGM) to estimate two key biophysical parameters (BPs), leaf area index (LAI) and canopy chlorophyll content (CCC), during the main growth period of winter wheat using Sentinel-1 synthetic aperture radar (SAR) and Sentinel-3 optical images. Sentinel-1 imagery is unaffected by cloudy weather, and Sentinel-3 imagery has a wide swath and a short revisit period; combining the two can greatly improve the ability to monitor crop growth at a regional scale. We further analyzed the impact of two types of SAR information (intensity and polarization) on the estimation of the two BPs. The UMRCGM model optimizes the correspondence between inputs and outputs. It produced more accurate LAI and CCC estimates than three classical machine learning models, with the highest accuracy at the green-up stage of winter wheat, followed by the jointing and heading-filling stages, and the lowest accuracy at the milk maturity stage. The estimation accuracy of CCC was slightly higher than that of LAI for the first three growth stages, but lower for the milk maturity stage. This article proposes a new method for regional estimation of BPs (especially CCC) by combining SAR and optical imagery with large differences in spatial resolution under a deep learning framework.
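The abstract does not specify how the UMRCGM model pairs its inputs and outputs, but the core idea of matching each coarse optical pixel to a co-located window of fine SAR pixels can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the 30x fine-to-coarse ratio (Sentinel-3 at roughly 300 m versus Sentinel-1 resampled to roughly 10 m), the function name, and the toy array sizes are all assumptions.

```python
import numpy as np

# Assumed resolutions for illustration only: one coarse (Sentinel-3-like)
# pixel covers a 30x30 window of fine (Sentinel-1-like) pixels.
SCALE = 30

def fine_windows_for_coarse(fine_img, scale=SCALE):
    """Tile the fine-resolution SAR image into non-overlapping windows so
    each coarse optical pixel is paired with its co-located SAR patch.

    fine_img: 2-D array of shape (H*scale, W*scale)
    returns:  4-D array of shape (H, W, scale, scale), where [i, j] is
              the fine window under coarse pixel (i, j)
    """
    h, w = fine_img.shape
    H, W = h // scale, w // scale
    return (fine_img[:H * scale, :W * scale]
            .reshape(H, scale, W, scale)
            .transpose(0, 2, 1, 3))

# Toy example: a 2x2 "coarse" scene over a 60x60 "fine" SAR scene.
sar = np.arange(60 * 60, dtype=float).reshape(60, 60)
windows = fine_windows_for_coarse(sar)
print(windows.shape)  # (2, 2, 30, 30)
```

In a model like the one described, each such window (possibly with both intensity and polarization channels stacked) would form one training sample whose target is the BP value at the corresponding coarse pixel.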