Crop aboveground biomass (AGB) is one of the most important indicators in crop breeding and crop management, and can be used for crop yield prediction. A number of vegetation indices (VIs) have been proposed to estimate crop biomass, but they perform poorly at high biomass levels and are easily affected by background materials. Texture analysis has proven to be an efficient approach in forest biomass estimation, but has never been applied to crops with low-altitude unmanned aerial vehicle (UAV) images. The objective of this study was to improve rice AGB estimation by combining textural and spectral analysis of UAV imagery. A two-year rice experiment was conducted in 2015 and 2016, involving different nitrogen (N) rates, planting densities and rice cultivars with three replicates. A six-band multispectral (MS) camera was mounted on a UAV to acquire rice canopy images at critical stages during the rice growing seasons, and concurrent field samplings were taken. Simple regression and stepwise multiple linear regression models were developed between biomass data from the two-year experiment and image parameters derived from four different types of feature sets. These features represented commonly used VIs, texture parameters, normalized texture measurements (normalized difference texture index, NDTI) and combinations of VIs and NDTIs. Finally, all the regression models were evaluated by cross-validation over the pooled data with the coefficient of determination (R2) and the root mean square error (RMSE). Results demonstrated that the optimized soil adjusted vegetation index (OSAVI) exhibited the best relationship with AGB for the whole season (R2 = 0.63) and post-heading stages (R2 = 0.65). Red-edge-based indices yielded the best performance (R2 > 0.70) only for the growth stages before heading. The texture measurement mean (MEA) from the NIR band was the best among the eight candidates in AGB estimation. The texture index NDTI(MEA800, MEA550) was superior to all the evaluated VIs in estimating AGB for the whole season (R2 = 0.75) and pre-heading stages (R2 = 0.84). Further improvement was obtained across the whole season by combining NDTIs and VIs through multiple linear regression. This multivariate model produced the highest estimation accuracy for all stages (R2 = 0.78, RMSE = 1.84 t ha−1) and the separate stage groups (R2 = 0.84, RMSE = 1.06 t ha−1 for pre-heading stages; R2 = 0.65, RMSE = 1.94 t ha−1 for post-heading stages). The findings imply that the integration of textural information with spectral information significantly improves the accuracy of rice biomass estimation compared to the use of spectral information alone.
Plant nitrogen concentration (PNC) is a critical indicator of N status for crops, and can be used for N nutrition diagnosis and management. This work aims to explore the potential of multispectral imagery from unmanned aerial vehicles (UAVs) for PNC estimation and to improve the estimation accuracy with hyperspectral data collected in the field with a hyperspectral radiometer. In this study we combined selected vegetation indices (VIs) and texture information to estimate PNC in rice. The VIs were calculated from ground and aerial platforms and the texture information was obtained from UAV-based multispectral imagery. Two consecutive years (2015 and 2016) of experiments were conducted, involving different N rates, planting densities and rice cultivars. Both UAV flights and ground spectral measurements were taken along with destructive samplings at critical growth stages of rice (Oryza sativa L.). After UAV imagery preprocessing, both VIs and texture measurements were calculated. Then the optimal normalized difference texture index (NDTI) from UAV imagery was determined for separate stage groups and the entire season. Results demonstrated that aerial VIs performed well only for pre-heading stages (R2 = 0.52–0.70), while the photochemical reflectance index and blue N index from the ground (PRI and BNI) performed consistently well across all growth stages (R2 = 0.48–0.65 and 0.39–0.68, respectively). Most texture measurements were weakly related to PNC, but the optimal NDTIs could explain 61% and 51% of the variability in PNC for separate stage groups and the entire season, respectively. Moreover, stepwise multiple linear regression (SMLR) models combining aerial VIs and NDTIs did not significantly improve the accuracy of PNC estimation, while models composed of BNI and optimal NDTIs exhibited significant improvement for PNC estimation across all growth stages. Therefore, the integration of ground-based narrow-band spectral indices with UAV-based textural information might be a promising technique in crop growth monitoring.
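Of the ground-based narrow-band indices mentioned above, the photochemical reflectance index has a widely used published form, PRI = (R531 − R570)/(R531 + R570). A minimal sketch of computing it from a measured spectrum follows; the nearest-band lookup and the 531/570 nm choice are common conventions, not necessarily the exact bands used in this study, and BNI is omitted because its band definition is study-specific.

```python
import numpy as np

def band(reflectance, wavelengths, target_nm):
    """Reflectance at the measured wavelength closest to target_nm."""
    idx = int(np.argmin(np.abs(np.asarray(wavelengths, float) - target_nm)))
    return float(np.asarray(reflectance, float)[idx])

def pri(reflectance, wavelengths):
    """Photochemical reflectance index in its common 531/570 nm form."""
    r531 = band(reflectance, wavelengths, 531)
    r570 = band(reflectance, wavelengths, 570)
    return (r531 - r570) / (r531 + r570)
```

Any other narrow-band normalized-difference index can be built the same way by swapping the two target wavelengths.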
Unmanned aerial system (UAS)-based remote sensing is one promising technique for precision crop management, but few studies have reported the applications of such systems on nitrogen (N) estimation with multiple sensors in rice (Oryza sativa L.). This study aims to evaluate three sensors (RGB, color-infrared (CIR) and multispectral (MS) cameras) onboard UAS for the estimation of N status at individual stages and their combination, with the field data collected from a two-year rice experiment. The experiments were conducted in 2015 and 2016, involving different N rates, planting densities and rice cultivars, with three replicates. An Oktokopter UAS was used to acquire aerial photography at early growth stages (from tillering to booting) and field samplings were taken on nearby dates. Two color indices (normalized excess green index (NExG) and normalized green red difference index (NGRDI)), two near-infrared vegetation indices (green normalized difference vegetation index (GNDVI) and enhanced NDVI (ENDVI)) and two red edge vegetation indices (red edge chlorophyll index (CIred edge) and DATT) were used to evaluate the capability of these three sensors in estimating leaf nitrogen accumulation (LNA) and plant nitrogen accumulation (PNA) in rice. The results demonstrated that the red edge vegetation indices derived from MS images produced the highest estimation accuracy for LNA (R2: 0.79–0.81, root mean squared error (RMSE): 1.43–1.45 g m−2) and PNA (R2: 0.81–0.84, RMSE: 2.27–2.38 g m−2). The GNDVI from CIR images yielded a moderate estimation accuracy with an all-stage model. Color indices from RGB images exhibited satisfactory performance for the pooled dataset of the tillering and jointing stages. Compared with the counterpart indices from the RGB and CIR images, the indices from the MS images performed better in most cases.
These results may lay a solid foundation for the development of UAS-based rice growth monitoring systems, providing useful information for real-time decision-making in crop N management.
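Several of the indices evaluated above have simple, widely published band-ratio forms; the sketch below implements three of them (NGRDI, GNDVI, and the red edge chlorophyll index) from per-band reflectance arrays. The function names are illustrative; NExG, ENDVI and DATT are omitted because their exact formulations vary between sources.

```python
import numpy as np

def ngrdi(green, red):
    """Normalized green red difference index (RGB sensor)."""
    green, red = np.asarray(green, float), np.asarray(red, float)
    return (green - red) / (green + red)

def gndvi(nir, green):
    """Green normalized difference vegetation index (CIR sensor)."""
    nir, green = np.asarray(nir, float), np.asarray(green, float)
    return (nir - green) / (nir + green)

def ci_red_edge(nir, red_edge):
    """Red edge chlorophyll index (MS sensor): NIR / red-edge - 1."""
    nir, red_edge = np.asarray(nir, float), np.asarray(red_edge, float)
    return nir / red_edge - 1.0
```

Each function accepts scalars or whole image bands, so per-plot index maps can be computed directly from co-registered band rasters.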
Leaf area index (LAI) is a fundamental indicator of plant growth status in agronomic and environmental studies. Due to rapid advances in unmanned aerial vehicle (UAV) and sensor technologies, UAV-based remote sensing is emerging as a promising solution for monitoring crop LAI with great flexibility and applicability. This study aimed to determine the feasibility of combining color and texture information derived from UAV-based digital images for estimating the LAI of rice (Oryza sativa L.). Rice field trials were conducted at two sites using different nitrogen application rates, varieties, and transplanting methods from 2016 to 2017. Digital images were collected using a consumer-grade UAV after sampling at the key growth stages of tillering, stem elongation, panicle initiation and booting. Vegetation color indices (CIs) and grey level co-occurrence matrix-based textures were extracted from mosaicked UAV ortho-images for each plot. To construct indices from two different textures, normalized difference texture indices (NDTIs) were calculated from pairs of randomly selected textures. The relationships between rice LAI and each calculated index were then compared using simple linear regression. Multivariate regression models with different input sets were further used to test the potential of combining CIs with various textures for rice LAI estimation. The results revealed that the visible atmospherically resistant index (VARI) based on three visible bands and the NDTI based on the mean textures derived from the red and green bands were the best for LAI retrieval in the CI and NDTI groups, respectively. Independent accuracy assessment showed that random forest (RF) exhibited the best predictive performance when combining CI and texture inputs (R2 = 0.84, RMSE = 0.87, MAE = 0.69). This study introduces a promising approach that combines color indices and textures from UAV-based digital imagery for rice LAI estimation.
Future studies should focus on identifying the best operation mode, suitable ground resolution, and optimal predictive methods for practical applications.
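The multivariate step reported above (random forest regression on combined color-index and texture inputs) can be sketched with scikit-learn. The per-plot feature table and LAI values below are synthetic stand-ins, not the study's data, and the two features (a VARI-like color index and an NDTI) are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic per-plot features: one color index and one texture index.
rng = np.random.default_rng(0)
n = 200
vari = rng.uniform(-0.2, 0.6, n)
ndti = rng.uniform(-0.3, 0.3, n)
lai = 4.0 * vari + 3.0 * ndti + 2.0 + rng.normal(0, 0.1, n)

X = np.column_stack([vari, ndti])
X_tr, X_te, y_tr, y_te = train_test_split(X, lai, random_state=0)

# Fit RF on the training plots and score on held-out plots.
rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X_tr, y_tr)
r2 = r2_score(y_te, rf.predict(X_te))
```

An independent accuracy assessment, as in the study, would instead hold out whole site-years rather than random plots.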
Unmanned aerial vehicle (UAV)-based remote sensing (RS) possesses the significant advantage of being able to efficiently collect images for precision agricultural applications. Although numerous methods have been proposed to monitor crop nitrogen (N) status in recent decades, how to select an appropriate modeling algorithm to estimate crop leaf N content (LNC) remains poorly understood, especially for UAV multispectral imagery. A comparative assessment of different modeling algorithms (i.e., simple and non-parametric modeling algorithms alongside the physical model retrieval method) for winter wheat LNC estimation is presented in this study. Experiments were conducted over two consecutive years and involved different winter wheat varieties, N rates, and planting densities. A five-band multispectral camera (490 nm, 550 nm, 671 nm, 700 nm, and 800 nm) was mounted on a UAV to acquire canopy images across five critical growth stages. The results of this study showed that the best-performing vegetation index (VI) was the modified renormalized difference VI (RDVI), which had a coefficient of determination (R2) of 0.73 and a root mean square error (RMSE) of 0.38. This method was also characterized by a high processing speed (0.03 s) for model calibration and validation. Among the 13 non-parametric modeling algorithms evaluated here, the random forest (RF) approach performed best, characterized by R2 and RMSE values of 0.79 and 0.33, respectively. This method also had the advantage of full optical spectrum utilization and enabled flexible, non-linear fitting with a fast processing speed (2.3 s). Compared to the other two methods assessed here, the use of a look-up table (LUT)-based radiative transfer model (RTM) remained challenging with regard to LNC estimation because of its low prediction accuracy (R2 = 0.62, RMSE = 0.46) and slow processing speed.
The RF approach is a fast and accurate technique for N estimation based on UAV multispectral imagery.
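The LUT-based retrieval scheme compared above can be illustrated in miniature: simulate spectra for a grid of candidate LNC values with a forward model, then invert an observation by picking the best-matching LUT entry. The toy forward model below is a stand-in for a real radiative transfer model such as PROSAIL, and all parameter values are illustrative, not from the study.

```python
import numpy as np

def forward_model(lnc, wavebands):
    """Toy forward model: reflectance near the red band dips as LNC
    increases (illustrative only; a real study would use an RTM)."""
    wavebands = np.asarray(wavebands, float)
    return 0.5 - 0.05 * lnc * np.exp(-((wavebands - 671) / 120.0) ** 2)

def lut_invert(observed, lut_spectra, lut_params):
    """Return the parameter whose LUT spectrum has minimum RMSE
    against the observed spectrum."""
    errors = np.sqrt(np.mean((lut_spectra - observed) ** 2, axis=1))
    return float(lut_params[int(np.argmin(errors))])

bands = [490, 550, 671, 700, 800]          # the camera's five bands
lut_params = np.linspace(0.5, 5.0, 200)    # candidate LNC values
lut_spectra = np.stack([forward_model(p, bands) for p in lut_params])

observed = forward_model(3.0, bands)       # noiseless test observation
estimate = lut_invert(observed, lut_spectra, lut_params)
```

The slow processing speed reported for the LUT method comes from needing a dense parameter grid and a per-pixel search; the cost function and regularization choices also strongly affect accuracy.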
•We automatically generated representative training samples of winter wheat.
•We proposed a novel method named AGTOC for winter wheat mapping on GEE.
•Winter wheat map was accurately produced by AGTOC with an OA of 92.91%.
•Winter wheat map could be updated annually without the need for new training samples.
Accurate and timely acquisition of crop spatial distribution is a prerequisite for growth monitoring and yield forecasting. Currently, the automatic acquisition of crop distribution at large scales is still a challenge due to the time-consuming processing of remotely sensed imagery and the manual collection of sufficient training samples. Although the advent of cloud computing platforms has been shown to improve the efficiency and automation of crop type classification, how to obtain sufficient training samples in an efficient and cost-effective way remains unclear. In this research, we developed a new approach integrating the automatic generation of training samples and one-class machine learning classification (AGTOC) for mapping winter wheat over Jiangsu Province, China on Google Earth Engine (GEE). After extracting spatial objects from Sentinel-2 imagery for the 2017–2018 season, this method performed recognition of winter wheat objects based on the unique phenology and spectral features of winter wheat. The generated winter wheat objects were then further refined and used as training samples for provincial winter wheat classification with a one-class support vector machine (OCSVM). Furthermore, the transferability of AGTOC was evaluated by applying the classification approach to different seasons (2016–2017 and 2019–2020) and a different sensor (Landsat-8 OLI). According to independent ground truth data, the winter wheat mapping with AGTOC achieved an overall accuracy (OA) of 92.61%. When compared with agricultural census data, the winter wheat area estimates explained 99% and 90% of the variability at the municipal and county levels, respectively. Furthermore, the OA reached 88.94% and 90.17% when transferring AGTOC from 2017–2018 to 2016–2017 and 2019–2020, respectively. Transferring the AGTOC model to Landsat-8 OLI imagery of the same season yielded an OA of 85.98%.
These results demonstrated that AGTOC exhibited high efficiency and accuracy across the province, different seasons and sensors without the need for extensive field visits for training sample collection. The proposed approach has great potential for the automatic mapping of winter wheat on GEE at national or global levels.
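The one-class classification step can be sketched with scikit-learn's OneClassSVM: the model is trained only on the automatically generated winter wheat samples and then labels every object as target (+1) or outlier (−1). The two synthetic "phenology metric" features below are illustrative assumptions, not the study's Sentinel-2 features.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Synthetic training set: feature vectors for auto-generated wheat objects.
rng = np.random.default_rng(1)
wheat_train = rng.normal(loc=[0.7, 0.4], scale=0.05, size=(300, 2))

# nu bounds the fraction of training samples treated as outliers.
ocsvm = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
ocsvm.fit(wheat_train)

# predict() returns +1 for wheat-like objects and -1 for everything else.
wheat_like = np.array([[0.7, 0.4]])
other = np.array([[0.1, 0.9]])
```

One-class learning fits this workflow because only positive (winter wheat) samples are generated automatically; no labeled non-wheat samples are required.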
•Boundary delineation methods are scarce for crop fields with area <1 ha each.
•We developed DESTIN for delineating field boundaries from high resolution imagery.
•Temporal features were derived from Planet separately at soil preparation & harvest.
•Crop field objects were recognized by DESTIN at overall accuracies from 95% to 99%.
•DESTIN yielded better matches of field boundaries with reference than ENVI FX.
The digital boundaries of crop fields represent a prerequisite for designing parcel-based crop management platforms, implementing online site-specific agronomic practices and monitoring crop growth per field. Previous approaches to field boundary delineation were mostly developed with medium resolution imagery (e.g., Landsat) for regions or countries with intensive agriculture and large crop fields. However, suitable delineation methods are scarce for regions in developing countries where the majority of arable land is cultivated by smallholder farmers and distributed in small and fragmented crop fields. This study proposed a comprehensive method, delineation by fusing spatial and temporal information (DESTIN), to derive the boundaries of crop fields from sub-meter WorldView-2/3 and 3-m Planet imagery. After the extraction of spatial objects from very high resolution (VHR) WorldView imagery, this method performed recognition of crop field objects using high resolution (HR) Planet-derived temporal features specifically concerning the soil preparation and harvesting stages of summer crops. The performance of DESTIN in crop field boundary delineation was evaluated with reference polygons (0.4–1.0 ha in area on average) over four subset areas in eastern China’s Jiangsu province, and further compared with a benchmark object extraction approach.
The results demonstrated that the integration of WorldView and Planet imagery as demanded by DESTIN yielded accurate recognition of crop fields with a classification overall accuracy (OA) ranging from 94.98% to 98.84%, remarkably improved over the use of WorldView or Planet imagery alone with increases in OA from 12% to 17%. The majority of crop field boundaries were successfully delineated with both methods, but DESTIN produced cleaner polygons than the benchmark approach and closer matches of field boundaries to the reference. DESTIN also yielded better one-to-one matches between delineations and reference (77% as opposed to 54%) and fewer one-to-many matches (1% as opposed to 33%), reflecting that it was less prone to over-segmentation. The DESTIN method does not need subjective parameterization for image segmentation, and could be applicable to areas where bi-temporal VHR imagery over the soil preparation and harvesting stages and HR imagery over the peak growth stage of summer crops are available. It has great potential for delineating crop field boundaries in smallholder farming systems with VHR imagery acquired from satellite, airborne or unmanned aerial vehicle platforms.
Timely monitoring of the nitrogen status of rice crops with remote sensing can help optimize nitrogen fertilizer management and reduce environmental pollution. Recently, near-surface imaging spectroscopy has emerged as a promising technology that can collect hyperspectral images with spatial resolutions ranging from millimeters to decimeters. The spatial resolution is crucial for the efficiency of image sampling across rice plants and for the separation of leaf signals from the background. However, the optimal spatial resolution of such images for monitoring the leaf nitrogen concentration (LNC) in rice crops remains unclear. To assess the impact of spatial resolution on the estimation of rice LNC, we collected ground-based hyperspectral images throughout the entire growing season over two consecutive years and generated ten sets of images with spatial resolutions ranging from 1.3 to 450 mm. These images were used to determine the sensitivity of LNC prediction to spatial resolution with three groups of vegetation indices (VIs) and two multivariate methods, Gaussian process regression (GPR) and partial least squares regression (PLSR). The reflectance spectra of sunlit-, shaded-, and all-leaf pixels separated from background pixels at each spatial resolution were used to predict LNC with the VIs, GPR and PLSR, respectively. The results demonstrated that all-leaf pixels generally exhibited more stable performance than sunlit- and shaded-leaf pixels regardless of the estimation approach. The predictions of LNC required stage-specific LNC~VI models for each vegetative stage but could be performed with a single model for all the reproductive stages. Specifically, most VIs achieved stable performance at all resolutions finer than 14 mm for the early tillering stage but at all resolutions finer than 56 mm for the other stages.
In contrast, global models for the prediction of LNC across the entire growing season were successfully established with GPR or PLSR. In particular, GPR generally exhibited the best prediction of LNC, with the optimal spatial resolution found at 28 mm. These findings represent significant advances in the application of ground-based imaging spectroscopy as a promising approach to crop monitoring and in understanding the effects of spatial resolution on the estimation of rice LNC.
This paper evaluates the potential of integrating textural and spectral information from unmanned aerial vehicle (UAV)-based multispectral imagery for improving the quantification of nitrogen (N) status in rice crops. Vegetation indices (VIs), normalized difference texture indices (NDTIs), and their combination were used to estimate four N nutrition parameters: leaf nitrogen concentration (LNC), leaf nitrogen accumulation (LNA), plant nitrogen concentration (PNC), and plant nitrogen accumulation (PNA). Results demonstrated that the normalized difference red-edge index (NDRE) performed best in estimating the N nutrition parameters among all the VI candidates. The optimal texture indices showed performance comparable to NDRE in estimating the N nutrition parameters. Significant improvement for all N nutrition parameters could be obtained by integrating VIs with NDTIs using multiple linear regression. When tested across years and growth stages, the multivariate models also exhibited satisfactory estimation accuracy. For texture analysis, texture metrics calculated in the direction D3 (perpendicular to the row orientation) are recommended for monitoring row-planted crops. These findings indicate that the addition of textural information derived from UAV multispectral imagery could reduce the effects of background materials and saturation and enhance the N signals of rice canopies for the entire season.
•We coupled TLS with LME modeling to estimate organ and aboveground biomass of rice.
•LME generally outperformed SMLR and RF for biomass variables over independent data.
•LME modeling produced a remarkable improvement for the estimation of panicle biomass.
•TLS and LME modeling can overcome the optical saturation at post-heading stages.
•TLS is useful for predicting crop grain yield accurately from heading to harvest.
Non-destructive and accurate estimation of crop biomass is crucial for the quantitative diagnosis of growth status and timely prediction of grain yield. As an active remote sensing technique, terrestrial laser scanning (TLS) has become increasingly available in crop monitoring for its advantages in recording structural properties. Some researchers have attempted to use TLS data in the estimation of crop aboveground biomass, but only for part of the growing season. Previous studies rarely investigated the estimation of biomass for individual organs, such as the panicles in rice canopies, which led to the poor understanding of TLS technology in monitoring biomass partitioning among organs. The objective of this study was to investigate the potential of TLS in estimating the biomass for individual organs and aboveground biomass of rice and to examine the feasibility of developing universal models for the entire growing season. The field plots experiments were conducted in 2017 and 2018 and involved different nitrogen (N) rates, planting techniques and rice varieties. Three regression approaches, stepwise multiple linear regression (SMLR), random forest regression (RF) and linear mixed-effects (LME) modeling, were evaluated in estimating biomass with extensive TLS and biomass data collected at multiple phenological stages of rice growth across the entire season. The models were calibrated with the 2017 dataset and validated independently with the 2018 dataset.
The results demonstrated that growth stage in LME modeling was selected as the most significant random effect on rice growth among the three candidates, which were rice variety, growth stage and planting technique. The LME models grouped by growth stage exhibited higher validation accuracies for all biomass variables over the entire season to varying degrees than SMLR models and RF models. The most pronounced improvement with a LME model was obtained for panicle biomass, with an increase of 0.74 in R2 (LME: R2 = 0.90, SMLR: R2 = 0.16) and a decrease of 1.15 t/ha in RMSE (LME: RMSE =0.79 t/ha, SMLR: RMSE =2.94 t/ha). Compared to SMLR and RF, LME modeling yielded similar estimation accuracies of aboveground biomass for pre-heading stages, but significantly higher accuracies for post-heading stages (LME: R2 = 0.63, RMSE =2.27 t/ha; SMLR: R2 = 0.42, RMSE =2.42 t/ha; RF: R2 = 0.57, RMSE =2.80 t/ha). These findings implied that SMLR was only suitable for the estimation of biomass at pre-heading stages and LME modeling performed remarkably well across all growth stages, especially for post-heading. The results suggest coupling TLS with LME modeling is a promising approach to monitoring rice biomass at post-heading stages at high accuracy and to overcoming the saturation of canopy reflectance signals encountered in optical remote sensing. It also has great potential in the monitoring of other crops in cloud-cover conditions and the instantaneous prediction of grain yield any time before harvest.
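A random-intercept LME model grouped by growth stage, as used above, can be sketched with statsmodels. The single TLS predictor ("height"), the stage names, and all numeric values below are synthetic illustrations, not the study's variables or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: a TLS-derived canopy metric predicting biomass, with a
# stage-specific intercept shift (the random effect to be absorbed).
rng = np.random.default_rng(42)
stages = np.repeat(["tillering", "jointing", "heading", "filling"], 30)
height = rng.uniform(0.2, 1.2, size=120)
stage_offset = {"tillering": 0.5, "jointing": 2.0,
                "heading": 5.0, "filling": 8.0}
biomass = (6.0 * height
           + np.array([stage_offset[s] for s in stages])
           + rng.normal(0, 0.3, size=120))
df = pd.DataFrame({"biomass": biomass, "height": height, "stage": stages})

# Random-intercept model: fixed slope for height, random intercept per stage.
model = smf.mixedlm("biomass ~ height", df, groups=df["stage"])
result = model.fit()
```

Grouping by growth stage lets one fixed-effect structure serve the whole season while stage-to-stage level shifts are handled by the random intercepts, which is the mechanism behind the universal-model result above.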