Traditional plant breeding evaluation methods are time-consuming, labor-intensive, and costly. Accurate and rapid phenotypic trait data acquisition and analysis can improve genomic selection and accelerate cultivar development. In this work, a technique for data acquisition and image processing was developed utilizing small unmanned aerial vehicles (UAVs), multispectral imaging, and deep learning convolutional neural networks to evaluate phenotypic characteristics of citrus crops. This low-cost and automated high-throughput phenotyping technique utilizes artificial intelligence (AI) and machine learning (ML) to: (i) detect, count, and geolocate trees and tree gaps; (ii) categorize trees based on their canopy size; (iii) develop individual tree health indices; and (iv) evaluate citrus varieties and rootstocks. The proposed remote sensing technique was able to detect and count citrus trees in a grove of 4,931 trees with a precision and recall of 99.9% and 99.7%, respectively, estimate their canopy size with an overall accuracy of 85.5%, and detect, count, and geolocate tree gaps with a precision and recall of 100% and 94.6%, respectively. This UAV-based technique provides a consistent, more direct, cost-effective, and rapid method to evaluate phenotypic characteristics of citrus varieties and rootstocks.
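The reported detection figures follow the standard precision/recall definitions over per-tree detections. As a minimal sketch, the counts below are hypothetical, chosen only to be consistent with the 4,931-tree grove and the reported rates:

```python
def precision_recall(tp, fp, fn):
    """Detection precision and recall from true-positive,
    false-positive, and false-negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical counts: 4,916 trees found, 5 spurious detections,
# 15 missed trees (so tp + fn equals the 4,931 trees in the grove).
p, r = precision_recall(tp=4916, fp=5, fn=15)
print(f"precision={p:.3f}, recall={r:.3f}")  # → precision=0.999, recall=0.997
```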
• A smart sprayer for precision pest management was developed.
• This low-cost technology can distinguish target weeds from non-target objects.
• It sprays only on a selected target (weed).
• Neural networks were utilized for target detection and classification.
• It can significantly reduce the quantity of agrochemicals applied.
Most conventional sprayers apply agrochemicals uniformly, despite the fact that the distribution of weeds is typically patchy, resulting in waste of valuable compounds, increased costs, risk of crop damage, pest resistance to chemicals, environmental pollution, and contamination of products. To reduce these negative impacts, a smart sprayer was designed and developed utilizing machine vision and artificial intelligence to distinguish target weeds from non-target objects (e.g., vegetable crops) and precisely spray on the desired target/location. Two experimental scenarios were designed to simulate a vegetable field and to evaluate the smart sprayer's performance. The first scenario contained artificial weeds (targets) and artificial plants (non-targets). The second, and more challenging, scenario contained real plants: portulaca weeds as targets, and sedge weeds and pepper plants as non-targets. Two embedded graphics processing units (GPUs) were evaluated as the smart sprayer's processing unit (for image processing and target detection). The more powerful GPU (NVIDIA GTX 1070 Ti) achieved an overall precision of 71% and recall of 78% (for plant detection and target spraying accuracy) on the more challenging scenario with real plants, and 91% for both metrics on the first scenario with artificial plants. The less powerful GPU (NVIDIA Jetson TX2) achieved an overall precision and recall of 90% and 89%, respectively, on the first scenario with artificial plants, and 59% and 44%, respectively, on the second scenario with real plants. Finally, an RTK GPS was connected to the smart sprayer, and an algorithm was developed to automatically generate weed maps and visualize the collected data (after every application). This smart technology integrates a state-of-the-art (AI-based) weed detection system, a novel fast and precise spraying system, and a weed mapping system.
It can significantly reduce the quantity of agrochemicals required, especially compared with traditional broadcast sprayers that usually treat the entire field, resulting in unnecessary application to areas that do not require treatment. It could also reduce costs, risk of crop damage and excess herbicide residue, as well as potentially reduce environmental impact.
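The spray-only-on-target behavior described above amounts to a per-frame decision step: keep only confident detections of target classes and map them to the nozzles covering those image regions. The class names, confidence threshold, and nozzle-zone mapping below are illustrative assumptions, not the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # detector output, e.g. "portulaca", "sedge", "pepper"
    confidence: float
    nozzle_zone: int  # nozzle covering the detection's image region

def spray_decisions(detections, target_labels=("portulaca",), threshold=0.5):
    """Return the nozzle zones to activate: spray only confident
    detections of target weeds, and never on non-target classes."""
    return {d.nozzle_zone for d in detections
            if d.label in target_labels and d.confidence >= threshold}

frame = [Detection("portulaca", 0.91, 0),
         Detection("pepper", 0.88, 1),      # crop: never sprayed
         Detection("portulaca", 0.42, 2)]   # below threshold: skipped
print(spray_decisions(frame))  # → {0}
```

The threshold trades precision against recall: raising it reduces crop-damage risk from false positives at the cost of missing some weeds.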
A remote sensing technique was developed to detect citrus canker in laboratory conditions and was verified in the grove by utilizing an unmanned aerial vehicle (UAV). In the laboratory, a hyperspectral (400–1000 nm) imaging system was utilized for the detection of citrus canker at several disease development stages (i.e., asymptomatic, early, and late symptoms) on Sugar Belle leaves and immature (green) fruit, using two classification methods: (i) radial basis function (RBF) and (ii) K-nearest neighbor (KNN). The same imaging system, mounted on a UAV, was used to detect citrus canker on tree canopies in the orchard. The overall classification accuracy of the RBF method was higher (94%, 96%, and 100%) than that of the KNN method (94%, 95%, and 96%) for detecting canker in leaves. Among the 31 studied vegetation indices, the water index (WI) and the modified chlorophyll absorption in reflectance indices (ARI and TCARI 1) most accurately detected canker in laboratory and in orchard conditions, respectively. Immature fruit was not a reliable tissue for early detection of canker. However, the proposed technique successfully distinguished late-stage canker-infected fruit with 92% classification accuracy. The UAV-based technique achieved 100% classification accuracy for identifying healthy and canker-infected trees.
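The KNN step above can be written in a few lines: classify a spectrum by majority vote among its nearest training spectra. The three-band toy spectra below are fabricated for illustration and are not data from the study:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify one spectrum x by majority vote among its k nearest
    training spectra (Euclidean distance over the reflectance bands)."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy three-band reflectance spectra, fabricated for this example
X = np.array([[0.10, 0.40, 0.60], [0.12, 0.38, 0.58],   # healthy-like
              [0.30, 0.20, 0.30], [0.28, 0.22, 0.32]])  # canker-like
y = np.array(["healthy", "healthy", "canker", "canker"])
print(knn_predict(X, y, np.array([0.11, 0.39, 0.59])))  # → healthy
```

In practice each spectrum would have hundreds of bands (400–1000 nm), but the distance computation is unchanged.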
Early and accurate diagnosis is a critical first step in mitigating losses caused by plant diseases. An incorrect diagnosis can lead to improper management decisions, such as selection of the wrong chemical application, potentially further reducing crop health and yield. In tomato, initial disease symptoms may be similar even when caused by different pathogens; for example, early lesions of target spot (TS), caused by the fungus Corynespora cassiicola, resemble those of bacterial spot (BS), caused by Xanthomonas perforans. In this study, hyperspectral imaging (380–1020 nm) was utilized in laboratory and field (collected by an unmanned aerial vehicle; UAV) settings to detect both diseases. Tomato leaves were classified into four categories: healthy, asymptomatic, and early and late disease development stages. Thirty-five spectral vegetation indices (VIs) were calculated to select an optimum set of indices for disease detection and identification. Two classification methods were utilized: (i) multilayer perceptron neural network (MLP), and (ii) stepwise discriminant analysis (STDA). The best wavebands for detection were selected in the blue (408–420 nm), red (630–650 nm), and red-edge (730–750 nm) regions. The most significant VIs that could distinguish between healthy and diseased leaves were the photochemical reflectance index (PRI) for both diseases, the normalized difference vegetation index (NDVI850) for BS in all stages, and the triangular vegetation index (TVI), NDVI850, and chlorophyll index green (Chl green) for the TS asymptomatic, early, and late disease stages, respectively. The MLP classification method achieved 99% accuracy for both BS and TS under field (UAV-based) and laboratory conditions.
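The key indices named above reduce to simple band ratios. A sketch using the standard definitions (PRI's 531/570 nm bands and NDVI850's 850 nm NIR band follow common usage; the red band center is an assumption here):

```python
def pri(r531, r570):
    """Photochemical reflectance index: (R531 - R570) / (R531 + R570)."""
    return (r531 - r570) / (r531 + r570)

def ndvi(r_nir, r_red):
    """Normalized difference vegetation index; NDVI850 takes the
    850 nm band as NIR."""
    return (r_nir - r_red) / (r_nir + r_red)

# A dense, healthy canopy reflects strongly in NIR and weakly in red:
print(round(ndvi(0.45, 0.05), 2))  # → 0.8
```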
Tomato crops are susceptible to multiple diseases, several of which may be present during the same season. Therefore, rapid disease identification could enhance crop management and consequently increase yield. In this study, nondestructive methods were developed to detect diseases that affect tomato crops, such as bacterial spot (BS), target spot (TS), and tomato yellow leaf curl (TYLC), for two varieties of tomato (susceptible and tolerant to TYLC only) by using hyperspectral sensing in two conditions: a) laboratory (benchtop scanning), and b) field, using an unmanned aerial vehicle (UAV-based). The stepwise discriminant analysis (STDA) and radial basis function methods were applied to classify infected plants and distinguish them from noninfected or healthy (H) plants. Multiple vegetation indices (VIs) and the M-statistic method were utilized to distinguish and classify the diseased plants. In general, the classification results between healthy and diseased plants were highly accurate for all diseases; for instance, when comparing H vs. BS, TS, and TYLC at the asymptomatic stage in laboratory conditions, the classification rates were 94%, 95%, and 100%, respectively. Similarly, at the symptomatic stage, the classification rates between healthy and infected plants were 98% for BS, and 99–100% for TS and TYLC. The classification results in field conditions also showed high values: 98%, 96%, and 100% for BS, TS, and TYLC, respectively. The VIs that best identified these diseases were the renormalized difference vegetation index (RDVI) and the modified triangular vegetation index 1 (MTVI 1), in both laboratory and field. The results were promising and suggest the possibility of identifying these diseases using remote sensing.
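The two indices singled out above have closed forms, shown here as commonly defined in the vegetation-index literature; treat the band centers as assumptions if the study used slightly different ones. The reflectance values in the demo are fabricated:

```python
import math

def rdvi(r800, r670):
    """Renormalized difference vegetation index:
    (R800 - R670) / sqrt(R800 + R670)."""
    return (r800 - r670) / math.sqrt(r800 + r670)

def mtvi1(r800, r670, r550):
    """Modified triangular vegetation index 1:
    1.2 * (1.2*(R800 - R550) - 2.5*(R670 - R550))."""
    return 1.2 * (1.2 * (r800 - r550) - 2.5 * (r670 - r550))

# Illustrative reflectances for a healthy leaf (fabricated values)
print(f"RDVI  = {rdvi(0.50, 0.05):.3f}")   # → RDVI  = 0.607
print(f"MTVI1 = {mtvi1(0.50, 0.05, 0.10):.3f}")  # → MTVI1 = 0.726
```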
We have developed a vision-based program, named X-FIDO (Xylella Fastidiosa Detector for Olea europaea L.), to detect symptoms of Olive Quick Decline Syndrome (OQDS) on leaves of Olea europaea L. infected by Xylella fastidiosa. Previous work predicted disease from leaf images with deep learning but required a vast amount of data, which was obtained via crowdsourcing such as the PlantVillage project. This approach has limited applicability when samples need to be tested with traditional methods (i.e., PCR) to avoid incorrect training input, or for quarantine pests whose manipulation is restricted. In this paper, we demonstrate that transfer learning can be leveraged when it is not possible to collect thousands of new leaf images. Transfer learning is the re-application of an already trained deep learner to a new problem. We present a novel algorithm for fusing data at different levels of abstraction to improve performance of the system. The algorithm discovers low-level features from raw data to automatically detect veins and colors that lead to symptomatic leaves. The experiment included images of 100 healthy leaves, 99 X. fastidiosa-positive leaves, and 100 X. fastidiosa-negative leaves with symptoms related to other stress factors (i.e., abiotic factors such as water stress, or other diseases). The program detects OQDS with a true positive rate of 98.60 ± 1.47% in testing, showing great potential of image analysis for this disease. Results were obtained with a convolutional neural network trained with the stochastic gradient descent method, over ten trials with a 75/25 split of training and testing data. This work shows potential for massive screening of plants with reduced diagnosis time and cost.
Nutrient assessment of plants, a key aspect of agricultural crop management and varietal development programs, is traditionally time-demanding and labor-intensive. This study proposes a novel methodology to determine leaf nutrient concentrations of citrus trees by using unmanned aerial vehicle (UAV) multispectral imagery and artificial intelligence (AI). The study was conducted in four citrus field trials located in Highlands County and Polk County, Florida, USA. In each location, trials contained either ‘Hamlin’ or ‘Valencia’ sweet orange scions grafted on more than 30 different rootstocks. Leaves were collected and analyzed in the laboratory to determine macro- and micronutrient concentrations using traditional chemical methods. Spectral data from tree canopies were obtained in five bands (red, green, blue, red edge, and near-infrared wavelengths) using a UAV equipped with a multispectral camera. The estimation model was developed using a gradient boosting regression tree and evaluated using several metrics, including mean absolute percentage error (MAPE), root mean square error, the MAPE-coefficient of variance (CV) ratio, and difference plots. This novel model determined macronutrients (nitrogen, phosphorus, potassium, magnesium, calcium, and sulfur) with high precision (less than 9% and 17% average error for the ‘Hamlin’ and ‘Valencia’ trials, respectively) and micronutrients with moderate precision (less than 16% and 30% average error for the ‘Hamlin’ and ‘Valencia’ trials, respectively). Overall, this UAV- and AI-based methodology efficiently determined nutrient concentrations and generated nutrient maps in commercial citrus orchards, and could be applied to other crop species.
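The MAPE criterion used to score the model is straightforward to compute; the leaf-nitrogen numbers below are hypothetical, not values from the trials:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error (in %), the headline metric used
    to report the nutrient-estimation model's average error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Hypothetical leaf-nitrogen concentrations (% dry weight): lab vs model
lab   = [2.8, 2.5, 3.1, 2.9]
model = [2.7, 2.6, 3.0, 3.1]
print(f"MAPE = {mape(lab, model):.1f}%")  # → MAPE = 4.4%
```

Because MAPE is scale-free, it lets macro- and micronutrients (with very different concentration ranges) be compared with one number.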
Remote sensing and machine learning (ML) could assist growers, stakeholders, and plant pathologists in determining plant diseases resulting from viral, bacterial, and fungal infections. Spectral vegetation indices (VIs) have been shown to be helpful for the indirect detection of plant diseases. The purpose of this study was to utilize ML models and identify VIs for the detection of downy mildew (DM) disease in watermelon at several disease severity (DS) stages, including low, medium (levels 1 and 2), high, and very high. Hyperspectral images of leaves were collected in the laboratory by a benchtop system (380–1,000 nm) and in the field by a UAV-based imaging system (380–1,000 nm). Two classification methods, multilayer perceptron (MLP) and decision tree (DT), were implemented to distinguish between healthy and DM-affected plants. The best classification rates were recorded by the MLP method; however, only 62.3% accuracy was observed at low disease severity. The classification accuracy increased as the disease severity increased (e.g., 86–90% for the laboratory analysis and 69–91% for the field analysis). The best wavelengths to differentiate between the DS stages were selected at 531 nm and in the 700–900 nm band. The most significant VIs for DS detection were the chlorophyll green index (Cl green), photochemical reflectance index (PRI), and normalized phaeophytinization index (NPQI) for the laboratory analysis, and the ratio analysis of reflectance spectra for chlorophylls a, b, and c (RARSa, RARSb, and RARSc) and the Cl green for the field analysis. Spectral VIs and ML could enhance disease detection and monitoring for precision agriculture applications.
Technological advances in computer vision, mechatronics, artificial intelligence, and machine learning have enabled the development and implementation of remote sensing technologies for plant/weed/pest/disease identification and management. They provide a unique opportunity for developing intelligent agricultural systems for precision applications. Herein, artificial intelligence (AI) and machine learning (ML) concepts are described, and several examples are presented to demonstrate the application of AI in agriculture.
Available on EDIS at: https://edis.ifas.ufl.edu/ae529