
    Machine Learning Based High-Throughput Phenotyping Framework for Crop Yield Prediction Using Unmanned Aircraft Systems

    Estimating crop yield is essential to ensure agricultural stability, economic viability, and global food security. Provided with accurate crop yield estimates before harvest, farmers, breeders, and agricultural researchers can perform crop evaluation and genotype selection, and can maximize yield through timely intervention. Remote sensing is often used to provide information about important canopy state variables for crop yield estimation. Until recently, however, a critical bottleneck in such research was the lack of high-throughput sensing technologies for effective and rapid evaluation of expressed phenotypes under field conditions for holistic, data-driven decision making. Recent years have witnessed enormous growth in the application of unmanned aircraft systems (UAS) for precision agriculture. UAS have the potential to provide information on crops quantitatively and, above all, nondestructively. This dissertation develops a machine learning based high-throughput phenotyping framework for crop yield estimation from UAS data. Plant parameters such as canopy height (CH), canopy cover (CC), canopy volume (CV), the normalized difference vegetation index (NDVI), and the excessive greenness index (ExG) were extracted from fine-spatial-resolution UAS-based RGB and multispectral images collected weekly throughout the growing season.

    Initially, a comparative study was conducted between two management practices in cotton: conventional tillage (CT) and no-tillage (NT). This study was designed to test the reliability of the UAS-derived plant parameters, and the results revealed a significant difference in cotton growth under CT and NT. Unlike manual measurements, which rely on limited samples, UAS technology provided the capability to exploit the entire population, making UAS-derived data more robust and reliable. Additionally, an inter-comparison study was designed to compare CC derived from RGB and multispectral data over multiple flights during the growing season of the cotton crop. This study demonstrated that applying a morphological closing operation after thresholding significantly improved RGB-based CC modeling. A CC model that uses a multispectral sensor is considered more stable and accurate in the literature (Roth and Streit, 2018; Xu et al., 2019). In contrast, the RGB-based CC model is unstable and fails to identify canopy pixels when cotton leaves change color after canopy maturation. The proposed RGB-based CC model provides an affordable alternative to multispectral sensors, which are more sensitive and more expensive.

    After assessing the reliability of the UAS-derived canopy parameters, a novel machine learning framework was developed for cotton yield estimation using multi-temporal UAS data. The proposed model takes three types of crop features derived from UAS data to predict yield: multi-temporal canopy features, non-temporal features (cotton boll count, boll size, and boll volume), and irrigation status. The developed model achieved a high coefficient of determination (R² ≈ 0.9). Additionally, redundant features were removed using correlation analysis, and the relative significance of each input feature was determined using sensitivity analysis. Finally, an experiment was performed to investigate how early the model can accurately predict yield; even at 70 days after planting, the model predicted yield with reasonable accuracy (R² of 0.71 on the test set).
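
    As a concrete illustration of the feature-extraction step, the sketch below (not the dissertation's code) computes two of the indices named above, NDVI from multispectral bands and ExG from RGB bands, using Python/NumPy; the function names, the chromatic-coordinate variant of ExG, and the epsilon guard are assumptions.

        # Minimal sketch: two UAS-derived vegetation indices (assumed variants).
        import numpy as np

        def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
            """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
            nir = nir.astype(np.float64)
            red = red.astype(np.float64)
            return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

        def exg(r: np.ndarray, g: np.ndarray, b: np.ndarray) -> np.ndarray:
            """Excessive greenness index: 2g - r - b on chromatic coordinates."""
            r, g, b = (band.astype(np.float64) for band in (r, g, b))
            total = r + g + b + 1e-9
            return 2.0 * (g / total) - (r / total) - (b / total)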

    A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data

    This study presents a comparison of multispectral and RGB (red, green, and blue) sensor-based cotton canopy cover modelling using multi-temporal unmanned aircraft systems (UAS) imagery. Additionally, an RGB sensor-based canopy cover model is proposed that combines an RGB-based vegetation index with morphological closing. The field experiments were established in 2017 and 2018, with the whole study area divided into grids of approximately 1 × 1 m. Grid-wise percentage canopy cover was computed using both RGB and multispectral sensors over multiple flights during the growing season of the cotton crop. Initially, the normalized difference vegetation index (NDVI)-based canopy cover was estimated and used as the reference for comparison with RGB-based canopy cover estimates. To test the maximum achievable performance of RGB-based canopy cover estimation, a pixel-wise classification method was implemented. Four RGB-based canopy cover estimation methods were then implemented using RGB images: Canopeo, the excessive greenness index, the modified red green vegetation index, and the red green blue vegetation index. The performance of RGB-based canopy cover estimation was evaluated against the NDVI-based estimates. The multispectral sensor-based canopy cover model proved more stable and accurate, whereas the RGB-based model was very unstable and failed to identify canopy when cotton leaves changed color after canopy maturation. Applying a morphological closing operation after thresholding significantly improved the RGB-based canopy cover modelling. The red green blue vegetation index turned out to be the most efficient vegetation index for extracting canopy cover, with a very low average root mean square error (2.94% for the 2017 dataset and 2.82% for the 2018 dataset) relative to the multispectral sensor-based estimates. The proposed canopy cover model provides an affordable alternative to multispectral sensors, which are more sensitive and more expensive.
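
    The thresholding-plus-closing pipeline described above can be sketched in a few lines; this is an illustrative reimplementation, not the paper's code, and the threshold value and structuring-element size are assumptions.

        # Minimal sketch: grid-cell canopy cover from a vegetation index image.
        import numpy as np
        from scipy.ndimage import binary_closing

        def canopy_cover_percent(vi: np.ndarray, threshold: float = 0.0,
                                 close_size: int = 5) -> float:
            """Percent canopy pixels after thresholding and morphological closing."""
            mask = vi > threshold  # vegetation-index thresholding
            structure = np.ones((close_size, close_size), dtype=bool)
            mask = binary_closing(mask, structure=structure)  # fill small canopy gaps
            return 100.0 * mask.mean()

    Applied per 1 × 1 m grid cell, a routine like this yields the grid-wise percentage canopy cover compared across sensors in the study.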

    Plant Counting of Cotton from UAS Imagery Using Deep Learning-Based Object Detection Framework

    Assessing the plant population of cotton is important for making replanting decisions in low-plant-density areas, which are prone to yield penalties. Because measuring plant population in the field is labor intensive and subject to error, this study proposes a new image-based plant counting approach using unmanned aircraft systems (UAS; DJI Mavic 2 Pro, Shenzhen, China) data. Previously developed image-based techniques required a priori information about the geometry or statistical characteristics of plant canopy features, which limited their versatility under variable field conditions. In this regard, a deep learning-based plant counting algorithm was proposed to reduce the number of input variables and to remove the requirement for geometric or statistical information. The object detection model You Only Look Once version 3 (YOLOv3) and photogrammetry were utilized to separate, locate, and count cotton plants in the seedling stage. The proposed algorithm was tested on four UAS datasets containing variability in plant size, overall illumination, and background brightness. Root mean square error (RMSE) and R² values of the optimal plant count results ranged from 0.50 to 0.60 plants per linear meter of row (the number of plants within a 1 m distance along the planting row direction) and from 0.96 to 0.97, respectively. The object detection model trained across variable plant sizes, ground wetness, and lighting conditions generally resulted in lower detection error, unless an observable difference in the developmental stage of the cotton existed. The proposed plant counting algorithm performed well at densities of 0–14 plants per linear meter of row, where cotton plants are generally separable in the seedling stage. This study is expected to provide an automated methodology for in situ evaluation of plant emergence using UAS data.
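
    The detector itself is standard YOLOv3, so it is not reproduced here; the sketch below illustrates only the counting step, binning georeferenced detection centers into 1 m segments along the row to obtain plants per linear meter of row. The coordinate convention and function name are assumptions.

        # Minimal sketch: plants per linear meter of row from detected plant centers.
        import numpy as np

        def plants_per_linear_meter(row_positions_m: np.ndarray,
                                    row_length_m: float) -> np.ndarray:
            """Count detections within each 1 m segment along the planting row."""
            bins = np.arange(0.0, row_length_m + 1.0, 1.0)  # 1 m bin edges
            counts, _ = np.histogram(row_positions_m, bins=bins)
            return counts

        # Example: 7 detections along a 5 m row -> [2 2 1 1 1]
        positions = np.array([0.2, 0.5, 1.1, 1.4, 2.8, 3.3, 4.9])
        print(plants_per_linear_meter(positions, 5.0))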

    Tar Spot Disease Quantification Using Unmanned Aircraft Systems (UAS) Data

    Tar spot is a foliar disease of corn characterized by fungal fruiting bodies that resemble tar spots. The disease emerged in the U.S. in 2015, and severe outbreaks in 2018 caused an economic impact on corn yields throughout the Midwest. Adequate epidemiological surveillance and disease quantification are necessary to develop immediate and long-term management strategies. This study presents a measurement framework that evaluates the severity of tar spot using unmanned aircraft systems (UAS)-based plant phenotyping and regression techniques. UAS-based plant phenotypic information, such as canopy cover, canopy volume, and vegetation indices, was used as explanatory variables. Visual estimates of disease severity, made by expert plant pathologists on a per-plot basis, were used as response variables. Three regression methods, namely ordinary least squares (OLS), support vector regression (SVR), and multilayer perceptron (MLP), were compared to determine the optimal regression method for UAS-based tar spot measurement. The cross-validation results showed that the MLP-based regression model provides the highest measurement accuracy. Trained and tested on spatially separated datasets, the proposed regression model achieved a Lin's concordance correlation coefficient (ρc) of 0.82 and a root mean square error (RMSE) of 6.42. This study demonstrated that the proposed UAS-based method can be used to quantify tar spot, which shows a gradual spectral response as the disease develops.
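
    Lin's concordance correlation coefficient used to evaluate the model measures both correlation and agreement with the 1:1 line; below is a minimal sketch of the metric, assuming NumPy arrays of visual severity ratings and model predictions (the function names are assumptions, not the paper's code).

        # Minimal sketch: Lin's concordance correlation coefficient and RMSE.
        import numpy as np

        def lins_ccc(y_true: np.ndarray, y_pred: np.ndarray) -> float:
            """rho_c = 2*cov / (var_t + var_p + (mu_t - mu_p)^2), population moments."""
            mu_t, mu_p = y_true.mean(), y_pred.mean()
            cov = np.mean((y_true - mu_t) * (y_pred - mu_p))
            return 2.0 * cov / (y_true.var() + y_pred.var() + (mu_t - mu_p) ** 2)

        def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
            return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))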