    Early-Season Stand Count Determination in Corn via Integration of Imagery from Unmanned Aerial Systems (UAS) and Supervised Learning Techniques

    Corn (Zea mays L.) is one of the crops most sensitive to planting pattern and early-season uniformity. The most common method to determine the number of plants is visual inspection on the ground, but this field activity is time-consuming, labor-intensive, and biased, and may lead to less profitable decisions by farmers. The objective of this study was to develop a reliable, timely, and unbiased method for counting corn plants based on ultra-high-resolution imagery acquired from unmanned aerial systems (UAS), allowing fields to be scouted automatically under real field conditions. A ground sampling distance of 2.4 mm was targeted to extract information at the plant level. First, an excess greenness (ExG) index was used to separate green pixels from the background; then rows and inter-row contours were identified and extracted. A scalable training procedure was implemented using geometric descriptors as inputs to the classifier. Second, a decision tree was implemented and tested using two training modes at each site to expose the workflow to different ground conditions at the time of the aerial data acquisition. Differences in performance were due to the training modes and the spatial resolutions at the two sites. For the object classification task, an overall accuracy of 0.96, based on the proportion of correctly classified corn and non-corn objects, was obtained for local (per-site) classification, and an accuracy of 0.93 was obtained for the combined training modes. For successful model implementation, plants should have two to three leaves when images are collected (avoiding overlap between plants). The best workflow performance was reached at 2.4 mm resolution, corresponding to a 10 m flight altitude; higher altitudes were gradually penalized. This coincided with the larger number of detected green objects in the images and the effectiveness of geometry as a descriptor for corn plant detection. Sociedad Argentina de Informática e Investigación Operativa
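
    Purely as an illustration of the masking step described above (a minimal sketch, not the study's implementation; the file name uas_plot.png, the Otsu threshold, and the particular descriptors listed are assumptions), the ExG computation and per-object geometric descriptors could look roughly like this:

```python
# Sketch only: ExG vegetation masking and geometric descriptors for candidate
# green objects, assuming an RGB image loaded as a float array in [0, 1].
from skimage import io, measure, filters

def exg_index(rgb):
    """Excess greenness ExG = 2g - r - b on normalized chromatic coordinates."""
    total = rgb.sum(axis=2) + 1e-9              # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2.0 * g - r - b

rgb = io.imread("uas_plot.png")[..., :3] / 255.0   # hypothetical file name
exg = exg_index(rgb)
mask = exg > filters.threshold_otsu(exg)           # green pixels vs. background

# Geometric descriptors (area, eccentricity, solidity) per green object,
# which could feed a decision-tree classifier as described in the study.
labels = measure.label(mask)
features = [(obj.area, obj.eccentricity, obj.solidity)
            for obj in measure.regionprops(labels)]
```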

    Maize and sorghum plant detection at early growth stages using proximity laser and time-of-flight sensors

    Maize and sorghum are important cereal crops worldwide. To increase maize grain yield, two approaches are used: exploring hybrid maize in plant breeding and improving the crop management system. Plant population is a parameter for calculating the germination rate, which is an important phenotypic trait of seeds. An automated way to obtain the plant population at early growth stages can help breeders save measuring time in the field and increase the efficiency of their breeding programs. Similar to what has been taking place in production agriculture, plant scientists and plant breeders have been adopting precision technologies in their research programs; analyzing plant performance plot by plot, and even plant by plant, is becoming the norm and is vitally important for plant phenomics research and the seed industry. Accurate plant location information is needed for determining plant distribution and generating plant stand maps. Two automated plant population detection and location estimation systems using different sensors were developed in this research. A 2D machine vision technique was applied to develop a real-time automatic plant population estimation and plant stand map generation system for maize and sorghum at early growth stages. Laser sensors were chosen because they are not affected by outdoor lighting conditions. Plant detection algorithms were developed based on the unique plant stem structure. Since maize and sorghum look similar at early growth stages, the system was tested on both plants under greenhouse conditions. Detection rates of over 93.1% and 83.0% were achieved for maize and sorghum plants, respectively, from the V2 to V6 growth stages. The mean absolute error and root-mean-square error of plant location were 3.1 cm and 3.2 cm for maize and 2.8 cm and 2.9 cm for grain sorghum plants, respectively. Apart from the laser sensors, a 3D time-of-flight camera-based automatic system was also developed for maize and sorghum plant detection at early growth stages. The images were captured using a Swift camera from a side view of the crop row, without any shade, during the daytime in a greenhouse. A series of image processing algorithms, including point cloud filtering, plant candidate extraction, invalid plant removal, and plant registration, was developed for this system. Compared with manual measurements, the average true positive detection rate for maize plants was 89% with a standard deviation of 0.06; for grain sorghum plants, the average true positive detection rate was 85% with a standard deviation of 0.08.
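
    As a hedged sketch of this style of stem-based detection (not the thesis code; the height band, DBSCAN clustering, and synthetic points below are assumptions standing in for the actual filtering and candidate-extraction algorithms), plant candidates along a row could be located roughly like this:

```python
# Sketch only: keep point-cloud points in a stem-height band, then cluster
# them in the horizontal plane; each cluster is a plant candidate.
import numpy as np
from sklearn.cluster import DBSCAN

def plant_candidates(points, z_min=0.02, z_max=0.30, eps=0.04, min_points=20):
    """points: (N, 3) array of x (along row), y (depth), z (height) in meters."""
    band = points[(points[:, 2] > z_min) & (points[:, 2] < z_max)]
    if len(band) == 0:
        return np.empty(0), band
    labels = DBSCAN(eps=eps, min_samples=min_points).fit_predict(band[:, :2])
    centers = np.array([band[labels == k, 0].mean()       # x-location per cluster
                        for k in set(labels) if k != -1])  # -1 = noise
    return np.sort(centers), band

# Example with synthetic data: two "stems" 0.2 m apart along the row.
rng = np.random.default_rng(0)
def stem(x0):
    return np.c_[rng.normal(x0, 0.01, 200),     # x scatter around the stem
                 rng.normal(0.50, 0.01, 200),   # y (distance from sensor)
                 rng.uniform(0.00, 0.25, 200)]  # z (height above ground)

centers, _ = plant_candidates(np.vstack([stem(0.0), stem(0.2)]))
print(centers)   # approximately [0.0, 0.2]
```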

    Quantifying corn emergence using UAV imagery and machine learning

    Corn (Zea mays L.) is one of the most important crops in the United States for animal feed, ethanol production, and human consumption. To maximize final corn yield, one of the critical factors is improving the uniformity of corn emergence both temporally (emergence date) and spatially (plant spacing). Conventionally, emergence uniformity is assessed through visual observation by farmers at selected small plots taken to represent the whole field, but this is limited by the time and labor required. With the advance of unmanned aerial vehicle (UAV)-based imaging technology and image processing techniques powered by machine learning (ML) and deep learning (DL), a more automatic, non-subjective, precise, and accurate field-scale assessment of emergence uniformity becomes possible. Previous studies have demonstrated successful assessment of crop emergence uniformity using UAV imagery, specifically in fields with a simple soil background. No research, however, has investigated the feasibility of UAV imagery for corn emergence assessment in fields under conservation agriculture, which are covered with cover crops or residue to improve soil health and sustainability. The overall goal of this research was to develop a fast and accurate method for assessing corn emergence using UAV imagery and ML/DL techniques; this information is essential for early- and in-season decision making in corn production as well as for agronomy research. The research comprised three main studies: Study 1, quantifying corn emergence date using UAV imagery and an ML model; Study 2, estimating corn stand count in different cropping systems (CS) using UAV images and DL; and Study 3, estimating and mapping corn emergence under different planting depths. Two case studies extended Study 3 to field-scale applications by relating emergence uniformity derived from the developed method to planting-depth treatments and by estimating final yield. For all studies, the primary imagery data were collected using a consumer-grade UAV equipped with a red-green-blue (RGB) camera at a flight height of approximately 10 m above ground level. The imagery had a ground sampling distance (GSD) of 0.55-3.00 mm pixel-1, sufficient to detect small seedlings. In addition, a UAV multispectral camera was used in the case studies to capture corn plants at early growth stages (V4, V6, and V7) and extract plant reflectance (vegetation indices, VIs) as indicators of plant growth variation. Random forest (RF) ML models were used to classify corn emergence date, expressed as days after emergence (DAE) at the time of assessment, and to estimate yield. The DL models U-Net and ResNet18 were used, respectively, to segment corn seedlings from UAV images and to estimate emergence parameters, including plant density, average DAE (DAEmean), and plant spacing standard deviation (PSstd). Results from Study 1 indicated that individual corn plant quantification using UAV imagery and an RF ML model achieved moderate classification accuracies of 0.20-0.49, which increased to 0.55-0.88 when DAE classification was expanded to a 3-day window. In Study 2, the precision of image segmentation by the U-Net model was ≥ 0.81 for all CS, resulting in high accuracies in estimating plant density (R2 ≥ 0.92; RMSE ≤ 0.48 plants m-1). The ResNet18 model in Study 3 was able to estimate emergence parameters with high accuracies (0.97, 0.95, and 0.73 for plant density, DAEmean, and PSstd, respectively). In the case studies, crop emergence maps and field evaluations indicated the expected trend of decreasing plant density and DAEmean with increasing planting depth, and the opposite trend for PSstd; however, mixed trends were found for emergence parameters among planting depths at different replications and across the N-S direction of the fields. For yield estimation, emergence data alone did not show any relation with final yield (R2 = 0.01, RMSE = 720 kg ha-1), and the combination of VIs from all growth stages was only able to estimate yield with an R2 of 0.34 and an RMSE of 560 kg ha-1. In summary, this research demonstrated the success of UAV imagery and ML/DL techniques in assessing and mapping corn emergence in fields practicing all or some components of conservation agriculture. The findings provide insights for future agronomic and breeding studies by enabling field-scale evaluation of crop emergence as affected by treatments and management and by relating emergence assessment to final yield. These emergence evaluations may also be useful to commercial companies seeking to relate new precision-planting technologies to crop performance. For commercial crop production, more comprehensive emergence maps (in terms of temporal and spatial uniformity) will help with better replanting or early-management decisions. Further enhancement of the methods, such as validation studies in additional locations and years and the development of interactive frameworks, will establish a more automatic, robust, precise, accurate, and 'ready-to-use' approach for estimating and mapping crop emergence uniformity.
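
    The 3-day-window accuracies reported above can be read as counting a prediction correct when it falls within ±1 day of the true DAE. A minimal sketch of that metric (illustrative only; the example arrays are made up):

```python
# Sketch only: exact-day vs. within-window classification accuracy for
# days-after-emergence (DAE) labels. The arrays below are hypothetical.
import numpy as np

def window_accuracy(y_true, y_pred, window=1):
    """Fraction of predictions within +/- `window` days of the true DAE."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(np.abs(y_true - y_pred) <= window))

y_true = np.array([3, 4, 5, 6, 7, 8])
y_pred = np.array([3, 5, 5, 8, 7, 7])
print(window_accuracy(y_true, y_pred, window=0))  # exact-day accuracy
print(window_accuracy(y_true, y_pred, window=1))  # 3-day (±1 day) window accuracy
```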

    Early corn stand count of different cropping systems using UAV-imagery and deep learning

    Optimum plant stand density and uniformity are vital to maximizing corn (Zea mays L.) yield potential. Assessment of stand density can occur shortly after seedlings begin to emerge, allowing for timely replant decisions. Conventional methods for evaluating an early plant stand rely on manual measurement and visual observation, which are time-consuming, subjective because of the small sampling areas used, and unable to capture field-scale spatial variability. This study aimed to evaluate the feasibility of an unmanned aerial vehicle (UAV)-based imaging system for estimating early corn stand count in three cropping systems (CS) with different tillage and crop rotation practices. A UAV equipped with an on-board RGB camera was used to collect imagery of corn seedlings (~14 days after planting) in each CS: minimum-till corn-soybean rotation (MTCS), no-till corn-soybean rotation (NTCS), and no-till corn-corn rotation with a cover crop (NTCC). An image processing workflow based on a deep learning (DL) model, U-Net, was developed for plant segmentation and stand count estimation. Results showed that the DL model performed best in segmenting seedlings in MTCS, followed by NTCS and NTCC. Similarly, accuracy for stand count estimation was highest in MTCS (R2 = 0.95), followed by NTCS (0.94) and NTCC (0.92). Differences among CS were related to the amount and distribution of soil surface residue cover, with increasing residue generally reducing the performance of the proposed method in stand count estimation. Thus, estimating early corn stand count from UAV imagery and DL modeling is feasible, but its performance is influenced by soil and crop management practices.
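
    As an illustrative sketch of the counting step that follows segmentation (not the authors' pipeline; the minimum-area filter and toy mask are assumptions), a stand count can be derived from a binary U-Net-style mask by labeling connected components:

```python
# Sketch only: estimate stand count from a binary segmentation mask
# (e.g., a thresholded U-Net output) by labeling connected components
# and discarding very small objects that are likely noise or residue.
import numpy as np
from skimage import measure

def stand_count(mask, min_area_px=50):
    labels = measure.label(mask.astype(bool))
    return sum(1 for obj in measure.regionprops(labels) if obj.area >= min_area_px)

# Toy mask with two "seedlings" and one small noise blob.
mask = np.zeros((100, 100), dtype=np.uint8)
mask[10:25, 10:25] = 1      # seedling 1
mask[60:80, 60:75] = 1      # seedling 2
mask[90:92, 5:7] = 1        # noise (filtered out by min_area_px)
print(stand_count(mask))    # -> 2
```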

    Estimating corn emergence date using UAV-based imagery

    Assessing corn (Zea mays L.) emergence uniformity soon after planting is important both for its relation to grain production and for making replanting decisions. Unmanned aerial vehicle (UAV) imagery has been used for determining corn densities at vegetative growth stage 2 (V2) and later, but not as a tool for detecting emergence date. The objective of this study was to estimate days after corn emergence (DAE) using UAV imagery. A field experiment was designed with four planting depths to obtain a range of corn emergence dates. UAV imagery was collected during the first, second, and third weeks after emergence. The acquisition height of approximately 5 m above ground level resulted in a ground sampling distance of 1.5 mm pixel-1. Seedling size and shape features derived from the UAV imagery were used for DAE classification based on a Random Forest machine learning model. Results showed that image features were distinguishable for different DAE (single day) within the first week after initial corn emergence, with a moderate overall classification accuracy of 0.49; for the second week and beyond, however, the overall classification accuracy diminished (0.20 to 0.35). When estimating DAE within a three-day window (± 1 DAE), overall 3-day classification accuracies ranged from 0.54 to 0.88. Diameter, area, and major axis length/area were important image features for predicting corn DAE. The findings demonstrate that UAV imagery can detect newly emerged corn plants and estimate their emergence date to assist in establishing emergence uniformity. Additional studies are needed to fine-tune image collection procedures and image feature identification in order to improve accuracy.
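
    A generic sketch of this kind of Random Forest classification (synthetic data stand in for the UAV-derived seedling measurements; this is not the study's model beyond the feature names mentioned above):

```python
# Sketch only: Random Forest classification of days after emergence (DAE)
# from seedling geometry features (area, diameter, major-axis length),
# using synthetic data in place of the UAV-derived measurements.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 300
dae = rng.integers(1, 8, size=n)                        # true DAE labels (1-7)
area = np.clip(20 * dae + rng.normal(0, 8, n), 1, None) # seedlings grow with DAE
diameter = 2 * np.sqrt(area / np.pi)
major_axis = 1.5 * diameter + rng.normal(0, 1, n)
X = np.c_[area, diameter, major_axis]

X_tr, X_te, y_tr, y_te = train_test_split(X, dae, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("exact-day accuracy:", np.mean(pred == y_te))
print("within ±1 day accuracy:", np.mean(np.abs(pred - y_te) <= 1))
```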

    Utilizing unmanned aircraft system (UAS) technology to collect early stand counts and to assess early plant vigor for use in early-season stress tolerance characterization of hybrid corn products

    Early-season stress tolerance characterization of hybrid corn products relies heavily on early stand count and early vigor data from field trials in order to properly characterize products and to accurately assign stress emergence scores. Current manual collection of these data is labor-intensive, time-consuming, prone to human error, and, in the case of vigor scoring, subjective. Unmanned aircraft systems (UAS) may provide a more accurate, rapid, objective, and efficient method for collecting stand count and vigor data, resulting in higher quality products and overall cost savings. The purpose of this study was to determine whether UAS could be used for stand count and vigor data collection for the early-season stress tolerance characterization of hybrid corn products. The early-season stress tolerance characterization field trial was flown on 12 different dates during the spring of 2017, representing plant growth stages from VE to V5. Stand count and plot cover values were calculated from the UAS-obtained images for the 12 flight dates using a 2017 and an updated 2018 software algorithm. The best time to collect UAS stand count data was at the V2 plant growth stage, before leaf overlap occurred. A UAS-derived plot cover normalization method was also developed for assigning plot vigor scores, allowing for more objective, reproducible, and unbiased assessments of plot vigor.
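
    The abstract does not specify how the plot cover normalization works, so the following is only a hypothetical sketch: per-plot canopy cover computed from a vegetation mask and scaled against the trial mean to yield a relative vigor score.

```python
# Hypothetical sketch only: the normalization rule below is a stand-in,
# not the study's method. Plot masks are 0/1 vegetation masks per plot.
import numpy as np

def plot_cover(mask):
    """Fraction of plot pixels classified as vegetation (mask is 0/1)."""
    return float(np.mean(mask))

def relative_vigor(covers):
    """Scale plot covers by the trial mean so that 1.0 = average vigor."""
    covers = np.asarray(covers, dtype=float)
    return covers / covers.mean()

# Toy example: three plots with increasing canopy cover.
rng = np.random.default_rng(2)
plot_masks = [(rng.random((50, 200)) < p).astype(np.uint8) for p in (0.05, 0.12, 0.20)]
covers = [plot_cover(m) for m in plot_masks]
print(relative_vigor(covers))   # plots scored relative to the trial mean
```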

    Applications of remote sensing in agriculture via unmanned aerial systems and satellites

    Doctor of Philosophy, Department of Agronomy, Ignacio Ciampitti. The adoption of remote sensing (RS) in agriculture has mainly been aimed at making inferences about biological processes in a scalable manner over space and time. In this context, this work first explores two non-traditional approaches for rapid derivation of plant performance under field conditions; both focus on extracting plant metrics by exploiting the high spatial resolution of Unmanned Aerial Systems (UAS). Second, we investigate the spatio-temporal dynamics of corn (Zea mays L.) phenology and yield in the Corn Belt region utilizing high-temporal-resolution satellite imagery. To evaluate the impact of adopting RS for deriving plant/crop performance, the following objectives were established: i) investigate the implementation of digital aerial photogrammetry to derive plant metrics (plant height and biomass) in corn; ii) implement and test a methodology for detecting and counting corn plants via very high spatial resolution imagery in the context of precision agriculture; iii) derive key phenological metrics of corn via high temporal resolution satellite imagery and identify links between the derived metrics and yield trends over the last 14 years for corn within the Corn Belt region. For the first objective, the main findings indicate that digital aerial photogrammetry can be utilized to derive plant height and assist in plant biomass estimation; results also suggest that plant biomass predictability increases significantly when the aerial plant height estimate is integrated with ground-measured stem diameter. For the second objective, the implemented workflow demonstrates adequate performance in detecting and counting corn plants in the image; its robustness depends highly on the spatial resolution of the image, and limitations and future research paths are discussed. Lastly, for the third objective, the results show that, from a long-term perspective (14 years), an extended reproductive stage correlates significantly with high corn yield. Over a shorter period (the last 4 years), mainly characterized by optimal growth conditions, early-season green-up rate and late-season senescence rate positively describe the yield trend in the region; the significance of the variables changed according to the time span considered. When optimal growth conditions are met, modern hybrids can capitalize by increasing yield, primarily owing to a faster green-up rate before flowering, while senescence rate better describes yield under these conditions. The entire research project investigates opportunities and needs for integrating remote sensing into the agronomic inference process.
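
    A common way to derive plant height from digital aerial photogrammetry, and a plausible reading of the first objective (though not necessarily the dissertation's exact workflow), is a canopy height model computed as CHM = DSM − DTM:

```python
# Sketch only: canopy height model from photogrammetric rasters, with a
# per-plot upper-percentile summary as a robust plant-height estimate.
# The toy rasters below are assumptions, not data from the dissertation.
import numpy as np

def canopy_height_model(dsm, dtm):
    """Subtract bare-earth terrain (DTM) from the surface model (DSM)."""
    return np.clip(dsm - dtm, 0.0, None)      # clip negative noise to zero

def plot_height(chm, percentile=95):
    """Summarize a plot with an upper percentile of canopy height."""
    return float(np.percentile(chm, percentile))

# Toy rasters: flat terrain at 300 m elevation with ~1.8 m tall canopy patches.
dtm = np.full((40, 40), 300.0)
canopy = np.where(np.random.default_rng(3).random((40, 40)) < 0.3, 1.8, 0.05)
dsm = dtm + canopy
print(plot_height(canopy_height_model(dsm, dtm)))   # ≈ 1.8
```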

    High-throughput estimation of crop traits: A review of ground and aerial phenotyping platforms

    Crop yields need to be improved in a sustainable manner to meet the demands of the expected worldwide increase in population over the coming decades as well as the effects of anticipated climate change. Recently, genomics-assisted breeding has become a popular approach to food security; in this regard, the crop breeding community must better link the relationships between the phenotype and the genotype. While high-throughput genotyping is feasible at low cost, high-throughput crop phenotyping methods and data analytical capacities need to be improved. High-throughput phenotyping offers a powerful way to assess particular phenotypes in large-scale experiments, using high-tech sensors, advanced robotics, and image-processing systems to monitor and quantify plants in breeding nurseries and field experiments at multiple scales. In addition, new bioinformatics platforms are able to handle large-scale, multidimensional phenotypic datasets. Through the combined analysis of phenotyping and genotyping data, environmental responses and gene functions can now be dissected at unprecedented resolution. This will aid in finding solutions to the currently limited and incremental improvements in crop yields.