2,089 research outputs found

    DeepWheat: Estimating Phenotypic Traits from Crop Images with Deep Learning

    In this paper, we investigate estimating emergence and biomass traits from color images and elevation maps of wheat field plots. We employ a state-of-the-art deconvolutional network for segmentation and convolutional architectures, with residual and Inception-like layers, to estimate traits via high-dimensional nonlinear regression. Evaluation was performed on two different species of wheat, grown in field plots for an experimental plant breeding study. Our framework achieves satisfactory performance, with mean and standard deviation of absolute difference of 1.05 and 1.40 counts for emergence and 1.45 and 2.05 for biomass estimation. Our results for counting wheat plants from field images are better than the accuracy reported for the similar, but arguably less difficult, task of counting leaves from indoor images of rosette plants. Our results for biomass estimation, even with a very small dataset, improve upon all previously proposed approaches in the literature. Comment: WACV 2018 (code repository: https://github.com/p2irc/deepwheat_WACV-2018).
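
    The abstract above frames trait estimation as high-dimensional nonlinear regression on plot images. As an illustration only, and not the DeepWheat architecture (which combines a deconvolutional segmentation network with residual and Inception-like layers), the minimal PyTorch sketch below maps an RGB plot image to a single scalar trait value; all layer sizes and shapes are assumptions.

```python
# Hypothetical sketch, not the DeepWheat model: a small convolutional
# regressor that maps an RGB plot image to one trait value (e.g., a count).
import torch
import torch.nn as nn

class TraitRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),        # global average pooling over the feature map
        )
        self.head = nn.Linear(32, 1)        # regression head producing one trait estimate

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.head(x)

model = TraitRegressor()
dummy = torch.randn(4, 3, 224, 224)        # batch of 4 RGB plot images (assumed size)
print(model(dummy).shape)                  # torch.Size([4, 1])
```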

    Digital phenotyping and genotype-to-phenotype (G2P) models to predict complex traits in cereal crops

    The revolution in digital phenotyping combined with the new layers of omics and envirotyping tools offers great promise to improve selection and accelerate genetic gains for crop improvement. This chapter examines the latest methods involving digital phenotyping tools to predict complex traits in cereal crops. The chapter has two parts. In the first part, entitled “Digital phenotyping as a tool to support breeding programs”, the secondary phenotypes measured by high-throughput plant phenotyping that are potentially useful for breeding are reviewed. In the second part, “Implementing complex G2P models in breeding programs”, the integration of data from digital phenotyping into genotype-to-phenotype (G2P) models to improve the prediction of complex traits using genomic information is discussed. The current status of statistical models for incorporating secondary traits in univariate and multivariate models, as well as how to better handle longitudinal traits (for example, light interception, biomass accumulation, canopy height), is reviewed.
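
    To make the G2P idea concrete, the sketch below shows one way a digitally phenotyped secondary trait can enter prediction alongside genomic markers. It is a rough illustration under stated assumptions, not a model from the chapter: the data are synthetic, ridge regression stands in for genomic prediction, and canopy height is an assumed secondary trait.

```python
# Minimal sketch (assumed data and model): ridge regression as a stand-in for
# genomic prediction, with a digitally phenotyped secondary trait (canopy
# height) added as an extra predictor next to the marker matrix.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_lines, n_markers = 200, 500
X_markers = rng.choice([0, 1, 2], size=(n_lines, n_markers)).astype(float)  # SNP dosages
canopy_height = rng.normal(80, 10, n_lines)                                 # secondary trait (cm)
yield_kg = X_markers[:, :20].sum(axis=1) + 0.05 * canopy_height + rng.normal(0, 2, n_lines)

X_g = X_markers                                         # genotype only
X_gp = np.column_stack([X_markers, canopy_height])      # genotype + digital phenotype

for name, X in [("markers only", X_g), ("markers + canopy height", X_gp)]:
    r2 = cross_val_score(Ridge(alpha=1.0), X, yield_kg, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R2 = {r2:.2f}")
```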

    Assessing Within-Field Variation in Alfalfa Leaf Area Index Using UAV Visible Vegetation Indices

    This study examines the use of leaf area index (LAI) to inform variable-rate irrigation (VRI) for irrigated alfalfa (Medicago sativa). LAI is useful for predicting zone-specific evapotranspiration (ETc). One approach toward estimating LAI is to utilize the relationship between LAI and visible vegetation indices (VVIs) using unmanned aerial vehicle (UAV) imagery. This research has three objectives: (1) to measure and describe the within-field variation in LAI and canopy height for an irrigated alfalfa field, (2) to evaluate the relationships between the alfalfa LAI and various VVIs with and without field average canopy height, and (3) to use UAV images and field average canopy height to describe the within-field variation in LAI and the potential application to VRI. The study was conducted in 2021–2022 in Rexburg, Idaho. Over the course of the study, the measured LAI varied from 0.23 m² m⁻² to 11.28 m² m⁻² and canopy height varied from 6 cm to 65 cm. There was strong spatial clustering in the measured LAI, but the spatial patterns were dynamic between dates. Among the eleven VVIs evaluated, the four that combined green and red wavelengths but excluded blue wavelengths showed the most promise. For all VVIs, adding average canopy height to the multiple linear regression improved LAI prediction. The regression model using the modified green–red vegetation index (MGRVI) and canopy height (R2 = 0.93) was applied to describe the spatial variation in the LAI among VRI zones. There were significant (p < 0.05) but not practically meaningful differences in LAI among the zones.
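
    As a rough sketch of the kind of model the abstract describes (MGRVI plus field-average canopy height in a multiple linear regression for LAI), the following uses the standard MGRVI formula with made-up per-plot values; the numbers are illustrative, not the study's data.

```python
# Minimal sketch (hypothetical values): compute MGRVI from green/red reflectance
# and fit a multiple linear regression for LAI with canopy height as a second predictor.
import numpy as np
from sklearn.linear_model import LinearRegression

green = np.array([0.18, 0.22, 0.25, 0.30, 0.34])       # per-plot mean green reflectance (assumed)
red   = np.array([0.15, 0.14, 0.12, 0.10, 0.09])       # per-plot mean red reflectance (assumed)
height_cm = np.array([12.0, 20.0, 31.0, 45.0, 60.0])   # field-average canopy height
lai = np.array([0.8, 2.1, 4.0, 6.5, 9.8])              # measured LAI (m2 m-2), assumed

mgrvi = (green**2 - red**2) / (green**2 + red**2)      # modified green-red vegetation index

X = np.column_stack([mgrvi, height_cm])
model = LinearRegression().fit(X, lai)
print("R2 =", round(model.score(X, lai), 3))
print("predicted LAI:", model.predict(X).round(2))
```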

    Detection of homogeneous wheat areas using multi-temporal UAS images and ground truth data analyzed by cluster analysis

    Vegetation indices (VIs) obtained from an unmanned aerial system (UAS) are effective for monitoring quantitative and qualitative characteristics of vegetation cover. Nevertheless, the identification of agronomically homogeneous crop areas, each to be managed in a specific agronomic way, still needs improvement in precision farming. The aim of the study was to reduce the information gap on the detection of homogeneous wheat areas using multi-temporal remote sensing images and agronomic crop traits analyzed by cluster analysis. The images were acquired by an eBee UAS on a small plot of 720 m² at three different crop growth stages. High-resolution orthoimages (3.5 cm·pixel⁻¹) were generated by Pix4D and QGIS. At each growth stage, biometric ground-truth data and VIs (NDVI and SAVI) were clustered to detect homogeneous crop areas. At the tillering and anthesis stages, three significant homogeneous areas with low (L), medium (M), and high (H) VIs and agronomic values were identified for both indices. Yield-related traits (at harvest) and VIs (at anthesis) confirmed that the L and M areas, with agronomic constraints identified at anthesis, showed crop yield losses at harvest. Cluster analysis, using UAS and ground-truth data, proved to be a good strategy for identifying homogeneous wheat crop areas.
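
    The abstract does not state which clustering algorithm was applied, so the following is only an illustrative sketch using k-means on NDVI, SAVI, and one biometric trait to split observations into three homogeneous areas (L, M, H); the reflectance and biomass values are synthetic.

```python
# Minimal sketch (synthetic data, k-means assumed): cluster NDVI, SAVI, and
# biomass into three groups and report per-cluster means.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
nir = np.concatenate([rng.normal(m, 0.02, 30) for m in (0.35, 0.45, 0.55)])   # NIR reflectance
red = np.concatenate([rng.normal(m, 0.01, 30) for m in (0.12, 0.09, 0.06)])   # red reflectance
ndvi = (nir - red) / (nir + red)
savi = 1.5 * (nir - red) / (nir + red + 0.5)            # soil-adjusted index, L = 0.5
biomass = np.concatenate([rng.normal(m, 20, 30) for m in (200, 350, 500)])    # g m-2, assumed

X = StandardScaler().fit_transform(np.column_stack([ndvi, savi, biomass]))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for k in range(3):
    print(f"cluster {k}: mean NDVI = {ndvi[labels == k].mean():.2f}, "
          f"mean biomass = {biomass[labels == k].mean():.0f} g m-2")
```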

    Wheat Height Estimation Using LiDAR in Comparison to Ultrasonic Sensor and UAS

    As one of the key crop traits, plant height is traditionally evaluated manually, which can be slow, laborious and prone to error. Rapid development of remote and proximal sensing technologies in recent years allows plant height to be estimated in more objective and efficient fashions, while research regarding direct comparisons between different height measurement methods seems to be lagging. In this study, a ground-based multi-sensor phenotyping system equipped with ultrasonic sensors and light detection and ranging (LiDAR) was developed. Canopy heights of 100 wheat plots were estimated five times during a season by the ground phenotyping system and an unmanned aircraft system (UAS), and the results were compared to manual measurements. Overall, LiDAR provided the best results, with a root-mean-square error (RMSE) of 0.05 m and an R2 of 0.97. UAS obtained reasonable results with an RMSE of 0.09 m and an R2 of 0.91. Ultrasonic sensors did not perform well due to our static measurement style. In conclusion, we suggest that LiDAR and UAS are reliable alternative methods for wheat height evaluation.
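
    A minimal sketch of the general approach (not the study's processing pipeline): estimate plot canopy height from LiDAR returns as a high percentile of point heights above ground level, then score the estimates against manual measurements with RMSE and R2. The point cloud here is simulated.

```python
# Hypothetical sketch: per-plot canopy height from simulated LiDAR returns,
# compared to manual measurements with RMSE and R2.
import numpy as np
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
manual_heights = rng.uniform(0.3, 1.0, 20)              # m, one value per plot (assumed)

estimates = []
for h in manual_heights:
    ground = rng.normal(0.0, 0.01, 200)                 # simulated ground returns
    canopy = rng.normal(h, 0.03, 800)                   # simulated canopy returns
    z = np.concatenate([ground, canopy])
    est = np.percentile(z, 95) - np.percentile(z, 5)    # canopy top minus ground level
    estimates.append(est)

estimates = np.array(estimates)
rmse = np.sqrt(np.mean((estimates - manual_heights) ** 2))
print(f"RMSE = {rmse:.3f} m, R2 = {r2_score(manual_heights, estimates):.3f}")
```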

    Agricultural scene understanding

    The author has identified the following significant results. The LACIE field measurement data were radiometrically calibrated. Calibration enabled valid comparisons of measurements from different dates, sensors, and/or locations. Thermal band canopy results included: (1) Wind velocity had a significant influence on the overhead radiance temperature, and the effect was quantified. Biomass and soil temperatures, the temperature gradient, and canopy geometry were altered. (2) The temperature gradient was a function of wind velocity. (3) The temperature gradient of the wheat canopy was relatively constant during the day. (4) The laser technique provided good-quality geometric characterization.

    Estimation of cotton canopy parameters based on unmanned aerial vehicle (UAV) oblique photography

    Background: The technology of cotton defoliation is essential for mechanical cotton harvesting. Agricultural unmanned aerial vehicle (UAV) spraying has the advantages of low cost, high efficiency, and no mechanical damage to cotton, and it has been favored and widely used by cotton planters in China. However, low cotton defoliation rates and high impurity rates can result when the amount of defoliant to spray is unclear. The chemical rate recommendation and application should be based upon crop canopy volume rather than on land area. Plant height and leaf area index (LAI) are directly connected to plant canopy structure. Accurate dynamic monitoring of plant height and LAI provides important information for evaluating cotton growth and production. The traditional method of obtaining plant height and LAI is time-consuming and labor-intensive, and it is unrealistic to use it to map the temporal and spatial variation of plant height and LAI in large cotton fields. With the application of UAVs in agriculture, UAV remote sensing is currently regarded as an effective technology for monitoring and estimating plant height and LAI. Results: In this paper, we used UAV RGB photos to build dense point clouds to estimate cotton plant height and LAI following cotton defoliant spraying. The results indicate that the proposed method was able to dynamically monitor changes in the LAI of cotton at different times. At 3 days after defoliant spraying, the correlation between the plant height estimated from the constructed dense point cloud and the measured plant height was strong, with R2 and RMSE values of 0.962 and 0.913, respectively. At 10 days after defoliant spraying, the correlation had become weaker, with R2 and RMSE values of 0.018 and 0.027, respectively. Comparing the manually measured LAI with the LAI estimated from the dense point cloud, the R2 and RMSE were 0.872 and 0.132 at 3 days and 0.814 and 0.173 at 10 days after defoliant spraying, respectively. Conclusions: Dense point cloud construction based on UAV remote sensing is a potential alternative for plant height and LAI estimation. The accuracy of LAI estimation can be improved by considering both plant height and planting density.
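
    As an illustrative sketch of the general workflow (synthetic rasters, not the paper's data or software), plant height can be derived from a photogrammetric dense point cloud by subtracting a terrain model from the surface model and summarizing the resulting canopy height model per plot.

```python
# Minimal sketch (assumed synthetic rasters): canopy height model (CHM) from
# a digital surface model (DSM) minus a digital terrain model (DTM), with
# per-plot height taken as a high percentile of the CHM.
import numpy as np

rng = np.random.default_rng(3)
dtm = np.full((100, 100), 412.0) + rng.normal(0, 0.02, (100, 100))    # bare-ground elevation (m)
plant_height_true = 0.85                                              # assumed cotton height (m)
dsm = dtm + rng.normal(plant_height_true, 0.05, (100, 100)).clip(0)   # top-of-canopy surface

chm = dsm - dtm                                   # canopy height model
plot_height = np.percentile(chm, 95)              # robust per-plot height estimate
print(f"estimated plant height: {plot_height:.2f} m")
```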