
    Plant high-throughput phenotyping using photogrammetry and 3D modeling techniques

    Doctor of Philosophy, Agronomy. Kevin Price; Stephen M. Welch.
    Plant phenotyping has been studied for decades for understanding the relationship between plant genotype, phenotype, and the surrounding environment. Improved accuracy and efficiency in plant phenotyping is a critical factor in expediting plant breeding and the selection process. In the past, plant phenotypic traits were extracted using invasive and destructive sampling methods and manual measurements, which were time-consuming, labor-intensive, and cost-inefficient. More importantly, the accuracy and consistency of manual methods can be highly variable. In recent years, however, photogrammetry and 3D modeling techniques have been introduced to extract plant phenotypic traits, but no cost-efficient methods using these two techniques have yet been developed for large-scale plant phenotyping studies. High-throughput 3D modeling techniques in plant biology and agriculture are still in the developmental stages, but the temporal and spatial resolutions of these systems are believed to be well matched to many plant phenotyping needs. Such technology can support rapid phenotypic trait extraction and aid crop genotype selection, leading to improvements in crop yield. In this study, we introduce an automated high-throughput phenotyping pipeline using affordable imaging systems, image processing, and 3D reconstruction algorithms to build 2D mosaicked orthophotos and 3D plant models. Chamber-based and ground-level field implementations can be used to measure phenotypic traits such as leaf length, rosette area in 2D and 3D, plant nastic movement, and diurnal cycles. Our automated pipeline has cross-platform capabilities and a degree of instrument independence, making it suitable for various situations.
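As a minimal sketch of one trait mentioned in this abstract, projected rosette area can be derived from a segmented plant mask by counting plant pixels and scaling by the pixel footprint. The function name and the `gsd_mm` value below are illustrative assumptions, not details from the thesis:

```python
import numpy as np

def rosette_area_2d(mask, gsd_mm=0.5):
    """Projected rosette area (mm^2) from a binary plant mask.

    mask   -- 2D boolean array, True where a pixel belongs to the plant
    gsd_mm -- ground sampling distance: side length of one pixel in mm
              (hypothetical value; must be calibrated per camera setup)
    """
    # Each plant pixel covers gsd_mm x gsd_mm of the plane of the rosette.
    return int(np.count_nonzero(mask)) * gsd_mm ** 2

# Example: a 10x10 mask whose central 4x4 block is plant
mask = np.zeros((10, 10), dtype=bool)
mask[3:7, 3:7] = True
area = rosette_area_2d(mask, gsd_mm=1.0)  # 16 pixels * 1 mm^2 = 16.0 mm^2
```

In a real pipeline the mask would come from color or depth segmentation of the orthophoto rather than being constructed by hand.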

    Low-cost and automated phenotyping system “Phenomenon” for multi-sensor in situ monitoring in plant in vitro culture

    Background: The current development of sensor technologies towards ever more cost-effective and powerful systems is steadily increasing the application of low-cost sensors in different horticultural sectors. In plant in vitro culture, a fundamental technique for plant breeding and plant propagation, the majority of evaluation methods used to describe the performance of these cultures are based on destructive approaches, limiting data to unique endpoint measurements. Therefore, a non-destructive phenotyping system capable of automated, continuous and objective quantification of in vitro plant traits is desirable. Results: An automated low-cost multi-sensor system acquiring phenotypic data of plant in vitro cultures was developed and evaluated. Hardware and software components were selected to construct an xyz-scanning system with adequate accuracy for consistent data acquisition. Relevant plant growth predictors, such as the projected area of explants and the average canopy height, were determined employing multi-sensory imaging, and various developmental processes could be monitored and documented. Validation of the RGB image segmentation pipeline, which uses a random forest classifier, revealed very strong correlation with manual pixel annotation. Depth imaging of plant in vitro cultures by a laser distance sensor enabled the description of the dynamic behavior not only of the average canopy height and the maximum plant height, but also of the culture media height and volume. The projected plant area derived from depth data by a RANSAC (random sample consensus) segmentation approach matched the projected plant area from the RGB image-processing pipeline well. In addition, a successful proof of concept for in situ spectral fluorescence monitoring was achieved, and challenges of thermal imaging were documented. Potential use cases for the digital quantification of key performance parameters in research and commercial application are discussed.
Conclusion: The technical realization of “Phenomenon” allows phenotyping of plant in vitro cultures under highly challenging conditions and enables multi-sensory monitoring through closed vessels, ensuring the aseptic status of the cultures. Automated sensor application in plant tissue culture promises great potential for non-destructive growth analysis, enhancing commercial propagation as well as enabling research with novel digital parameters recorded over time.
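The RANSAC depth segmentation mentioned in this abstract can be illustrated with a minimal plane-fitting loop: fit the dominant plane (here standing in for the culture-media surface) and treat off-plane points as plant canopy. All names, thresholds, and the synthetic scene below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def ransac_plane(points, n_iters=200, thresh=1.0, rng=None):
    """Fit a plane to 3D points with a minimal RANSAC loop.

    Returns (normal, d, inlier_mask) for the plane n.p + d = 0;
    thresh is the inlier distance in the same units as the points.
    """
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:              # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model[0], best_model[1], best_inliers

# Synthetic scene: a flat media surface near z = 0 plus plant points above it
rng = np.random.default_rng(1)
media = np.c_[rng.uniform(0, 50, (200, 2)), rng.normal(0, 0.2, 200)]
plant = np.c_[rng.uniform(20, 30, (40, 2)), rng.uniform(5, 30, 40)]
cloud = np.vstack([media, plant])

n, d, inliers = ransac_plane(cloud, thresh=1.0)
plant_mask = ~inliers                            # off-plane points = canopy
avg_canopy_height = np.abs(cloud[plant_mask] @ n + d).mean()
```

The same decomposition yields both traits the abstract mentions: the inlier plane gives the media surface (height and, with the vessel geometry, volume), while distances of the remaining points to that plane give canopy height statistics.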

    Development and Evaluation of Unmanned Aerial Vehicles for High Throughput Phenotyping of Field-based Wheat Trials.

    Growing demands for increased global yields are driving researchers to develop improved crops capable of securing higher yields in the face of significant challenges, including climate change and competition for resources. However, the ability to measure favourable physical characteristics (phenotypes) of key crops in response to these challenges is limited. For crop breeders and researchers, the current ability to phenotype field-based experiments with sufficient precision, resolution and throughput is restricting any meaningful advances in crop development. This PhD thesis presents work focused on the development and evaluation of Unmanned Aerial Vehicles (UAVs) in combination with remote sensing technologies as a solution for improved phenotyping of field-based crop experiments. Chapter 2 first presents a review of specific target phenotypic traits within the categories of crop morphology and spectral reflectance, together with a critical review of current standard measurement protocols. After reviewing phenotypic traits, focus turns to UAVs and UAV-specific technologies suitable for crop phenotyping, including critical evaluation of both the strengths and current limitations associated with UAV methods and technologies, highlighting specific areas for improvement. Chapter 3 presents a published paper successfully developing and evaluating Structure from Motion photogrammetry for accurate (R2 ≥ 0.93, RMSE ≤ 0.077 m, and bias ≤ -0.064 m) and temporally consistent 3D reconstructions of wheat plot heights. The superior throughput achieved further facilitated measures of crop growth rate through the season, whilst very high spatial resolutions highlighted both the inter- and intra-plot variability in crop heights, something unachievable with traditional manual ruler methods.
Chapter 4 presents published work developing and evaluating modified Commercial ‘Off the Shelf’ (COTS) cameras for obtaining radiometrically calibrated imagery of canopy spectral reflectance. Specifically, development focussed on improving the application of these cameras under variable illumination conditions, via camera exposure, vignetting, and irradiance corrections. Validation of UAV-derived Normalised Difference Vegetation Index (NDVI) from the COTS cameras against a ground spectrometer (0.88 ≤ R2 ≤ 0.94) indicated successful calibration and correction of the cameras. The higher spatial resolution obtained from the COTS cameras facilitated assessment of the impact of background soil reflectance on derived mean NDVI measures of experimental plots, highlighting the effect of incomplete canopy on derived indices. Chapter 5 utilises the developed methods and cameras from Chapter 4 to assess the impact of nitrogen fertiliser application on the formation and senescence dynamics of canopy traits over multiple growing seasons. Quantification of changes in canopy reflectance, via NDVI, through three selected trends in the wheat growth cycle was used to assess the impact of nitrogen on these periods of growth. Results showed a consistent impact of zero nitrogen application on crop canopies within all three development phases. Additional results found statistically significant positive correlations between quantified phases and harvest metrics (e.g. final yield), with the greatest correlations occurring within the second (Full Canopy) and third (Senescence) phases. Chapter 6 focuses on evaluation of the financial costs and throughput associated with UAVs, with specific focus on comparison to conventional methods in a real-world phenotyping scenario.
A ‘cost throughput’ analysis based on real-world experiments at Rothamsted Research provided a quantitative assessment demonstrating both the financial savings (£4.11 saved per plot) and the superior throughput (229% faster) obtained from implementing a UAV-based phenotyping strategy for long-term phenotyping of field-based experiments. Overall, the methods and tools developed in this PhD thesis demonstrate that UAVs combined with appropriate remote sensing tools can replicate and even surpass the precision, accuracy, cost and throughput of current strategies.
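The effect of background soil on plot-mean NDVI discussed in this abstract can be shown with a small sketch that masks low-NDVI (soil) pixels before averaging. The function name and the threshold value are illustrative assumptions, not parameters from the thesis:

```python
import numpy as np

def plot_ndvi(nir, red, soil_thresh=0.2):
    """Mean NDVI of a plot, excluding pixels classified as background soil.

    nir, red    -- co-registered reflectance arrays for one plot (0..1)
    soil_thresh -- hypothetical NDVI cutoff below which a pixel is treated
                   as exposed soil and excluded from the plot mean
    """
    ndvi = (nir - red) / (nir + red + 1e-9)   # epsilon avoids divide-by-zero
    canopy = ndvi >= soil_thresh
    if not canopy.any():
        return float(ndvi.mean())              # bare plot: full-plot mean
    return float(ndvi[canopy].mean())

# Toy plot: top half is canopy (high NIR), bottom half exposed soil
nir = np.array([[0.60, 0.60], [0.25, 0.25]])
red = np.array([[0.10, 0.10], [0.20, 0.20]])
full = float(((nir - red) / (nir + red)).mean())  # soil drags the mean down
masked = plot_ndvi(nir, red)                      # canopy-only mean
```

With incomplete canopy, the unmasked mean mixes soil and vegetation signals; masking recovers a canopy-representative NDVI, which is the effect the thesis quantifies.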

    Generation of 360 Degree Point Cloud for Characterization of Morphological and Chemical Properties of Maize and Sorghum

    Recently, image-based high-throughput phenotyping methods have gained popularity in plant phenotyping. Imaging projects 3D space onto a 2D grid, causing the loss of depth information and thus making the retrieval of plant morphological traits challenging. In this study, LiDAR was used along with a turntable to generate a 360-degree point cloud of single plants. A LabVIEW program was developed to control and synchronize both devices. A data processing pipeline was built to recover the digital surface models of the plants. The system was tested with maize and sorghum plants to derive morphological properties including leaf area, leaf angle and leaf angular distribution. The results showed a high correlation between the manual measurements and the LiDAR measurements of leaf area (R2 > 0.91). Also, Structure from Motion (SfM) was used to generate 3D spectral point clouds of single plants at different narrow spectral bands, using 2D images acquired by moving the camera completely around the plants. Seven narrow-band (bandwidth of 10 nm) optical filters, with center wavelengths at 530 nm, 570 nm, 660 nm, 680 nm, 720 nm, 770 nm and 970 nm, were used to obtain the images for generating a spectral point cloud. The possibility of deriving the biochemical properties of the plants (nitrogen, phosphorus, potassium and moisture content) using the multispectral information from the 3D point cloud was tested through statistical modeling techniques. The results were optimistic and thus indicated the possibility of generating a 3D spectral point cloud for deriving both the morphological and biochemical properties of plants in the future. Advisor: Yufeng G
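The turntable acquisition described here amounts to undoing a known rotation per scan before merging into one plant-centric cloud. A sketch under the simplifying assumption that the turntable axis is the z axis through the origin (the real system must calibrate the axis location against the LiDAR):

```python
import numpy as np

def merge_turntable_scans(scans, angles_deg):
    """Merge fixed-sensor scans of a plant rotating on a turntable.

    scans      -- list of (N_i, 3) point arrays in the sensor frame
    angles_deg -- turntable angle at which each scan was captured
    Assumes the turntable axis is the z axis through the origin.
    """
    merged = []
    for pts, ang in zip(scans, angles_deg):
        t = np.deg2rad(ang)
        R = np.array([[np.cos(t), -np.sin(t), 0.0],
                      [np.sin(t),  np.cos(t), 0.0],
                      [0.0,        0.0,       1.0]])
        # For row vectors, pts @ R applies R^T, i.e. the inverse turntable
        # rotation, bringing every scan back into one shared plant frame.
        merged.append(pts @ R)
    return np.vstack(merged)

# Two views of the same physical point, 90 degrees apart, should coincide
p0 = np.array([[1.0, 0.0, 0.5]])    # point as seen at 0 degrees
p90 = np.array([[0.0, 1.0, 0.5]])   # same point after a +90 degree turn
cloud = merge_turntable_scans([p0, p90], [0, 90])
```

Once all views are merged, surface reconstruction and trait extraction (leaf area, leaf angle) operate on the unified cloud.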

    Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries

    Citation: Haghighattalab, A., Perez, L. G., Mondal, S., Singh, D., Schinstock, D., Rutkoski, J., . . . Poland, J. (2016). Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries. Plant Methods, 12, 15. https://doi.org/10.1186/s13007-016-0134-6
    Background: Low-cost unmanned aerial systems (UAS) have great potential for rapid proximal measurements of plants in agriculture. In the context of plant breeding and genetics, current approaches for phenotyping a large number of breeding lines under field conditions require substantial investments in time, cost, and labor. For field-based high-throughput phenotyping (HTP), UAS platforms can provide high-resolution measurements for small-plot research while enabling the rapid assessment of tens of thousands of field plots. The objective of this study was to complete a baseline assessment of the utility of UAS in assessing field trials as commonly implemented in wheat breeding programs. We developed a semi-automated image-processing pipeline to extract plot-level data from UAS imagery. The image dataset was processed using a photogrammetric pipeline based on image orientation and radiometric calibration to produce orthomosaic images. We also examined the relationships between vegetation indices (VIs) extracted from high-spatial-resolution multispectral imagery collected with two different UAS systems (an eBee Ag carrying a MultiSpec 4C camera, and an IRIS+ quadcopter carrying a modified NIR Canon S100) and ground-truth spectral data from a hand-held spectroradiometer. Results: We found good correlation between the VIs obtained from the UAS platforms and ground-truth measurements and observed high broad-sense heritability for the VIs. We determined that radiometric calibration methods developed for satellite imagery significantly improved the precision of VIs from the UAS.
We observed that VIs extracted from calibrated images of the Canon S100 had a significantly higher correlation to the spectroradiometer (r = 0.76) than VIs from the MultiSpec 4C camera (r = 0.64). Their correlation to spectroradiometer readings was as high as or higher than that of repeated measurements with the spectroradiometer itself. Conclusion: The approaches described here for UAS imaging and extraction of proximal sensing data enable collection of HTP measurements at the scale and with the precision needed for powerful selection tools in plant breeding. Low-cost UAS platforms have great potential for use as a selection tool in plant breeding programs. In terms of tool development, the pipeline developed in this study can be effectively employed for other UAS and also for other crops planted in breeding nurseries.
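The broad-sense heritability reported for the VIs is conventionally estimated from replicated plot data via one-way ANOVA variance components. A sketch of the classical entry-mean estimator with synthetic data (the paper's actual mixed-model setup may differ):

```python
import numpy as np

def broad_sense_heritability(values):
    """Entry-mean broad-sense heritability from a genotype x rep matrix.

    values -- (n_genotypes, n_reps) array of one trait (e.g. plot NDVI).
    Classical one-way ANOVA estimators:
        sigma2_e = MS_within
        sigma2_g = (MS_between - MS_within) / r
        H2       = sigma2_g / (sigma2_g + sigma2_e / r)
    """
    g, r = values.shape
    grand = values.mean()
    means = values.mean(axis=1)
    ms_between = r * ((means - grand) ** 2).sum() / (g - 1)
    ms_within = ((values - means[:, None]) ** 2).sum() / (g * (r - 1))
    sigma2_g = max((ms_between - ms_within) / r, 0.0)  # clip at zero
    sigma2_e = ms_within
    return sigma2_g / (sigma2_g + sigma2_e / r)

# Strong genotype signal plus small plot error -> heritability near 1
rng = np.random.default_rng(0)
genos = rng.normal(0.6, 0.10, size=(50, 1))   # true genotypic NDVI means
noise = rng.normal(0.0, 0.01, size=(50, 3))   # plot-level error, 3 reps
h2 = broad_sense_heritability(genos + noise)
```

High H2 for a VI means genotype differences, not plot noise, dominate the measurement, which is what makes the VI usable as a selection tool.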

    3D machine vision system for robotic weeding and plant phenotyping

    The need for chemical-free food is increasing, and so is the demand for a larger supply to feed the growing global population. An autonomous weeding system should be capable of differentiating crop plants from weeds to avoid contaminating crops with herbicide or damaging them with mechanical tools. For the plant genetics industry, automated high-throughput phenotyping technology is critical to profiling seedlings at a large scale to facilitate genomic research. This research applied 2D and 3D imaging techniques to develop an innovative crop plant recognition system and a 3D holographic plant phenotyping system. A 3D time-of-flight (ToF) camera was used to develop a crop plant recognition system for broccoli and soybean plants. The developed system overcame the previously unsolved problems caused by occluded canopy and illumination variation. Both 2D and 3D features were extracted and utilized for the plant recognition task. Broccoli and soybean recognition algorithms were developed based on the characteristics of the plants. In field experiments, detection rates of over 88.3% and 91.2% were achieved for broccoli and soybean plants, respectively. The detection algorithm also reached a speed of over 30 frames per second (fps), making it applicable to robotic weeding operations. Apart from applying 3D vision to plant recognition, a 3D-reconstruction-based phenotyping system was also developed for holographic 3D reconstruction and physical trait parameter estimation of corn plants. In this application, precise alignment of multiple 3D views is critical to the 3D reconstruction of a plant. Previously published research highlighted the need for high-throughput, high-accuracy, and low-cost 3D phenotyping systems capable of holographic plant reconstruction and characterization of plant-morphology-related traits.
This research contributed to the realization of such a system by integrating a low-cost 2D camera, a low-cost 3D ToF camera, and a chessboard-pattern beacon array to track the 3D camera's position and attitude, thus accomplishing precise 3D point cloud registration from multiple views. Specifically, algorithms for beacon target detection, camera pose tracking, and spatial relationship calibration between the 2D and 3D cameras were developed. The phenotypic data obtained by this novel 3D-reconstruction-based phenotyping system were validated against experimental data generated by the instrument and against manual measurements, showing that the system achieved a measurement accuracy of more than 90% in most cases, with an average processing time of less than five seconds per plant.
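Multi-view point cloud registration of the kind described here ultimately reduces to estimating a rigid transform between corresponded 3D point sets. A minimal Kabsch/SVD sketch of that building block (not the authors' beacon-based implementation):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    Kabsch/SVD solution for corresponded 3D point sets: the standard
    building block for registering point clouds from multiple views.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Recover a known rotation + translation from corresponded points
rng = np.random.default_rng(2)
src = rng.uniform(-1, 1, (30, 3))
theta = np.deg2rad(40)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.3, 2.0])
dst = src @ R_true.T + t_true

R, t = rigid_transform(src, dst)
err = np.abs(src @ R.T + t - dst).max()
```

In the described system, the tracked camera pose supplies the correspondences between views; the recovered (R, t) then stitches each ToF view into the common plant frame.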