    Bayesian Optimisation for Safe Navigation under Localisation Uncertainty

    In outdoor environments, mobile robots are required to navigate through terrain with varying characteristics, some of which might significantly affect the integrity of the platform. Ideally, the robot should be able to identify areas that are safe for navigation based on its own percepts about the environment while avoiding damage to itself. Bayesian optimisation (BO) has been successfully applied to the task of learning a model of terrain traversability while guiding the robot through more traversable areas. An issue, however, is that localisation uncertainty can end up guiding the robot to unsafe areas and distort the model being learnt. In this paper, we address this problem and present a novel method that allows BO to consider localisation uncertainty by applying a Gaussian process model for uncertain inputs as a prior. We evaluate the proposed method in simulation and in experiments with a real robot navigating over rough terrain and compare it against standard BO methods. (To appear in the proceedings of the 18th International Symposium on Robotics Research, ISRR 2017.)
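The Gaussian-process-under-uncertain-inputs idea above can be illustrated with a minimal sketch: for Gaussian noise on the inputs, the RBF kernel can be averaged analytically over that noise, which inflates the lengthscale, and a conservative next sample can then be chosen by maximising a lower confidence bound. The kernel form, noise values, and acquisition rule below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def expected_rbf(X1, X2, ell=1.0, sf2=1.0, sx2=0.05):
    # RBF kernel averaged analytically over independent Gaussian input noise
    # (variance sx2 per input): the squared lengthscale inflates by 2*sx2,
    # with a matching amplitude correction. An approximation, used as a sketch.
    d = X1.shape[1]
    ell2 = ell**2 + 2.0 * sx2
    scale = sf2 * (ell**2 / ell2) ** (d / 2.0)
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return scale * np.exp(-0.5 * sq / ell2)

def gp_posterior(Xtr, ytr, Xte, noise=1e-2, **kw):
    # Standard GP regression posterior mean/variance via Cholesky factorisation.
    K = expected_rbf(Xtr, Xtr, **kw) + noise * np.eye(len(Xtr))
    Ks = expected_rbf(Xte, Xtr, **kw)
    Kss = expected_rbf(Xte, Xte, **kw)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mu = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss - v.T @ v)
    return mu, var

# Toy 1D "traversability" observations at uncertain robot positions.
rng = np.random.default_rng(0)
Xtr = rng.uniform(0, 5, (20, 1))
ytr = np.sin(Xtr[:, 0]) + 0.1 * rng.standard_normal(20)
Xte = np.linspace(0, 5, 50)[:, None]
mu, var = gp_posterior(Xtr, ytr, Xte)

# Conservative, safety-aware choice: maximise a lower confidence bound,
# so the robot prefers regions predicted traversable even pessimistically.
next_x = Xte[int(np.argmax(mu - 2.0 * np.sqrt(np.maximum(var, 0.0))))]
```

The lower-confidence-bound acquisition is one simple way to encode safety preference; the paper's actual method builds the uncertain-input model into the BO prior rather than this toy loop.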

    Wheat Height Estimation Using LiDAR in Comparison to Ultrasonic Sensor and UAS

    As one of the key crop traits, plant height is traditionally evaluated manually, which can be slow, laborious and prone to error. Rapid development of remote and proximal sensing technologies in recent years allows plant height to be estimated in more objective and efficient fashions, while research regarding direct comparisons between different height measurement methods seems to be lagging. In this study, a ground-based multi-sensor phenotyping system equipped with ultrasonic sensors and light detection and ranging (LiDAR) was developed. Canopy heights of 100 wheat plots were estimated five times during a season by the ground phenotyping system and an unmanned aircraft system (UAS), and the results were compared to manual measurements. Overall, LiDAR provided the best results, with a root-mean-square error (RMSE) of 0.05 m and an R2 of 0.97. UAS obtained reasonable results with an RMSE of 0.09 m and an R2 of 0.91. Ultrasonic sensors did not perform well due to our static measurement style. In conclusion, we suggest LiDAR and UAS are reliable alternative methods for wheat height evaluation.
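The RMSE and R² figures quoted above are standard agreement metrics computed against the manual reference measurements; a minimal sketch, using hypothetical height values rather than the study's data:

```python
import numpy as np

def rmse(est, ref):
    # Root-mean-square error between estimated and reference values.
    est, ref = np.asarray(est, float), np.asarray(ref, float)
    return float(np.sqrt(np.mean((est - ref) ** 2)))

def r_squared(est, ref):
    # Coefficient of determination: 1 - residual SS / total SS.
    est, ref = np.asarray(est, float), np.asarray(ref, float)
    ss_res = np.sum((ref - est) ** 2)
    ss_tot = np.sum((ref - ref.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical plot heights (m): manual reference vs. LiDAR estimates.
manual = [0.45, 0.62, 0.78, 0.91, 1.02]
lidar  = [0.48, 0.60, 0.80, 0.88, 1.05]
print(rmse(lidar, manual), r_squared(lidar, manual))
```

Note this R² is computed against the 1:1 agreement with the reference, not a fitted regression line; some studies report the latter, which can differ.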

    NU-Spidercam: A large-scale, cable-driven, integrated sensing and robotic system for advanced phenotyping, remote sensing, and agronomic research

    Field-based high throughput plant phenotyping has recently gained increased interest in the efforts to bridge the genotyping and phenotyping gap and accelerate plant breeding for crop improvement. In this paper, we introduce a large-scale, integrated robotic cable-driven sensing system developed at the University of Nebraska for field phenotyping research. It is constructed to collect data from a 0.4 ha field. The system has a sensor payload of 30 kg and offers the flexibility to integrate user-defined sensing modules. Currently it integrates a four-band multispectral camera, a thermal infrared camera, a 3D scanning LiDAR, and a portable visible near-infrared spectrometer for plant measurements. Software is designed and developed for instrument control, task planning, and motion control, which enables precise and flexible phenotypic data collection at the plot level. The system also includes a variable-rate subsurface drip irrigation system to control water application rates, and an automated weather station to log environmental variables. The system has been in operation for the 2017 and 2018 growing seasons. We demonstrate that the system is reliable and robust, and that fully automated data collection is feasible. Sensor and image data are of high quality in comparison to the ground truth measurements, and capture various aspects of plant traits such as height, ground cover and spectral reflectance. We present two novel datasets enabled by the system, including a plot-level thermal infrared image time-series during a day, and the signal of solar-induced chlorophyll fluorescence from canopy reflectance. It is anticipated that the availability of this automated phenotyping system will benefit research in field phenotyping, remote sensing, agronomy, and related disciplines.

    Extrinsic Parameter Calibration for Line Scanning Cameras on Ground Vehicles with Navigation Systems Using a Calibration Pattern

    Line scanning cameras, which capture only a single line of pixels, have been increasingly used in ground-based mobile or robotic platforms. In applications where it is advantageous to directly georeference the camera data to world coordinates, an accurate estimate of the camera's 6D pose is required. This paper focuses on the common case where a mobile platform is equipped with a rigidly mounted line scanning camera, whose pose is unknown, and a navigation system providing vehicle body pose estimates. We propose a novel method that estimates the camera's pose relative to the navigation system. The approach involves imaging and manually labelling a calibration pattern with distinctly identifiable points, triangulating these points from camera and navigation system data and reprojecting them in order to compute a likelihood, which is maximised to estimate the 6D camera pose. Additionally, a Markov Chain Monte Carlo (MCMC) algorithm is used to estimate the uncertainty of the offset. Tested on two different platforms, the method was able to estimate the pose to within 0.06 m / 1.05° and 0.18 m / 2.39°. We also propose several approaches to displaying and interpreting the 6D results in a human-readable way. (Published in MDPI Sensors, 30 October 201)
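The reprojection-likelihood idea can be sketched with a toy axis-aligned pinhole camera and a grid search over a single translation component. The camera model, noise level, point layout, and one-dimensional search below are all illustrative assumptions; the paper estimates the full 6D pose and uses MCMC for uncertainty, which this sketch does not reproduce.

```python
import numpy as np

def project(points_w, t_cam, f=500.0):
    # Hypothetical pinhole camera translated by t_cam, axes aligned with world.
    pc = points_w - t_cam
    return f * pc[:, :2] / pc[:, 2:3]

def log_likelihood(t_cam, points_w, obs_px, sigma=1.0):
    # Gaussian log-likelihood of the pixel reprojection residuals,
    # mirroring the "reproject and score" step described in the abstract.
    r = project(points_w, t_cam) - obs_px
    return -0.5 * np.sum(r ** 2) / sigma ** 2

rng = np.random.default_rng(1)
# Synthetic calibration-pattern points in front of the camera (z in 4..8 m).
pts = rng.uniform([-1, -1, 4], [1, 1, 8], (30, 3))
true_t = np.array([0.10, -0.05, 0.0])
obs = project(pts, true_t) + 0.5 * rng.standard_normal((30, 2))

# Coarse grid search over the x-offset only, holding y/z at truth.
xs = np.linspace(-0.2, 0.4, 61)
lls = [log_likelihood(np.array([x, -0.05, 0.0]), pts, obs) for x in xs]
best_x = float(xs[int(np.argmax(lls))])
```

In practice one would maximise over all six pose parameters with a proper optimiser (and sample the posterior for uncertainty), but the grid makes the likelihood-peak intuition visible.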

    Review: New sensors and data-driven approaches—A path to next generation phenomics

    At the 4th International Plant Phenotyping Symposium meeting of the International Plant Phenotyping Network (IPPN) in 2016 at CIMMYT in Mexico, a workshop was convened to consider ways forward with sensors for phenotyping. The increasing number of field applications provides new challenges and requires specialised solutions. There are many traits vital to plant growth and development that demand phenotyping approaches that are still at early stages of development or elude current capabilities. Further, there is growing interest in low-cost sensor solutions, and mobile platforms that can be transported to the experiments, rather than the experiment coming to the platform. Various types of sensors are required to address diverse needs with respect to targets, precision and ease of operation and readout. Converting data into knowledge, and ensuring that those data (and the appropriate metadata) are stored in such a way that they will be sensible and available to others now and for future analysis is also vital. Here we are proposing mechanisms for “next generation phenomics” based on our learning in the past decade, current practice and discussions at the IPPN Symposium, to encourage further thinking and collaboration by plant scientists, physicists and engineering experts.

    High-Throughput Field Imaging and Basic Image Analysis in a Wheat Breeding Programme

    Visual assessment of colour-based traits plays a key role within field-crop breeding programmes, though the process is subjective and time-consuming. Digital image analysis has previously been investigated as an objective alternative to visual assessment for a limited number of traits, showing suitability and slight improvement to throughput over visual assessment. However, easily adoptable, field-based high-throughput methods are still lacking. The aim of the current study was to produce a high-throughput digital imaging and analysis pipeline for the assessment of colour-based traits within a wheat breeding programme. This was achieved through the steps of (i) a proof-of-concept study demonstrating basic image analysis methods in a greenhouse, (ii) application of these methods to field trials using hand-held imaging, and (iii) developing a field-based high-throughput imaging infrastructure for data collection. The proof of concept study showed a strong correlation (r = 0.95) between visual and digital assessments of wheat physiological yellowing (PY) in a greenhouse environment, with both scores having similar heritability (H2 = 0.85 and 0.76, respectively). Digital assessment of hand-held field images showed strong correlations to visual scores for PY (r = 0.61 and 0.78), senescence (r = 0.74 and 0.75) and Septoria tritici blotch (STB; r = 0.76), with greater heritability of digital scores, excluding STB. Development of the high-throughput imaging infrastructure allowed for images of field plots to be collected at a rate of 7,400 plots per hour. Images of an advanced breeding trial collected with this system were analysed for canopy cover at two time-points, with digital scores correlating strongly to visual scores (r = 0.88 and 0.86) and having similar or greater heritability. This study details how high-throughput digital phenotyping can be applied to colour-based traits within field trials of a wheat breeding programme. 
It discusses the logistics of implementing such systems with minimal disruption to the programme, provides a detailed methodology for the basic image analysis methods utilized, and has potential for application to other field-crop breeding or research programmes.
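Digital scoring of colour-based traits such as canopy cover can be sketched with a simple excess-green threshold over plot images. This index and threshold are a common baseline for separating vegetation from soil, not necessarily the pipeline used in the study; the synthetic image and threshold value are assumptions.

```python
import numpy as np

def canopy_cover(rgb, thresh=20.0):
    # Fraction of "green" pixels via the excess-green index ExG = 2G - R - B;
    # pixels above the threshold are counted as canopy.
    rgb = rgb.astype(float)
    exg = 2.0 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]
    return float(np.mean(exg > thresh))

# Synthetic plot image: left half green canopy, right half brown soil.
img = np.zeros((40, 40, 3), np.uint8)
img[:, :20] = (40, 160, 50)    # canopy pixels (ExG = 230)
img[:, 20:] = (120, 90, 60)    # soil pixels (ExG = 0)
print(canopy_cover(img))       # → 0.5
```

Scores like this can then be correlated with visual assessments per plot, as the study does for physiological yellowing, senescence, and canopy cover.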

    Field-based Robot Phenotyping of Sorghum Plant Architecture using Stereo Vision

    Sorghum (Sorghum bicolor) is known as a major feedstock for biofuel production. To improve its biomass yield through genetic research, manually measuring yield component traits (e.g. plant height, stem diameter, leaf angle, leaf area, leaf number, and panicle size) in the field is the current best practice. However, such laborious and time-consuming tasks have become a bottleneck limiting experiment scale and data acquisition frequency. This paper presents a high-throughput field-based robotic phenotyping system which performed side-view stereo imaging for dense sorghum plants with a wide range of plant heights throughout the growing season. Our study demonstrated the suitability of stereo vision for field-based three-dimensional plant phenotyping when recent advances in stereo matching algorithms were incorporated. A robust data processing pipeline was developed to quantify the variations of morphological traits in plant architecture, which included plot-based plant height, plot-based plant width, convex hull volume, plant surface area, and stem diameter (semiautomated). These image-derived measurements were highly repeatable and showed high correlations with the in-field manual measurements. Meanwhile, manually collecting the same traits required a large amount of manpower and time compared to the robotic system. The results demonstrated that the proposed system could be a promising tool for large-scale field-based high-throughput plant phenotyping of bioenergy crops.
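One of the simpler image-derived traits above, plot-based plant height, can be sketched from a stereo-reconstructed point cloud by taking a high percentile of point heights above ground, which is more robust to spurious stereo matches than the maximum. The percentile choice and synthetic data below are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def plot_height(points_z, ground_z=0.0, pct=99):
    # Plot-level height as a high percentile of point heights above ground.
    # Using the 99th percentile (an assumed choice) suppresses stray
    # high points from stereo matching errors that the raw max would keep.
    return float(np.percentile(np.asarray(points_z, float) - ground_z, pct))

rng = np.random.default_rng(2)
canopy = rng.uniform(0.0, 1.8, 5000)      # synthetic canopy returns (m)
outliers = np.array([3.5, 4.0])           # a few spurious stereo matches
z = np.concatenate([canopy, outliers])
h = plot_height(z)
```

Other traits in the list (convex hull volume, surface area) follow the same pattern of reducing the point cloud to a scalar per plot, but need geometric primitives beyond a percentile.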