46 research outputs found

    Autonomous deployment and repair of a sensor network using an unmanned aerial vehicle

    We describe a sensor network deployment method using autonomous flying robots. Such networks are suitable for tasks such as large-scale environmental monitoring or for command and control in emergency situations. We describe in detail the algorithms used for deployment and for measuring network connectivity, and provide experimental data we collected from field trials. A particular focus is on determining gaps in connectivity of the deployed network and generating a plan for a second (repair) pass to complete the connectivity. This project is the result of a collaboration between three robotics labs (CSIRO, USC, and Dartmouth).
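
    The gap-finding step can be illustrated with a minimal sketch (not the project's code): given estimated node drop positions and a nominal radio range, build the connectivity graph and report its disconnected components, whose boundaries mark where the repair pass should add nodes. The positions, range, and helper function below are illustrative assumptions.

        # Minimal sketch, assuming node positions in metres and a nominal radio range.
        import math

        def connected_components(positions, radio_range):
            """Group node ids into components whose members are linked within radio_range."""
            ids = list(positions)
            adjacency = {i: [j for j in ids if j != i and
                             math.dist(positions[i], positions[j]) <= radio_range]
                         for i in ids}
            seen, components = set(), []
            for start in ids:
                if start in seen:
                    continue
                stack, comp = [start], set()
                while stack:                      # depth-first search over radio links
                    node = stack.pop()
                    if node in comp:
                        continue
                    comp.add(node)
                    stack.extend(adjacency[node])
                seen |= comp
                components.append(comp)
            return components

        nodes = {0: (0, 0), 1: (40, 0), 2: (80, 0), 3: (200, 0)}   # hypothetical drop points
        print(connected_components(nodes, radio_range=50.0))        # node 3 is isolated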

    The Ginninderra CH4 and CO2 release experiment: An evaluation of gas detection and quantification techniques

    A methane (CH4) and carbon dioxide (CO2) release experiment was held from April to June 2015 at the Ginninderra Controlled Release Facility in Canberra, Australia. The experiment provided an opportunity to compare different emission quantification techniques against a simulated CH4 and CO2 point source release, where the actual release rates were unknown to the participants. Eight quantification techniques were assessed: three tracer ratio techniques (two mobile); backwards Lagrangian stochastic modelling; forwards Lagrangian stochastic modelling; Lagrangian stochastic (LS) footprint modelling; and atmospheric tomography using point sensors and using integrated line sensors. The majority of CH4 estimates were within 20% of the actual CH4 release rate (5.8 g/min), with the tracer ratio technique providing the closest estimate to both the CH4 release rate and the CO2 release rate (100 g/min). Once the release rate was known, the majority of revised estimates were within 10% of the actual release rate. The study illustrates the power of measuring the emission rate using multiple simultaneous methods and obtaining an ensemble median or mean. An ensemble approach to estimating the CH4 emission rate proved successful, with the ensemble median estimate within 16% of the actual release rate for the blind release experiment and within 2% once the release rate was known. The release also provided an opportunity to assess the effectiveness of stationary and mobile ground and aerial CH4 detection technologies. Sensor detection limits and sampling rates were found to be significant limitations for CH4 and CO2 detection. A hyperspectral imager's capacity to image the CH4 release from 100 m, and a Boreal CH4 laser sensor's ability to track moving targets, suggest the future possibility of mapping gas plumes using a single laser and a mobile aerial reflector.
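
    The ensemble step amounts to taking a robust average of the independent estimates; the sketch below uses placeholder estimate values rather than the study's figures, with only the 5.8 g/min CH4 release rate taken from the text.

        # Minimal sketch of the ensemble idea: combine independent emission-rate
        # estimates with a median and compare against the known release rate.
        from statistics import median

        estimates_g_per_min = [5.1, 5.6, 6.2, 6.9, 4.8, 5.9]   # hypothetical CH4 estimates
        actual_g_per_min = 5.8                                  # CH4 release rate from the text

        ensemble = median(estimates_g_per_min)
        error_pct = 100.0 * abs(ensemble - actual_g_per_min) / actual_g_per_min
        print(f"ensemble median = {ensemble:.2f} g/min, error = {error_pct:.0f}%")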

    High dynamic range stereo vision for outdoor mobile robotics

    We present a technique for high dynamic range stereo for outdoor mobile robot applications. Stereo pairs are captured at a number of different exposures (exposure bracketing) and combined by projecting the 3D points into a common coordinate frame and building a 3D occupancy map. We present experimental results for static scenes with constant and dynamic lighting, as well as outdoor operation with variable and high-contrast lighting conditions.
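
    A minimal sketch of the accumulation step, assuming each exposure yields its own triangulated point cloud in a shared coordinate frame (the clouds, grid extent, and cell size are made up, and the occupancy map is shown in 2D for brevity):

        # Accumulate points from several exposures into one occupancy grid so that
        # structure visible only in the dark or bright exposures still contributes.
        import numpy as np

        def accumulate_occupancy(point_clouds, cell_size=0.2, grid_shape=(100, 100)):
            """Project (x, y, z) points from every exposure into one x-y hit-count grid."""
            grid = np.zeros(grid_shape, dtype=np.int32)
            for points in point_clouds:                 # one cloud per exposure setting
                ij = np.floor(points[:, :2] / cell_size).astype(int)
                valid = ((ij >= 0) & (ij < np.array(grid_shape))).all(axis=1)
                np.add.at(grid, tuple(ij[valid].T), 1)  # count hits per cell
            return grid

        dark = np.random.rand(500, 3) * 10.0            # placeholder clouds (metres)
        bright = np.random.rand(500, 3) * 10.0
        occupancy = accumulate_occupancy([dark, bright]) > 0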

    Design and control of an animatronic Aardvark

    This report describes the design, construction and programming of an animatronic aardvark that was built and successfully used in the filming of a wildlife documentary for National Geographic. The animatron was required to walk, move its head, and have as many facial movements as possible. These requirements were met by using hobby servos to produce the movements, and control was achieved with a Motorola-based microcontroller (the Handy Board). The proportions of the animatron were based on those of a real aardvark, at approximately 1/4 scale. The final product met all the requirements and was filmed on location interacting with a real aardvark.
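
    As a rough illustration of the servo-based actuation (not the Handy Board firmware, which ran in its own environment), the mapping from a commanded joint angle to a hobby-servo pulse width can be sketched as follows; the pulse limits and joint names are assumptions.

        # Hypothetical mapping from joint angle to the ~1-2 ms pulses that drive a hobby servo.
        def angle_to_pulse_us(angle_deg, min_us=1000, max_us=2000, travel_deg=90.0):
            """Convert a joint angle (degrees) to a servo pulse width (microseconds)."""
            angle_deg = max(0.0, min(travel_deg, angle_deg))
            return min_us + (max_us - min_us) * angle_deg / travel_deg

        for joint, angle in {"head_pan": 45.0, "jaw": 10.0, "ear_left": 80.0}.items():
            print(joint, f"{angle_to_pulse_us(angle):.0f} us")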

    Evaluation of machine vision techniques for aerial search of humans in maritime environments

    Searching for humans lost in vast stretches of ocean has always been a difficult task. In this paper, a range of machine vision approaches are investigated as candidate tools to mitigate the risk of human fatigue and complacency after long hours performing these kinds of search tasks. Our two-phase approach utilises point target detection followed by temporal tracking of these targets. Four different point target detection techniques and two tracking techniques are evaluated. We also evaluate the use of different colour spaces for target detection. This paper has a particular focus on Hidden Markov Model based tracking techniques, which seem best able to incorporate a priori knowledge about the maritime search problem to improve detection performance.
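
    The temporal-tracking idea can be pictured as a Viterbi decode over candidate target positions, with per-frame detection scores as emission likelihoods and a near-diagonal transition model for slow target motion; this is an illustrative reconstruction rather than the paper's implementation, and all values are placeholders.

        # Viterbi tracking sketch: emissions[t, s] is the detection score of state s in frame t.
        import numpy as np

        def viterbi_track(emissions, motion_sigma=1.0):
            """Return the most likely state (candidate position) per frame."""
            frames, states = emissions.shape
            idx = np.arange(states)
            trans = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / motion_sigma) ** 2)
            trans /= trans.sum(axis=1, keepdims=True)         # row-stochastic transitions
            log_delta = np.log(emissions[0] + 1e-9)
            back = np.zeros((frames, states), dtype=int)
            for t in range(1, frames):
                scores = log_delta[:, None] + np.log(trans)   # previous state x next state
                back[t] = scores.argmax(axis=0)
                log_delta = scores.max(axis=0) + np.log(emissions[t] + 1e-9)
            path = [int(log_delta.argmax())]
            for t in range(frames - 1, 0, -1):                # backtrack the best path
                path.append(int(back[t][path[-1]]))
            return path[::-1]

        scores = np.random.rand(20, 30)     # hypothetical per-frame detection responses
        print(viterbi_track(scores))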

    Improved maritime target tracker using colour fusion

    Searching for humans lost in vast stretches of ocean has always been a difficult task. This paper investigates a machine vision system that addresses this problem by exploiting the useful properties of alternate colour spaces. In particular, the paper investigates the fusion of colour information from the HSV, RGB, YCbCr and YIQ colour spaces within the emission matrix of a Hidden Markov Model tracker to enhance video-based maritime target detection. The system has shown promising results. The paper also identifies challenges that still need to be met.
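
    The fusion can be pictured as a weighted combination of per-colour-space detection scores feeding the tracker's emission matrix; the sketch below assumes equal weights and uses placeholder score maps.

        # Fuse detection score maps computed in several colour spaces into one emission map.
        import numpy as np

        def fused_emission(scores_by_space, weights=None):
            """scores_by_space: dict of colour space -> (H, W) score map in [0, 1]."""
            maps = np.stack(list(scores_by_space.values()))
            if weights is None:
                weights = np.ones(len(maps)) / len(maps)      # equal weighting by default
            return np.tensordot(weights, maps, axes=1)        # weighted average per pixel

        scores = {space: np.random.rand(48, 64)               # placeholder score maps
                  for space in ("RGB", "HSV", "YCbCr", "YIQ")}
        emission = fused_emission(scores)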

    Combined optic-flow and stereo-based navigation of urban canyons for a UAV

    We present a novel vision-based technique for navigating an Unmanned Aerial Vehicle (UAV) through urban canyons. Our technique relies on both optic flow and stereo vision information. We show that the combination of stereo and optic flow (stereo-flow) is more effective at navigating urban canyons than either technique alone. Optic flow from a pair of sideways-looking cameras is used to stay centered in a canyon and initiate turns at junctions, while stereo vision from a forward-facing stereo head is used to avoid obstacles to the front. The technique was tested in full on an autonomous tractor at CSIRO and in part on the USC autonomous helicopter. Experimental results are presented from these two robotic platforms operating in outdoor environments. We show that the autonomous tractor can navigate urban canyons using stereo-flow, and that the autonomous helicopter can turn away from obstacles to the side using optic flow. In addition, preliminary results show that a single pair of forward-facing fisheye cameras can be used for both stereo and optic flow. The center portions of the fisheye images are used for stereo, while flow is measured in the periphery of the images.
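
    The canyon-centering behaviour follows a flow-balance rule: steer away from the side with the larger optic-flow magnitude, and let the forward stereo range trigger an avoidance turn. A minimal sketch, with gains and thresholds that are assumptions rather than the values used on the tractor or helicopter:

        # Steering sketch: positive yaw rate means turn left.
        def steering_command(left_flow, right_flow, forward_range_m,
                             k_balance=0.5, min_clearance_m=8.0):
            """Return a yaw-rate command from side optic flow and forward stereo range."""
            if forward_range_m < min_clearance_m:                # stereo sees a frontal obstacle
                return 1.0 if left_flow < right_flow else -1.0   # turn toward the more open side
            # Larger flow on one side means that wall is closer; steer away from it.
            return k_balance * (right_flow - left_flow)

        print(steering_command(left_flow=2.0, right_flow=3.5, forward_range_m=30.0))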

    PTZ camera pose estimation by tracking a 3D target

    We present a technique for estimating the 6DOF pose of a PTZ camera by tracking a single moving target in the image with known 3D position. This is useful in situations where it is not practical to measure the camera pose directly. Our application domain is estimating the pose of a PTZ camera so that it can be used for automated GPS-based tracking and filming of UAV flight trials. We present results which show the technique is able to localize a PTZ camera after a short vision-tracked flight, and that the estimated pose is sufficiently accurate for the PTZ to then actively track a UAV based on GPS position data.
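
    The underlying geometric step, recovering camera pose from correspondences between known 3D target positions and their tracked image locations, can be sketched with OpenCV's solvePnP; the paper's PTZ estimator additionally accounts for pan and tilt, so this is only an illustrative stand-in with placeholder values.

        # Pose from 2D-3D correspondences (a generic stand-in, not the paper's estimator).
        import numpy as np
        import cv2

        object_points = np.array([[0, 0, 30], [10, 0, 32], [20, 5, 35],
                                  [30, 10, 33], [40, 12, 31], [50, 15, 34]], dtype=np.float64)
        image_points = np.array([[320, 240], [360, 238], [400, 230],
                                 [450, 225], [500, 222], [560, 218]], dtype=np.float64)
        camera_matrix = np.array([[800, 0, 320],
                                  [0, 800, 240],
                                  [0, 0, 1]], dtype=np.float64)
        dist_coeffs = np.zeros(5)

        ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
        if ok:
            rotation, _ = cv2.Rodrigues(rvec)       # world-to-camera rotation
            camera_position = -rotation.T @ tvec    # camera centre in world coordinates
            print(camera_position.ravel())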