Perception-aware receding horizon trajectory planning for multicopters with visual-inertial odometry
Visual-inertial odometry (VIO) is widely used for the state estimation of multicopters, but it may function poorly in environments with few visual features or during overly aggressive flights. In this work, we propose a perception-aware collision avoidance trajectory planner for multicopters that can be used with any feature-based VIO algorithm. Our approach flies the vehicle to a goal position at high speed, avoiding obstacles in an unknown stationary environment while maintaining good VIO state estimation accuracy. The proposed planner samples a group of minimum-jerk trajectories and finds the collision-free trajectories among them, which are then evaluated based on their speed toward the goal and their perception quality. Both the motion blur of features and their locations are considered in the perception quality. Our novel treatment of feature motion blur enables automatic adaptation of the trajectory's aggressiveness to environments with different light levels.
The best trajectory from the evaluation is tracked by the vehicle and is
updated in a receding horizon manner when new images are received from the
camera. Only generic assumptions about the VIO are made, so that the planner
may be used with various existing systems. The proposed method runs in real time on a small on-board embedded computer. We validated the effectiveness
of our proposed approach through experiments in both indoor and outdoor
environments. Compared to a perception-agnostic planner, the proposed planner
kept more features in the camera's view and made the flight less aggressive,
making the VIO more accurate. It also avoided the VIO failures that the perception-agnostic planner suffered. The ability of the proposed planner to fly through dense obstacles was also validated. The
experiment video can be found at https://youtu.be/qO3LZIrpwtQ.
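The sample-and-score loop described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the rest-to-rest minimum-jerk profile, the spherical obstacle model, the scoring weights, and the use of peak speed as a motion-blur proxy are all simplifying assumptions for the example.

```python
import numpy as np

def min_jerk(p0, p1, T, n=20):
    # Minimum-jerk rest-to-rest position profile between p0 and p1:
    # p(s) = p0 + (p1 - p0) * (10 s^3 - 15 s^4 + 6 s^5), with s = t / T
    s = np.linspace(0.0, 1.0, n)
    shape = 10 * s**3 - 15 * s**4 + 6 * s**5
    return p0 + np.outer(shape, (p1 - p0))

def collision_free(traj, obstacles, clearance=0.3):
    # obstacles: list of (center, radius) spheres (an assumed map format)
    for c, r in obstacles:
        if np.min(np.linalg.norm(traj - c, axis=1)) < r + clearance:
            return False
    return True

def score(traj, goal, T, w_speed=1.0, w_blur=0.5):
    # Progress toward the goal per unit time, minus a motion-blur proxy:
    # at a fixed exposure time, blur grows with speed, so penalize peak speed.
    progress = np.linalg.norm(traj[0] - goal) - np.linalg.norm(traj[-1] - goal)
    dt = T / (len(traj) - 1)
    peak_speed = np.max(np.linalg.norm(np.diff(traj, axis=0), axis=1)) / dt
    return w_speed * progress / T - w_blur * peak_speed

def plan(p0, goal, obstacles, endpoints, T=2.0):
    # Sample candidate trajectories, keep the collision-free ones,
    # and return the best-scoring trajectory (or None if all collide).
    best, best_s = None, -np.inf
    for p1 in endpoints:
        traj = min_jerk(p0, np.asarray(p1, float), T)
        if not collision_free(traj, obstacles):
            continue
        s = score(traj, goal, T)
        if s > best_s:
            best, best_s = traj, s
    return best
```

In a receding-horizon setting, `plan` would be re-run on every new camera frame from the current state.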
Brain over Brawn -- Using a Stereo Camera to Detect, Track and Intercept a Faster UAV by Reconstructing Its Trajectory
The work presented in this paper demonstrates our approach to intercepting a faster intruder UAV, inspired by the MBZIRC2020 Challenge 1. By leveraging knowledge of the shape of the intruder's trajectory, we are able to calculate the interception point.
Target tracking is based on image processing by a YOLOv3 Tiny convolutional neural network, combined with depth calculation using a gimbal-mounted ZED Mini stereo camera. We use RGB and depth data from the ZED Mini to extract the 3D position of the target, for which we devise histogram-of-depth based processing to reduce noise. The obtained 3D measurements of the target's position are used to calculate the position, orientation and size of a figure-eight shaped trajectory, which we approximate using the lemniscate of Bernoulli. Once the approximation is deemed sufficiently precise, as measured by the Hausdorff distance between the measurements and the approximation, an interception point is calculated to position the intercepting UAV right on the
path of the target. The proposed method, which has been significantly improved
based on the experience gathered during the MBZIRC competition, has been
validated in simulation and through field experiments. The results confirmed
that an efficient visual perception module, which extracts information related to the motion of the target UAV as a basis for the interception, has been
developed. The system is able to track and intercept a target that is 30% faster than the interceptor in the majority of simulation experiments. Tests in the unstructured environment yielded 9 out of 12 successful results. To be published in Field Robotics. The UAV-Eagle dataset is available at:
https://github.com/larics/UAV-Eagl
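The trajectory-approximation step can be sketched as a toy example (not the authors' code): sample a lemniscate of Bernoulli, compute a symmetric Hausdorff distance between the measurements and the curve, and grid-search the lemniscate scale. The parametric form, the planar simplification and the grid search over a single parameter are assumptions made for this illustration.

```python
import numpy as np

def lemniscate(a, n=200):
    # Lemniscate of Bernoulli in parametric form:
    # x = a cos t / (1 + sin^2 t), y = a sin t cos t / (1 + sin^2 t)
    t = np.linspace(0, 2 * np.pi, n, endpoint=False)
    d = 1 + np.sin(t)**2
    return np.stack([a * np.cos(t) / d, a * np.sin(t) * np.cos(t) / d], axis=1)

def hausdorff(A, B):
    # Symmetric Hausdorff distance between two finite point sets
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return max(D.min(axis=1).max(), D.min(axis=0).max())

def fit_a(points, candidates):
    # Grid-search the lemniscate half-width `a` that minimizes the
    # Hausdorff distance to the measured points; return (a, error)
    errs = [hausdorff(points, lemniscate(a)) for a in candidates]
    i = int(np.argmin(errs))
    return candidates[i], errs[i]
```

Once the returned error drops below a chosen precision threshold, the fitted curve can be queried for an interception point on the target's path.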
Low computational SLAM for an autonomous indoor aerial inspection vehicle
The past decade has seen an increase in the capability of small-scale Unmanned
Aerial Vehicle (UAV) systems, made possible through technological advancements
in battery, computing and sensor miniaturisation. This has opened a new
and rapidly growing branch of robotics research and has sparked the imagination of
industry, leading to new UAV-based services, from the inspection of power lines to
remote police surveillance.
Miniaturisation of UAVs has also made them small enough to be practically flown
indoors, for example for the inspection of elevated areas in hazardous or damaged
structures where the use of conventional ground-based robots is unsuitable. Sellafield
Ltd, a nuclear reprocessing facility in the U.K., has many buildings that require
frequent safety inspections. UAV inspections eliminate the current risk to personnel
of radiation exposure and other hazards in tall structures where scaffolding or hoists
are required.
This project focused on the development of a UAV for the novel application of
semi-autonomously navigating and inspecting these structures without the need for
personnel to enter the building. Development exposed a significant gap in knowledge
concerning indoor localisation, specifically Simultaneous Localisation and Mapping
(SLAM) for use on board UAVs. To lower the on-board processing requirements
of SLAM, other UAV research groups have employed techniques such as off-board
processing, reduced dimensionality or prior knowledge of the structure; these
techniques are not suitable for this application given the unknown nature of the
structures and the risk of radio shadows.
In this thesis, a novel localisation algorithm is proposed that enables real-time,
three-dimensional SLAM running solely on board a computationally constrained UAV
in heavily cluttered and unknown environments. The algorithm, based on the
Iterative Closest Point (ICP) method and utilising approximate nearest-neighbour
searches and point-cloud decimation to reduce the processing requirements, has
been successfully tested in environments similar to those specified by Sellafield Ltd.
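The two processing-reduction ideas named above can be illustrated with a toy 2D version (the thesis operates on 3D clouds on board): voxel-grid decimation thins the cloud before a plain point-to-point ICP alignment. The brute-force nearest-neighbour search below stands in for the approximate search used in the thesis; everything here is a simplifying sketch, not the thesis' algorithm.

```python
import numpy as np

def decimate(points, voxel):
    # Voxel-grid decimation: keep one point (the mean) per occupied voxel,
    # shrinking the cloud before the expensive matching step
    keys = np.floor(points / voxel).astype(int)
    cells = {}
    for k, p in zip(map(tuple, keys), points):
        cells.setdefault(k, []).append(p)
    return np.array([np.mean(v, axis=0) for v in cells.values()])

def icp_2d(src, dst, iters=20):
    # Minimal point-to-point ICP: nearest-neighbour matching followed by
    # a closed-form SVD (Kabsch) rigid alignment, repeated until done
    R, t = np.eye(2), np.zeros(2)
    cur = src.copy()
    for _ in range(iters):
        D = np.linalg.norm(cur[:, None] - dst[None, :], axis=2)
        nn = dst[D.argmin(axis=1)]            # matched points in dst
        mu_s, mu_d = cur.mean(0), nn.mean(0)
        H = (cur - mu_s).T @ (nn - mu_d)
        U, _, Vt = np.linalg.svd(H)
        Ri = Vt.T @ U.T
        if np.linalg.det(Ri) < 0:             # avoid reflections
            Vt[-1] *= -1
            Ri = Vt.T @ U.T
        ti = mu_d - Ri @ mu_s
        cur = cur @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti            # accumulate the transform
    return R, t
```

In a real pipeline, `decimate` would be applied to each incoming scan and an approximate nearest-neighbour structure (e.g. a k-d tree) would replace the dense distance matrix.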
Design of a Specialized UAV Platform for the Discharge of a Fire Extinguishing Capsule
This thesis deals with the design of an unmanned multirotor aircraft system specialized for autonomous detection and localization of fires from onboard sensors, and with the task of fast and effective fire extinguishment. The main part of this thesis focuses on the detection of fires in thermal images and their localization in the world using an onboard depth camera. The localized fires are used to optimally position the unmanned aircraft in order to effectively discharge an ampoule filled with a fire extinguishant from an onboard launcher. The developed methods are analyzed in detail and their performance is evaluated in simulation scenarios as well as in real-world experiments. The included quantitative and qualitative analysis verifies the feasibility and robustness of the system.
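A detect-then-localize pipeline of this kind (thermal thresholding followed by depth back-projection) can be sketched as below; the threshold value, the camera intrinsics and the simple centroid detector are illustrative assumptions, not the thesis' method.

```python
import numpy as np

def detect_fire(thermal, t_hot=150.0):
    # Threshold the thermal image and return the pixel centroid (u, v)
    # of the hot region, or None if nothing exceeds the threshold
    ys, xs = np.nonzero(thermal > t_hot)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def backproject(u, v, depth, fx, fy, cx, cy):
    # Pinhole back-projection of pixel (u, v) at range `depth` (metres)
    # into the camera frame, using assumed intrinsics fx, fy, cx, cy
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])
```

The resulting camera-frame point would then be transformed into the world frame via the UAV's estimated pose before computing the launcher alignment.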
Trajectory tracking control of an aerial manipulator in presence of disturbances and modeling uncertainties
Development and dynamic validation of control techniques for trajectory tracking of a robotic manipulator mounted on a UAV. Tracking performance is evaluated in the context of simulated dynamic disturbances on the manipulator base.
GNSS-Free Localization for UAVs in the Wild
Considering the accelerated development of Unmanned Aerial Vehicle (UAV) applications in both industrial and research scenarios, there is an increasing need to localize these aerial systems in non-urban environments using GNSS-free, vision-based methods. This project studies three different image feature matching techniques and proposes a final implementation of a vision-based localization algorithm that uses deep features to compute geographical coordinates of a UAV flying in the wild. The method is based on matching salient features of RGB photographs captured by the drone camera against sections of a pre-built map consisting of georeferenced open-source satellite images. Experimental results show that vision-based localization achieves accuracy comparable to traditional GNSS-based methods, which serve as ground truth.
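The map-matching idea can be sketched schematically as follows; random vectors stand in for real deep features, a plain Lowe-style ratio test stands in for the learned matcher, and the tile coordinates are invented for the example.

```python
import numpy as np

def match_count(desc_a, desc_b, ratio=0.8):
    # Lowe-style ratio test on L2 distances between two descriptor sets:
    # count descriptors in desc_a whose best match in desc_b is clearly
    # better than the second-best one
    D = np.linalg.norm(desc_a[:, None] - desc_b[None, :], axis=2)
    idx = np.argsort(D, axis=1)
    rows = np.arange(len(D))
    best, second = D[rows, idx[:, 0]], D[rows, idx[:, 1]]
    return int(np.sum(best < ratio * second))

def localize(query_desc, tiles):
    # tiles: list of (descriptors, (lat, lon)) for georeferenced map
    # sections; return the coordinates of the best-matching tile
    scores = [match_count(query_desc, d) for d, _ in tiles]
    return tiles[int(np.argmax(scores))][1]
```

A full system would refine the tile-level estimate to sub-tile coordinates from the geometry of the matched feature locations.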
An open-source autopilot and bio-inspired source localisation strategies for miniature blimps
An Uncrewed Aerial Vehicle (UAV) is an airborne vehicle that has no people on board and is therefore either controlled remotely via radio signals or operated autonomously. This thesis highlights the feasibility of using a bio-inspired miniature lighter-than-air UAV for indoor applications. While multicopters are the most used type of UAV, the smaller multicopters used in indoor applications have short flight times and are fragile, making them vulnerable to collisions. For tasks such as gas source localisation, where the agent is deployed to detect a gas plume, the amount of air disturbance they create is a disadvantage. Miniature blimps are another type of UAV, more suited to indoor applications due to their significantly higher collision tolerance. This thesis focuses on the development of a bio-inspired miniature blimp called FishBlimp. A blimp generally creates significantly less disturbance to the airflow, as it does not have to support its own weight; this also usually enables much longer flight times. Using fins instead of propellers for propulsion further reduces the air disturbance, as the air velocity is lower.
FishBlimp has four fins attached in different orientations along the perimeter of a helium-filled spherical envelope, enabling it to move along the cardinal axes and to yaw. Support for this new vehicle type was added to the open-source flight control firmware ArduPilot. Manual control and autonomous functions were developed for this platform to enable position-hold and velocity-control modes, implemented using a cascaded PID controller. Flight tests revealed that FishBlimp achieved position control with a maximum overshoot of about 0.28 m and a maximum flight speed of 0.3 m/s.
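The cascaded PID structure mentioned above can be sketched with a 1D point-mass stand-in for the blimp: an outer position loop produces a velocity setpoint, which an inner velocity loop turns into a fin force. The gains, mass and drag below are invented for the example; only the 0.3 m/s clamp mirrors the reported top speed.

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.i = 0.0
        self.prev = None

    def step(self, err):
        self.i += err * self.dt
        d = 0.0 if self.prev is None else (err - self.prev) / self.dt
        self.prev = err
        return self.kp * err + self.ki * self.i + self.kd * d

def hold_position(x0, target, steps=4000, dt=0.01):
    # Outer loop: position error -> velocity setpoint (clamped to 0.3 m/s)
    # Inner loop: velocity error -> force on a 1D point-mass blimp model
    pos_pid = PID(0.8, 0.0, 0.0, dt)
    vel_pid = PID(4.0, 1.0, 0.0, dt)
    x, v, m, drag = x0, 0.0, 0.5, 0.8   # made-up mass (kg) and linear drag
    for _ in range(steps):
        v_sp = max(-0.3, min(0.3, pos_pid.step(target - x)))
        f = vel_pid.step(v_sp - v)
        a = (f - drag * v) / m
        v += a * dt
        x += v * dt
    return x
```

The cascade keeps the velocity command bounded regardless of how far the blimp is from the target, which is the practical reason for splitting the loops.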
FishBlimp was then applied to source localisation, first as a single agent seeking a plume source using a modified Cast & Surge algorithm. FishBlimp was also developed in simulation to perform source localisation with multiple blimps, using a Particle Swarm Optimisation (PSO) algorithm. This enabled the blimps to work cooperatively to reduce the time taken to find the source, showing the potential of a platform like FishBlimp to carry out successful indoor source localisation missions.
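The multi-blimp search can be illustrated with textbook PSO on an invented Gaussian concentration field; neither the plume model nor the swarm parameters below come from the thesis.

```python
import numpy as np

def concentration(p, src):
    # Hypothetical Gaussian plume model peaking at the source location
    return np.exp(-np.sum((p - src)**2) / 2.0)

def pso_search(src, n=8, iters=120, seed=0):
    # Standard PSO: each "blimp" tracks its personal best reading and
    # the swarm's global best, and is pulled toward both
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n, 2))
    vel = np.zeros((n, 2))
    pbest = pos.copy()
    pval = np.array([concentration(p, src) for p in pos])
    gbest = pbest[pval.argmax()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and pull coefficients
    for _ in range(iters):
        r1, r2 = rng.random((n, 2)), rng.random((n, 2))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        for i, p in enumerate(pos):
            v = concentration(p, src)
            if v > pval[i]:
                pval[i], pbest[i] = v, p.copy()
        gbest = pbest[pval.argmax()].copy()
    return gbest
```

On real vehicles the position updates would be issued as velocity setpoints to each blimp's controller rather than applied instantaneously.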