17 research outputs found

    A General Purpose Configurable Controller for Indoors and Outdoors GPS-Denied Navigation for Multirotor Unmanned Aerial Vehicles

    This research on odometry-based GPS-denied navigation for multirotor Unmanned Aerial Vehicles focuses on the interaction between the odometry sensors and the navigation controller. More precisely, we present a controller architecture that allows specifying a speed-defined flight envelope in which the quality of the odometry measurements is guaranteed. The controller uses a simple point-mass kinematic model, described by a set of configurable parameters, to generate a complying speed plan. For experimental testing, we have used optical flow from a down-facing camera as the odometry measurement. This work extends prior research to outdoor environments using an AR Drone 2.0 vehicle, as it provides reliable optical flow over a wide range of flying conditions and floor textures. Our experiments show that the architecture is reliable for outdoor flight at altitudes below 9 m. A prior version of our code was used to compete in the International Micro Air Vehicle Conference and Flight Competition IMAV 2012. The code will be released as an open-source ROS stack hosted on GitHub.
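    As a rough illustration of the kind of speed plan such a point-mass kinematic model produces, the following Python sketch (not the authors' released ROS stack; function and parameter names are invented) generates a trapezoidal speed profile that respects a configurable maximum speed and acceleration along a straight segment:

```python
import math

def speed_plan(distance, v_max, a_max, dt=0.02):
    """Trapezoidal speed profile for a point-mass model: accelerate at a_max,
    cruise at v_max, then decelerate to stop after `distance` metres.
    Returns the speed sampled every dt seconds. Illustrative sketch only."""
    # Distance needed to brake from v_max; if it does not fit, use a triangular profile
    d_acc = v_max ** 2 / (2.0 * a_max)
    v_peak = v_max if 2.0 * d_acc <= distance else math.sqrt(a_max * distance)

    speeds, v, travelled = [], 0.0, 0.0
    while travelled < distance - 1e-6:
        d_brake = v ** 2 / (2.0 * a_max)
        if distance - travelled <= d_brake:
            v = max(v - a_max * dt, 0.0)      # decelerate toward the goal
        else:
            v = min(v + a_max * dt, v_peak)   # accelerate / cruise inside the envelope
        travelled += v * dt
        speeds.append(v)
        if v == 0.0 and travelled < distance - 1e-6:
            break  # numerical safety stop
    return speeds

# Example: 12 m segment, envelope limited to 2 m/s and 1 m/s^2
plan = speed_plan(12.0, v_max=2.0, a_max=1.0)
```

    The point of restricting speed and acceleration this way is that the optical-flow odometry stays within the operating range where its measurements remain trustworthy.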

    A System for the Design and Development of Vision-based Multi-robot Quadrotor Swarms

    This paper presents a cost-effective framework for the prototyping of vision-based quadrotor multi-robot systems, whose core characteristics are modularity, compatibility with different platforms, and being flight-proven. The framework is fully operative, which is shown in the paper through simulations and real flight tests of up to 5 drones, and was demonstrated through participation in an international micro aerial vehicle competition, where it was awarded First Prize in the Indoors Autonomy Challenge. The motivation of this framework is to allow developers to focus on their own research by decoupling the development of dependent modules, leading to more cost-effective progress in the project. The basic instance of the framework that we propose, which is flight-proven with the cost-efficient and reliable Parrot AR Drone 2.0 platform and is open source, includes several modules that can be reused and modified, such as a basic sequential mission planner, a basic 2D trajectory planner, an odometry state estimator, localization and mapping modules that obtain absolute position measurements using visual markers, a trajectory controller, and a visualization module.
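    For intuition on what such a marker-based localization module computes, here is a hypothetical Python/numpy sketch (not part of the published framework) of recovering an absolute vehicle pose by composing the known map pose of a visual marker with its pose detected in the camera frame:

```python
import numpy as np

def drone_world_pose(T_world_marker, T_cam_marker, T_body_cam):
    """Compose homogeneous transforms to recover the vehicle's absolute pose
    from one detected visual marker.
    T_world_marker : 4x4 pose of the marker in the map (known a priori)
    T_cam_marker   : 4x4 pose of the marker in the camera frame (from detection)
    T_body_cam     : 4x4 pose of the camera in the vehicle body frame (extrinsics)
    Returns T_world_body, the 4x4 pose of the vehicle in the map frame."""
    T_marker_cam = np.linalg.inv(T_cam_marker)          # invert the detection
    T_world_cam = T_world_marker @ T_marker_cam         # camera pose in the map
    T_world_body = T_world_cam @ np.linalg.inv(T_body_cam)
    return T_world_body
```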

    Multirotor UAS Sense and Avoid with Sensor Fusion

    In this thesis, the key concepts of independent autonomous Unmanned Aircraft Systems (UAS) are explored, including obstacle detection, dynamic obstacle state estimation, and avoidance strategy. This area is explored in pursuit of determining the viability of UAS Sense and Avoid (SAA) in static and dynamic operational environments. This exploration is driven by dynamic simulation and post-processing of real-world data. A sensor suite comprising a 3D Light Detection and Ranging (LIDAR) sensor, a visual camera, and a 9 Degree of Freedom (DOF) Inertial Measurement Unit (IMU) was found to be beneficial to autonomous UAS SAA in urban environments. Promising results are attributed to the broadening of available information about a dynamic or fixed obstacle via pixel-level LIDAR point-cloud fusion, and to the combination of inertial measurements and LIDAR point clouds for localization purposes. However, a significant amount of development is still required to optimize a data fusion method and an SAA guidance method.
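    The pixel-level LIDAR point-cloud fusion mentioned above essentially projects each 3-D point into the camera image so that range and appearance information can be associated per pixel. A minimal, hypothetical Python sketch of that projection step (known extrinsics and intrinsics assumed; not the thesis code):

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project 3-D LIDAR points into the camera image so each point can be
    paired with a pixel.
    points_lidar : (N, 3) array of points in the LIDAR frame
    T_cam_lidar  : 4x4 extrinsic transform from the LIDAR frame to the camera frame
    K            : 3x3 camera intrinsic matrix
    Returns (uv, mask): pixel coordinates of the kept points and the mask of
    points lying in front of the camera."""
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])   # homogeneous coordinates
    pts_cam = (T_cam_lidar @ homo.T).T[:, :3]            # points in the camera frame
    mask = pts_cam[:, 2] > 0.1                            # keep points ahead of the lens
    proj = (K @ pts_cam[mask].T).T
    uv = proj[:, :2] / proj[:, 2:3]                       # perspective divide
    return uv, mask
```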

    A survey of single and multi-UAV aerial manipulation

    Aerial manipulation has direct application prospects in the environment, construction, forestry, agriculture, and search and rescue. It can be used to pick and place objects, and hence for the transportation of goods, and to perform operations in environments inaccessible or unsafe for human workers. This paper is a survey of recent research in aerial manipulation. Aerial manipulation research has diverse aspects, which include the design of aerial manipulation platforms, manipulators, and grippers; the control of the aerial platform and manipulators; and the interaction of the aerial manipulator with the environment through forces and torques. In particular, the paper surveys the airborne platforms that can be used for aerial manipulation, including new aerial platforms with aerial manipulation capability. We also classify aerial grippers and aerial manipulators based on their designs and characteristics. Recent contributions regarding the control of the aerial manipulator platform are also discussed. The environment interaction of aerial manipulators is also surveyed, including the different strategies used for end-effector interaction with the environment, the application of force, the application of torque, and visual servoing. A recent and growing interest in multi-UAV collaborative aerial manipulation was also noted, and hence different strategies for collaborative aerial manipulation are surveyed, discussed, and critically analyzed. Key challenges regarding outdoor aerial manipulation and energy constraints in aerial manipulation are also discussed.

    Vision-based SLAM for the aerial robot ErleCopter

    The main objective of this work is the implementation of different monocular-vision SLAM (Simultaneous Localization and Mapping) algorithms on the aerial robot ErleCopter, using the ROS (Robot Operating System) software platform. To this end, a set of three algorithms widely used in the field of computer vision has been chosen: PTAM, ORB-SLAM, and LSD-SLAM, and a study of their performance on the ErleCopter is carried out. In addition, by fusing the information extracted by these algorithms with the information from other sensors present on the robotic platform, an EKF (Extended Kalman Filter) is implemented, so that the robot's location can be estimated more accurately in indoor environments where GPS is unavailable. The Gazebo robotic simulation platform is used to verify the operation of the system. Finally, tests are carried out with the real robot in order to observe and draw conclusions about the performance of these algorithms on the ErleCopter itself.
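    A minimal sketch of the fusion idea, assuming a constant-velocity model and treating the (metrically scaled) SLAM position as the only measurement; the thesis' filter is more complete (attitude, scale, and additional sensors), and all names here are illustrative:

```python
import numpy as np

class PoseFusionKF:
    """Constant-velocity filter that fuses position fixes from a monocular SLAM
    front end (PTAM / ORB-SLAM / LSD-SLAM, scaled to metres) with a high-rate
    prediction step. Illustrative linear special case of the thesis' EKF."""

    def __init__(self, q=0.5, r=0.05):
        self.x = np.zeros(6)              # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)
        self.q, self.r = q, r             # process / measurement noise levels

    def predict(self, dt):
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)        # position integrates velocity
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.q * dt * np.eye(6)

    def update(self, z_slam):
        H = np.hstack([np.eye(3), np.zeros((3, 3))])   # we observe position only
        R = self.r * np.eye(3)
        y = z_slam - H @ self.x                         # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```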

    A Fully-Autonomous Aerial Robot for Search and Rescue Applications in Indoor Environments using Learning-Based Techniques

    Search and Rescue (SAR) missions represent an important challenge in the robotics research field, as they usually involve highly variable scenarios that require a high level of autonomy and versatile decision-making capabilities. This challenge becomes even more relevant in the case of aerial robotic platforms owing to their limited payload and computational capabilities. In this paper, we present a fully-autonomous aerial robotic solution for executing complex SAR missions in unstructured indoor environments. The proposed system is based on the combination of a complete hardware configuration and a flexible system architecture which allows the execution of high-level missions in a fully unsupervised manner (i.e. without human intervention). In order to obtain flexible and versatile behaviors from the proposed aerial robot, several learning-based capabilities have been integrated for target recognition and interaction. The target recognition capability includes a supervised learning classifier based on a computationally-efficient Convolutional Neural Network (CNN) model trained for target/background classification, while the capability to interact with the target for rescue operations introduces a novel Image-Based Visual Servoing (IBVS) algorithm which integrates a recent deep reinforcement learning method named Deep Deterministic Policy Gradients (DDPG). In order to train the aerial robot for performing IBVS tasks, a reinforcement learning framework has been developed which integrates a deep reinforcement learning agent (e.g. DDPG) with a Gazebo-based simulator for aerial robotics. The proposed system has been validated in a wide range of simulation flights, using Gazebo and PX4 Software-In-The-Loop, and in real flights in cluttered indoor environments, demonstrating the versatility of the proposed system in complex SAR missions.
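    To make the IBVS formulation concrete, the following hypothetical Python sketch shows one way a learned actor (e.g. a DDPG policy) could be driven by a normalized image-space error built from the detected target's bounding box; it is an assumption-laden illustration, not the paper's implementation:

```python
import numpy as np

def ibvs_error(target_bbox, image_shape):
    """Image-based visual-servoing error: offset of the detected target's
    bounding-box centre from the image centre, plus an apparent-size term
    acting as a range proxy. Normalised so it can feed a learned policy.
    target_bbox : (x1, y1, x2, y2) in pixels, from the CNN detector
    image_shape : (height, width) of the image"""
    h, w = image_shape[:2]
    cx = (target_bbox[0] + target_bbox[2]) / 2.0
    cy = (target_bbox[1] + target_bbox[3]) / 2.0
    area = (target_bbox[2] - target_bbox[0]) * (target_bbox[3] - target_bbox[1])
    return np.array([
        2.0 * cx / w - 1.0,    # horizontal offset in [-1, 1]
        2.0 * cy / h - 1.0,    # vertical offset in [-1, 1]
        area / (w * h),        # apparent size (proxy for distance)
    ])

def servo_step(policy, bbox, image_shape):
    """One control step: the trained actor maps the IBVS error to a body-frame
    velocity command, e.g. [vx, vy, vz, yaw_rate]."""
    state = ibvs_error(bbox, image_shape)
    return policy(state)       # e.g. a DDPG actor network
```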

    Toward Autonomous Multi-Rotor Indoor Aerial Vehicles

    In this project, we worked to create an indoor autonomous micro aerial vehicle (MAV) using a multi-layer architecture with modular hardware and software components. We required that all computing be done onboard the vehicle during flight, so that no remote connection of any kind was necessary for successful control of the vehicle, even when flying autonomously. We utilized environmental sensors including ultrasonic sensors, light detection and ranging modules, and inertial measurement units to acquire the environment information necessary for autonomous flight. We used a three-layered system that combined a modular control architecture with distributed on-board computing to provide fully abstracted layers of control, allowing layers to be developed and tested individually. We fully implemented two layers, resulting in increasing autonomous functionality for the MAV, and produced a research platform for development of the third layer. Experimental results demonstrated implementation capabilities including autonomous hovering, obstacle avoidance, and flight data recording.
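    As an illustration of how a middle behavior layer in such an architecture might use ultrasonic ranges to moderate commands from the planning layer, the following Python sketch (purely hypothetical, not the project's code) scales the forward velocity command based on the nearest obstacle ahead:

```python
def reactive_avoidance(forward_cmd, ultrasonic_ranges, stop_dist=0.5, slow_dist=1.5):
    """Scale the forward velocity command from the planning layer using the
    closest ultrasonic range in the direction of travel.
    forward_cmd       : commanded forward speed in m/s
    ultrasonic_ranges : dict of ranges in metres, e.g. {'front': 1.2, ...}"""
    d = ultrasonic_ranges.get('front', float('inf'))
    if d <= stop_dist:
        return 0.0                                            # hold: obstacle too close
    if d <= slow_dist:
        return forward_cmd * (d - stop_dist) / (slow_dist - stop_dist)  # slow down
    return forward_cmd                                        # clear ahead: pass through
```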

    Vision-Based Control of Unmanned Aerial Vehicles for Automated Structural Monitoring and Geo-Structural Analysis of Civil Infrastructure Systems

    The emergence of wireless sensors capable of sensing, embedded computing, and wireless communication has provided an affordable means of monitoring large-scale civil infrastructure systems with ease. To date, the majority of existing monitoring systems, including those based on wireless sensors, are stationary, with measurement nodes installed without any intention of later relocation. Many monitoring applications involving structural and geotechnical systems require a high density of sensors to provide sufficient spatial resolution for the assessment of system performance. While wireless sensors have made high-density monitoring systems possible, an alternative approach is to empower the mobility of the sensors themselves, transforming wireless sensor networks (WSNs) into mobile sensor networks (MSNs). In doing so, many benefits are derived, including reducing the total number of sensors needed while introducing the ability to learn from the data obtained to improve the locations of installed sensors. One approach to achieving MSNs is to integrate unmanned aerial vehicles (UAVs) into the monitoring application. UAV-based MSNs have the potential to transform current monitoring practices by improving the speed and quality of the data collected while reducing overall system costs. The efforts of this study are chiefly focused on using autonomous UAVs to deploy, operate, and reconfigure MSNs in a fully autonomous manner for field monitoring of civil infrastructure systems. This study aims to overcome two main challenges pertaining to UAV-enabled wireless monitoring: the need for high-precision localization methods for outdoor UAV navigation, and facilitating modes of direct interaction between UAVs and their built or natural environments. A vision-aided UAV positioning algorithm is first introduced to augment traditional inertial sensing techniques and enhance the ability of UAVs to accurately localize themselves within a civil infrastructure system for placement of wireless sensors. Multi-resolution fiducial markers indicating sensor placement locations are applied to the surface of a structure, serving as navigation guides and precision landing targets for a UAV carrying a wireless sensor. Visual-inertial fusion is implemented via a discrete-time Kalman filter to further increase the robustness of the relative position estimation algorithm, resulting in localization accuracies of 10 cm or better. The precision landing of UAVs that enables MSN topology changes is validated on a simple beam, with the UAV-based MSN collecting ambient response data for extraction of the global mode shapes of the structure. The work also explores the integration of a magnetic gripper with a UAV to drop defined weights from an elevation to provide a high-energy seismic source for MSNs engaged in seismic monitoring applications. Leveraging tailored visual detection and precise position control techniques for UAVs, the work illustrates the ability of UAVs to deploy wireless geophones and to introduce an impulsive seismic source, in a repeated and autonomous fashion, for in situ shear wave velocity profiling using the spectral analysis of surface waves (SASW) method. The dispersion curve of the shear wave profile of the geotechnical system is shown to be nearly equal between the autonomous UAV-based MSN architecture and a traditional wired, manually operated SASW data collection system.
    The developments and proof-of-concept systems advanced in this study extend the body of knowledge of robot-deployed MSNs, with the hope of extending the capabilities of monitoring systems while eradicating the need for human intervention in their design and use.
    PhD, Civil Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/169980/1/zhh_1.pd
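    To illustrate the vision-aided positioning idea described above, the following hypothetical Python sketch recovers the relative position of a fiducial landing marker from a single pinhole-camera observation (marker size and camera intrinsics assumed known); the dissertation additionally fuses such measurements with inertial data in a discrete-time Kalman filter:

```python
import numpy as np

def marker_relative_position(uv_center, marker_px_width, marker_width_m, K):
    """Relative position of a fiducial landing marker with respect to the camera,
    from a pinhole model (illustrative of the vision aid, not the dissertation's code).
    uv_center       : (u, v) pixel coordinates of the marker centre
    marker_px_width : apparent width of the marker in pixels
    marker_width_m  : physical width of the marker in metres
    K               : 3x3 camera intrinsic matrix"""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    z = fx * marker_width_m / marker_px_width   # depth from apparent size
    x = (uv_center[0] - cx) * z / fx            # back-project the pixel offsets
    y = (uv_center[1] - cy) * z / fy
    return np.array([x, y, z])
```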

    Proceedings of the International Micro Air Vehicles Conference and Flight Competition 2017 (IMAV 2017)

    The IMAV 2017 conference was held at ISAE-SUPAERO, Toulouse, France, from Sept. 18 to Sept. 21, 2017. More than 250 participants from 30 different countries worldwide presented their latest research activities in the field of drones. 38 papers were presented during the conference, covering various topics such as Aerodynamics, Aeroacoustics, Propulsion, Autopilots, Sensors, Communication systems, Mission planning techniques, Artificial Intelligence, and Human-machine cooperation as applied to drones.