    Aerial-Ground collaborative sensing: Third-Person view for teleoperation

    Rapid deployment and operation are key requirements in time-critical applications such as Search and Rescue (SaR). Efficiently teleoperated ground robots can support first responders in such situations. However, first-person-view teleoperation is sub-optimal in difficult terrain, while a third-person perspective can drastically increase teleoperation performance. Here, we propose a Micro Aerial Vehicle (MAV)-based system that can autonomously provide a third-person perspective to ground robots. While our approach is based on local visual servoing, it further leverages the global localization of several ground robots to seamlessly transfer between them in GPS-denied environments. One MAV can thereby support multiple ground robots on demand. Furthermore, our system enables different visual detection regimes, enhanced operability, and return-home functionality. We evaluate our system in real-world SaR scenarios. Comment: Accepted for publication in the 2018 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR).
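    The behaviour this abstract describes can be sketched in a few lines: a local visual-servoing loop that keeps the tracked ground robot centred in the MAV's camera frame, plus a demand-based handoff that picks which ground robot to serve using their global localization. This is a minimal illustration under assumed names and gains, not the authors' implementation.

```python
# Hypothetical sketch of the MAV behaviour described above.
# Image size, gain, and the handoff policy are illustrative assumptions.

def servo_command(target_px, image_center=(320, 240), gain=0.005):
    """Proportional velocity command driving the tracked robot toward the image centre."""
    ex = target_px[0] - image_center[0]
    ey = target_px[1] - image_center[1]
    return (-gain * ex, -gain * ey)  # lateral / longitudinal velocity setpoints

def select_robot(requests, robot_poses, mav_pose):
    """Demand-based handoff: serve the requesting ground robot nearest the MAV,
    using globally localized 2D poses (possible even in GPS-denied settings
    when the robots localize against a shared map)."""
    if not requests:
        return None
    def dist(r):
        rx, ry = robot_poses[r]
        return ((rx - mav_pose[0]) ** 2 + (ry - mav_pose[1]) ** 2) ** 0.5
    return min(requests, key=dist)
```

    A real system would feed `servo_command` from a visual detector and gate the handoff on mission priorities rather than pure distance.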

    Effective Target Aware Visual Navigation for UAVs

    In this paper we propose an effective vision-based navigation method that allows a multirotor vehicle to reach a desired goal pose in the environment while constantly facing a target object or landmark. Standard techniques such as Position-Based Visual Servoing (PBVS) and Image-Based Visual Servoing (IBVS) in some cases (e.g., while the multirotor is performing fast maneuvers) cannot constantly maintain the line of sight with the target of interest. Instead, we compute the optimal trajectory by solving a non-linear optimization problem that minimizes the target re-projection error while meeting the UAV's dynamic constraints. The desired trajectory is then tracked by a real-time Non-linear Model Predictive Controller (NMPC), which implicitly allows the multirotor to satisfy both sets of constraints. We successfully evaluate the proposed approach in many real and simulated experiments, making an exhaustive comparison with a standard approach. Comment: Conference paper at the European Conference on Mobile Robotics (ECMR) 201
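    The core idea, minimizing a target-visibility error along the trajectory by non-linear optimization, can be illustrated with a toy one-dimensional version: for a fixed sequence of waypoints, choose the heading at each waypoint so that the bearing error to the target (a stand-in for the image re-projection error) is driven to zero by gradient descent. The function names, the fixed path, and the solver choice are all assumptions for illustration, not the paper's formulation.

```python
import math

def bearing_error(yaw, pos, target):
    """Signed, wrapped angle between the current heading and the target bearing."""
    desired = math.atan2(target[1] - pos[1], target[0] - pos[0])
    e = yaw - desired
    return math.atan2(math.sin(e), math.cos(e))  # wrap to [-pi, pi]

def optimise_yaws(path, target, iters=200, lr=0.5):
    """Gradient descent on 0.5 * error^2 per waypoint (gradient is the error itself)."""
    yaws = [0.0] * len(path)
    for _ in range(iters):
        yaws = [y - lr * bearing_error(y, p, target)
                for y, p in zip(yaws, path)]
    return yaws
```

    The paper's actual problem couples all decision variables through the UAV dynamics; this sketch only shows the "keep the target in view" cost term in isolation.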

    Outdoor Autonomous Landing of a Quadcopter on a Moving Platform using Off-board Computer Vision

    This paper presents a method that enables a quadcopter to perform autonomous landing on a moving platform using computer vision, together with the system implementation of the computer vision technique. Unlike other studies, the camera is mounted on the moving platform rather than on the quadcopter. Moreover, the computer vision system is tested outdoors, and results such as performance and accuracy are reported. In the stationary-platform test, 5 out of 10 landings fell within 30 cm of the center. In the moving-platform test, the maximum platform speed for autonomous landing is 2 m/s; hence, the methodology is shown to be feasible. Lastly, the advantages and limitations of the proposed computer vision technique are discussed.
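    An off-board pipeline of this kind can be sketched as: the platform camera measures the quadcopter's pixel offset from the pad centre, converts it to a metric offset with a pinhole approximation, and sends correction commands, descending only once the vehicle is centred. Focal length, the descend radius, and the command dictionary are illustrative assumptions, not the paper's parameters.

```python
# Hypothetical sketch of an off-board, platform-camera landing pipeline.

def pixel_to_metric(offset_px, altitude_m, focal_px=600.0):
    """Pinhole approximation: pixel offset -> metric offset at the given altitude."""
    return (offset_px[0] * altitude_m / focal_px,
            offset_px[1] * altitude_m / focal_px)

def landing_command(offset_px, altitude_m, descend_radius_m=0.3):
    """Velocity correction toward the pad centre; descend only when centred."""
    dx, dy = pixel_to_metric(offset_px, altitude_m)
    centred = (dx ** 2 + dy ** 2) ** 0.5 < descend_radius_m
    return {"vx": -dx, "vy": -dy, "descend": centred}
```

    Note the 30 cm figure from the abstract reappears here as the (assumed) gate on when descent is allowed.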

    Asymptotic Vision-Based Tracking Control of the Quadrotor Aerial Vehicle

    This paper proposes an image-based visual servoing (IBVS) controller for the 3D translational motion of a quadrotor unmanned aerial vehicle (UAV). The main purpose of this paper is to provide asymptotic stability for vision-based tracking control of the quadrotor in the presence of uncertainty in the dynamic model of the system. The paper also uses the optic flow of image features as velocity information to compensate for the unreliable linear velocity data measured by accelerometers. For this purpose, the mathematical model of the quadrotor is presented in terms of the optic flow of image features, which makes it possible to design a velocity-free IBVS controller while considering the dynamics of the robot. The image features are defined from a suitable combination of perspective image moments without using a model of the object, which allows the proposed controller to be applied in unknown environments. The controller is robust with respect to uncertainties in the translational dynamics of the system associated with target motion, image depth, and external disturbances. Simulation results and a comparison study are presented which demonstrate the effectiveness of the proposed approach.
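    Two ingredients of this abstract, image-moment features and their "flow" used as a velocity surrogate, can be shown numerically. The sketch below computes a simple translational feature vector (centroid plus an area-like central moment) from image points, and its finite-difference rate of change; the specific moment combination and names are assumptions, not the paper's exact feature set.

```python
# Toy illustration of image-moment features and their optic-flow-style rate.

def moment_features(points):
    """Feature vector s = (x_g, y_g, a): centroid plus an area-like moment."""
    n = len(points)
    xg = sum(p[0] for p in points) / n
    yg = sum(p[1] for p in points) / n
    mu20 = sum((p[0] - xg) ** 2 for p in points) / n
    mu02 = sum((p[1] - yg) ** 2 for p in points) / n
    return (xg, yg, mu20 + mu02)

def feature_flow(prev, curr, dt):
    """Finite-difference rate of the features: the velocity signal that can
    replace accelerometer-derived linear velocity in a velocity-free design."""
    return tuple((c - p) / dt for p, c in zip(prev, curr))
```

    A pure lateral shift of the scene moves the centroid features while leaving the area-like moment unchanged, which is what makes such combinations useful for decoupled translational control.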

    Real-time UAV Complex Missions Leveraging Self-Adaptive Controller with Elastic Structure

    Growing expectations of unmanned aerial vehicles (UAVs) push their operating environments into narrow spaces, where the systems may fly very close to an object and perform an interaction. This brings variation in UAV dynamics: the thrust and drag coefficients of the propellers might change under different proximity conditions. At the same time, UAVs may need to operate under external disturbances while following time-based trajectories. Under these challenging conditions, a standard controller with a fixed structure may not handle all missions, and its parameters may need adjustment for each case. With these motivations, a practical implementation and evaluation of an autonomous controller applied to a quadrotor UAV are proposed in this work. The self-adaptive controller is based on a composite control scheme combining sliding mode control (SMC) and evolving neuro-fuzzy control. The parameter vector of the neuro-fuzzy controller is updated adaptively based on the sliding surface of the SMC. The controller possesses a new elastic structure in which the number of fuzzy rules grows or is pruned based on a bias-variance balance. The interaction of the UAV is experimentally evaluated in real time, considering the ground effect, the ceiling effect, and flight through strong fan-generated wind while following time-based trajectories. Comment: 18 pages
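    The coupling this abstract describes, the SMC sliding surface driving both the control signal and the adaptation of learned parameters, can be reduced to a scalar sketch. Here a single learned weight stands in for the neuro-fuzzy parameter vector; the surface slope, adaptation gain, and control form are illustrative assumptions, not the authors' controller (and the rule growing/pruning is omitted).

```python
# Simplified scalar sketch of SMC-guided parameter adaptation.

def sliding_surface(e, e_dot, lam=2.0):
    """Classical first-order sliding surface s = e_dot + lambda * e."""
    return e_dot + lam * e

def adaptive_step(theta, s, phi, gamma=0.1):
    """Surface-driven update of a learned weight theta for regressor phi,
    mimicking the SMC-based update of the neuro-fuzzy parameter vector."""
    return theta + gamma * s * phi

def control(e, e_dot, theta, phi, k=1.0, lam=2.0):
    """Composite law: robust surface-feedback term plus the learned term."""
    s = sliding_surface(e, e_dot, lam)
    return -k * s - theta * phi
```

    In the elastic structure the paper describes, `theta`/`phi` would be vectors whose dimension itself changes as fuzzy rules are added or pruned.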