3 research outputs found

    Relative position-based collision avoidance system for swarming UAVs using multi-sensor fusion

    This paper presents the development of a quadrotor unmanned aerial vehicle (UAV) capable of quad-directional collision avoidance with obstacles in swarming applications through the implementation of relative position-based cascaded PID position and velocity controllers. A collision avoidance algorithm that decides evasive manoeuvres in two-dimensional flight by means of net error calculation was developed. Sensor fusion of ultrasonic (US) and infrared (IR) sensors was performed to obtain reliable relative position data of obstacles, which is then fed into a collision avoidance controller (CAC) to generate the necessary response in terms of attitude commands. Flight tests demonstrated the UAV's ability to successfully avoid collisions with obstacles and dummy non-flying UAVs at close range in its four primary detection directions.
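    The net-error idea described above can be illustrated with a minimal sketch: fuse the US and IR range readings for each direction, convert each fused range into a proximity error against a safety radius, and take the difference between opposing directions to decide the evasive axis. The fusion weights, safety distance, and function names here are hypothetical, not taken from the paper.

    ```python
    def fuse_range(us_m, ir_m, w_us=0.6, w_ir=0.4):
        # Weighted fusion of ultrasonic and infrared range readings (metres).
        # The weights are illustrative assumptions, not the paper's values.
        return w_us * us_m + w_ir * ir_m

    def net_error(front, back, left, right, safe_dist=1.5):
        # Proximity error per direction: positive when an obstacle is
        # inside the safety radius, zero otherwise.
        err = lambda d: max(0.0, safe_dist - d)
        # Net error along each body axis decides the evasive manoeuvre:
        # +ex means the obstacle ahead dominates (pitch away backwards),
        # +ey means the obstacle on the right dominates (roll away left).
        ex = err(front) - err(back)
        ey = err(right) - err(left)
        return ex, ey
    ```

    In a full cascaded-PID setup, `ex` and `ey` would feed the outer position loop, whose output becomes the attitude command for the inner loop.
    
    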

    Feature-based stereo vision relative positioning strategy for formation control of unmanned aerial vehicles

    As inspired by birds flying in flocks, for which vision is one of the most critical components enabling them to respond to their neighbours' motion, this paper introduces a novel approach to developing a Vision System as the primary sensor for relative positioning in flight formation in a Leader-Follower scenario. To use the system in real time and on board unmanned aerial vehicles (UAVs) with up to 1.5 kilograms of payload capacity, a few computing platforms are reviewed and evaluated. The study shows that the NVIDIA Jetson TX1 is the best-suited platform for this project. In addition, several different techniques and approaches for developing the algorithm are discussed. As per the system requirements and the conducted study, the algorithm developed for this Vision System is based on a Tracking and On-Line Machine Learning approach. Flight tests were performed to check the accuracy and reliability of the system, and the results indicate a minimum accuracy of 83% for the vision system against ground-truth data.
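    The core geometry behind stereo-vision relative positioning can be sketched briefly: once a feature on the leader is matched in the left and right images, its disparity yields depth, and the pixel offsets yield the lateral and vertical components. The calibration values used in the test below (focal length, baseline, principal point) are hypothetical examples, not the paper's camera parameters.

    ```python
    def stereo_relative_position(u_left, u_right, v, f_px, baseline_m, cx, cy):
        # Relative position of a matched feature from a rectified stereo pair.
        # u_left, u_right: horizontal pixel coordinates in each image
        # v: vertical pixel coordinate; f_px: focal length in pixels
        # baseline_m: camera separation; (cx, cy): principal point
        d = u_left - u_right          # disparity in pixels
        if d <= 0:
            raise ValueError("non-positive disparity: feature at infinity or mismatched")
        Z = f_px * baseline_m / d     # depth along the optical axis
        X = (u_left - cx) * Z / f_px  # lateral offset
        Y = (v - cy) * Z / f_px       # vertical offset
        return X, Y, Z
    ```

    A tracker (as in the paper's Tracking and On-Line Machine Learning approach) would supply the matched feature coordinates each frame, keeping this computation cheap enough for an on-board platform such as the Jetson TX1.
    
    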

    Implementation of high-gain observer on low-cost fused IR-OS sensor embedded in UAV system

    This paper presents a discrete-time implementation of a high-gain observer (HGO) and an extended term to estimate velocity and acceleration states from position measured by a low-cost sensor installed on board an unmanned aerial vehicle (UAV). Owing to the low-cost sensor, the signal produced by the fused IR-OS is noisy, and additional filters are normally used to remove the noise. This study proposes an alternative to this standard and tedious procedure using the HGO. The discrete-time implementation of the HGO and its extended term is presented, and ground tests are conducted to verify the algorithm by inducing dynamic motion on the UAV platform embedded with the fused IR-OS sensor on board. A comparison study is conducted against standard numerical differentiation and ground-truth measurement by OptiTrack. The results show that the extended HGO (EHGO) can produce a velocity signal with the same quality as the differentiated signal from the fused IR-OS using a Kalman filter. The novelty of the HGO lies in its simplicity and its minimal parameter tuning.
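    A minimal sketch of a discrete-time second-order high-gain observer, of the general kind the abstract describes, can make the idea concrete: position and velocity estimates are driven by the measurement residual, with gains scaled by a small parameter epsilon. The gain values, epsilon, and function name below are illustrative assumptions, not the paper's tuned parameters, and the update uses a plain Euler step.

    ```python
    def hgo_step(x1, x2, y, dt, eps=0.05, a1=2.0, a2=1.0):
        # One Euler step of a second-order high-gain observer.
        # x1: position estimate, x2: velocity estimate, y: measured position.
        # Smaller eps gives faster convergence but amplifies measurement noise.
        e = y - x1                              # measurement residual
        x1_new = x1 + dt * (x2 + (a1 / eps) * e)
        x2_new = x2 + dt * (a2 / eps ** 2) * e
        return x1_new, x2_new
    ```

    Feeding the observer a ramp (constant-velocity) position signal shows the velocity estimate converging to the true slope without any explicit differentiation or filtering, which is the simplicity the paper highlights.
    
    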