58 research outputs found

    Free as a Bird: Event-Based Dynamic Sense-and-Avoid for Ornithopter Robot Flight

    Autonomous flight of flapping-wing robots is a major challenge for robot perception. Most previous sense-and-avoid works have studied obstacle avoidance for flapping-wing robots considering only static obstacles. This letter presents a fully onboard dynamic sense-and-avoid scheme for large-scale ornithopters using event cameras. These sensors trigger pixel information in response to changes of illumination in the scene, such as those produced by dynamic objects. The method performs event-by-event processing on low-cost hardware such as that carried onboard small aerial vehicles. The proposed scheme detects obstacles and evaluates possible collisions with the robot body. The onboard controller actuates over the horizontal and vertical tail deflections to execute the avoidance maneuver. The scheme is validated in both indoor and outdoor scenarios using obstacles of different shapes and sizes. To the best of the authors' knowledge, this is the first event-based method for dynamic obstacle avoidance in a flapping-wing robot. Funding: Consejo Europeo de Investigación (ERC) 788247; Comisión Europea - Proyecto AERIAL-CORE H2020-2019-871479; Ministerio de Universidades FPU19/0469
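
    To make the event-by-event flavour of such a pipeline concrete, the sketch below shows one possible (and deliberately simplified) detector: it keeps a short sliding window of events, tracks the spread of the active event cluster, and advises an avoidance maneuver when a crude expansion-based time-to-collision proxy drops below a threshold. This is an illustration under stated assumptions, not the paper's algorithm; all names, window sizes, and thresholds are invented for the example.

    # Minimal, illustrative sketch only (not the paper's algorithm).
    from collections import deque
    import math

    class DynamicObstacleDetector:
        def __init__(self, window_us=10_000, ttc_threshold_s=0.5, min_events=50):
            self.events = deque()              # recent (t_us, x, y) events
            self.window_us = window_us         # temporal window kept for detection
            self.ttc_threshold_s = ttc_threshold_s
            self.min_events = min_events
            self.prev_size = None
            self.prev_t_us = None

        def process_event(self, t_us, x, y):
            """Event-by-event update; returns True when avoidance is advised."""
            self.events.append((t_us, x, y))
            while self.events and t_us - self.events[0][0] > self.window_us:
                self.events.popleft()
            if len(self.events) < self.min_events:
                return False

            # Centroid and spread of the active event cluster (single-object assumption).
            xs = [e[1] for e in self.events]
            ys = [e[2] for e in self.events]
            cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
            size = math.sqrt(sum((px - cx) ** 2 + (py - cy) ** 2
                                 for px, py in zip(xs, ys)) / len(xs))

            advise = False
            if self.prev_size is not None and t_us > self.prev_t_us:
                dt = (t_us - self.prev_t_us) * 1e-6        # seconds
                growth = (size - self.prev_size) / dt      # px/s expansion rate
                if growth > 1e-3:
                    advise = (size / growth) < self.ttc_threshold_s
            self.prev_size, self.prev_t_us = size, t_us
            return advise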

    In-Flight Collision Avoidance Controller Based Only on OS4 Embedded Sensors

    The major goal of this research was the development and implementation of a control system able to avoid collisions during flight for a mini quadrotor helicopter, based only on its embedded sensors and without changing the environment. However, it is important to highlight that design aspects must be carefully considered in order to overcome hardware limitations and achieve control simplification. The controllers of a UAV (Unmanned Aerial Vehicle) deal with highly unstable dynamics and strong axis coupling. Furthermore, any additional embedded sensor increases the robot's total weight and therefore decreases its operating time. The best balance between embedded electronics and robot operating time is desired. This paper focuses not only on the development and implementation of a collision avoidance controller for a mini robotic helicopter using only its embedded sensors, but also on the mathematical model that was essential during the controller development phases. Based on this model, we developed a MATLAB/Simulink simulation tool that was fundamental for setting the controllers' parameters. This tool allowed us to simulate and improve the OS4 controllers in different modeled environments and to test different approaches. After that, the controllers were embedded in the real robot, and the results proved to be robust and feasible. In addition, the controller has the advantage of being compatible with future path planners that we are developing. Funding: CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior); CNPq (National Council for Scientific and Technological Development).
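
    The abstract stresses that a model-based simulator was used to set controller gains before flying the real robot. The sketch below is a minimal Python analogue of that workflow for a single roll axis: a rigid-body model is integrated with Euler steps and a small gain sweep picks the best-performing pair. The inertia, gains, time step, and sweep values are assumed illustration numbers, not OS4 parameters or the authors' tool.

    # Illustrative single-axis roll model; all numerical values are assumptions.
    def simulate_roll(kp=0.8, kd=0.3, ixx=7.5e-3, dt=0.002, t_end=3.0, phi_ref=0.3):
        phi, phi_dot = 0.0, 0.0                  # roll angle [rad] and rate [rad/s]
        history = []
        t = 0.0
        while t < t_end:
            torque = kp * (phi_ref - phi) - kd * phi_dot   # PD attitude law
            phi_ddot = torque / ixx                        # rigid-body roll dynamics
            phi_dot += phi_ddot * dt
            phi += phi_dot * dt
            history.append((t, phi))
            t += dt
        return history

    # Sweep a few gain pairs in simulation and keep the one with the smallest
    # final tracking error, mimicking simulator-assisted parameter tuning.
    best_gains = min(((kp, kd) for kp in (0.4, 0.8, 1.2) for kd in (0.1, 0.3, 0.5)),
                     key=lambda g: abs(simulate_roll(*g)[-1][1] - 0.3))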

    Towards Low-Latency High-Bandwidth Control of Quadrotors using Event Cameras

    Event cameras are a promising candidate for enabling high-speed vision-based control due to their low sensor latency and high temporal resolution. However, purely event-based feedback has yet to be used in the control of drones. In this work, we take a first step towards implementing low-latency high-bandwidth control of quadrotors using event cameras. In particular, this paper addresses the problem of one-dimensional attitude tracking using a dualcopter platform equipped with an event camera. The event-based state estimation consists of a modified Hough transform algorithm combined with a Kalman filter that outputs the roll angle and angular velocity of the dualcopter relative to a horizon marked by a black-and-white disk. The estimated state is processed by a proportional-derivative attitude control law that computes the rotor thrusts required to track the desired attitude. The proposed attitude tracking scheme shows promising results for event-camera-driven closed-loop control: the state estimator runs at an update rate of 1 kHz with a latency of 12 ms, enabling attitude tracking at speeds of over 1600 deg/s.
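
    As a rough illustration of the kind of proportional-derivative attitude law described above, which maps the estimated roll angle and angular velocity to rotor thrusts, the sketch below shows one possible single-axis formulation for a dualcopter. The gains, arm length, hover thrust, sign convention, and saturation limits are assumptions, not the paper's values.

    # Hypothetical PD roll controller for a two-rotor platform (assumed values).
    def pd_attitude_control(roll, roll_rate, roll_ref, kp=2.0, kd=0.4,
                            hover_thrust=2.5, arm=0.1, thrust_max=6.0):
        """Return (thrust_left, thrust_right) in newtons."""
        torque_cmd = kp * (roll_ref - roll) - kd * roll_rate   # desired roll torque
        differential = torque_cmd / (2.0 * arm)                # thrust offset per rotor
        # Assumed convention: positive torque is produced by more thrust on the right.
        t_left = min(max(hover_thrust - differential, 0.0), thrust_max)
        t_right = min(max(hover_thrust + differential, 0.0), thrust_max)
        return t_left, t_right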

    ColibriUAV: An Ultra-Fast, Energy-Efficient Neuromorphic Edge Processing UAV-Platform with Event-Based and Frame-Based Cameras

    Interest in dynamic vision sensor (DVS)-powered unmanned aerial vehicles (UAVs) is rising, especially due to the microsecond-level reaction time of the bio-inspired event sensor, which increases robustness and reduces the latency of perception tasks compared to an RGB camera. This work presents ColibriUAV, a UAV platform with both frame-based and event-based camera interfaces for efficient perception and near-sensor processing. The proposed platform is designed around Kraken, a novel low-power RISC-V System on Chip with two hardware accelerators targeting spiking neural networks and deep ternary neural networks. Kraken is capable of efficiently processing both event data from a DVS camera and frame data from an RGB camera. A key feature of Kraken is its integrated, dedicated interface with a DVS camera. This paper benchmarks the end-to-end latency and power efficiency of the neuromorphic and event-based UAV subsystem, demonstrating a state-of-the-art event-data throughput of 7200 frames of events per second at a power consumption of 10.7 mW, which is over 6.6 times faster and a hundred times less power-consuming than the widely used approach of reading data through the USB interface. The overall sensing and processing power consumption is below 50 mW, with latency in the milliseconds range, making the platform suitable for low-latency autonomous nano-drones.
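
    A quick back-of-the-envelope check using only the figures quoted above gives the energy spent per frame of events; the derived value is illustrative and not a number reported by the authors.

    # Derived from the abstract's reported figures; the per-frame energy is an
    # illustrative back-of-the-envelope number, not a value from the paper.
    power_w = 10.7e-3        # reported event-subsystem power consumption (10.7 mW)
    frames_per_s = 7200      # reported throughput in frames of events per second
    energy_per_frame_j = power_w / frames_per_s
    print(f"{energy_per_frame_j * 1e6:.2f} uJ per frame of events")  # ~1.49 uJ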

    Why fly blind? Event-based visual guidance for ornithopter robot flight

    The development of perception and control methods that allow bird-scale flapping-wing robots (a.k.a. ornithopters) to fly autonomously is an under-researched area. This paper presents a fully onboard event-based method for ornithopter robot visual guidance. The method uses event cameras to exploit their fast response and robustness against motion blur in order to feed the ornithopter control loop at high rates (100 Hz). The proposed scheme visually guides the robot using line features extracted in the event image plane and controls the flight by actuating over the horizontal and vertical tail deflections. It has been validated onboard a real ornithopter robot with real-time computation on low-cost hardware. The experimental evaluation includes sets of experiments with different maneuvers indoors and outdoors. Funding: Consejo Europeo de Investigación (ERC) 78824
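
    To give a flavour of line-feature-based guidance in the event image plane, the sketch below fits a single line to a short slice of event coordinates by least squares and maps its offset and tilt to two tail deflection commands. This is not the paper's pipeline: the sensor resolution, the single-line assumption, the gains, and the mapping from image errors to horizontal and vertical tail surfaces are all invented for the example.

    # Hypothetical line-feature guidance sketch (assumed geometry and gains).
    import numpy as np

    def guidance_from_events(event_xy, img_width=346, img_height=260,
                             k_offset=0.004, k_angle=0.8, max_deflection=0.35):
        """event_xy: (N, 2) array of event pixel coordinates from a short time slice.
        Returns assumed (horizontal_tail_cmd, vertical_tail_cmd) in radians."""
        x, y = event_xy[:, 0], event_xy[:, 1]
        # Least-squares fit y = a*x + b, assuming one dominant, roughly horizontal
        # line feature produces most of the events in the slice.
        a, b = np.polyfit(x, y, 1)
        # Vertical offset of the fitted line at the image centre, and its tilt.
        y_at_center = a * (img_width / 2.0) + b
        offset = y_at_center - img_height / 2.0
        tilt = np.arctan(a)
        # Assumed mapping: offset drives the horizontal tail (pitch), tilt drives
        # the vertical tail (yaw); both saturated to actuator limits.
        horizontal_cmd = float(np.clip(-k_offset * offset, -max_deflection, max_deflection))
        vertical_cmd = float(np.clip(-k_angle * tilt, -max_deflection, max_deflection))
        return horizontal_cmd, vertical_cmd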

    Design and control of quadrotors with application to autonomous flying

    This thesis is about the modelling, design and control of Miniature Flying Robots (MFR), with a focus on Vertical Take-Off and Landing (VTOL) systems and, specifically, micro quadrotors. It introduces a mathematical model for simulation and control of such systems. It then describes a design methodology for a miniature rotorcraft. The methodology is subsequently applied to design an autonomous quadrotor named OS4. Based on the mathematical model, linear and nonlinear control techniques are used to design and simulate various controllers throughout this work. The dynamic model and the simulator evolved from a simple set of equations, valid only for hovering, to a complex mathematical model with more realistic aerodynamic coefficients and sensor and actuator models. Two platforms were developed during this thesis. The first one is a quadrotor-like test-bench with off-board data processing and power supply, used to safely and easily test control strategies. The second one, OS4, is a highly integrated quadrotor with on-board data processing and power supply, equipped with all the sensors necessary for autonomous operation. Five different controllers were developed. The first one, based on Lyapunov theory, was applied to attitude control. The second and third controllers, based on PID and LQ techniques, were compared for attitude control. The fourth and fifth approaches use backstepping and sliding-mode concepts and are also applied to attitude control. Finally, backstepping is augmented with integral action and proposed as a single tool to design attitude, altitude and position controllers. This approach is validated through various flight experiments conducted on the OS4.
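
    For readers unfamiliar with the simplest of the compared controllers, the sketch below shows a generic single-axis PID attitude controller of the kind evaluated in such work. The class, gains, and limits are illustrative assumptions, not the thesis implementation.

    # Generic PID controller sketch (assumed gains and limits).
    class PID:
        def __init__(self, kp, ki, kd, out_limit):
            self.kp, self.ki, self.kd, self.out_limit = kp, ki, kd, out_limit
            self.integral = 0.0
            self.prev_error = None

        def update(self, setpoint, measurement, dt):
            error = setpoint - measurement
            self.integral += error * dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            out = self.kp * error + self.ki * self.integral + self.kd * derivative
            return max(-self.out_limit, min(self.out_limit, out))   # saturate output

    # Example use: one PID per attitude axis, run at a fixed rate (values assumed).
    roll_pid = PID(kp=1.2, ki=0.05, kd=0.3, out_limit=1.0)
    torque_cmd = roll_pid.update(setpoint=0.0, measurement=0.08, dt=0.01)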

    Master of Science

    This thesis provides details on the development of automatic collision avoidance for manually tele-operated unmanned aerial vehicles. We note that large portions of this work are also reprinted, with permission, from the 2014 IEEE International Conference on Robotics and Automation, "Automatic Collision Avoidance for Manually Tele-operated Unmanned Aerial Vehicles," by J. Israelsen, M. Beall, D. Bareiss, D. Stuart, E. Keeney, and J. van den Berg, © 2014 IEEE. We provide a method to aid the operator of unmanned aerial vehicles by automatically performing collision avoidance with obstacles in the environment. Our method allows the operator to focus on the overall motion of the vehicle rather than requiring the operator to perform collision avoidance. Whereas other existing systems override the operator's controls only as a last resort, our approach was developed such that the operator can rely on the automatic collision avoidance for maneuverability. Given the current operator control input, our approach continually predicts the future path of the vehicle. If a collision is predicted along that path, our algorithm minimally overrides the operator's control such that the vehicle does not collide with obstacles in the environment. This approach ensures safety while maintaining the original intent of the operator. We successfully implemented this approach in a simulated environment as well as on a physical quadrotor system in a laboratory environment. Our experiments show that, even when intentionally trying to do so, the operator failed to crash the vehicle into obstacles in the environment.
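
    To illustrate the predict-then-minimally-override idea in the abstract, the sketch below forward-simulates the operator's velocity command over a short horizon against spherical obstacles and, if a violation is predicted, nudges the command away from the obstacle. It is a simplified stand-in, not the thesis algorithm; the horizon, time step, margin, correction rule, and constant-velocity prediction are all assumptions.

    # Hypothetical predict-and-override sketch (assumed prediction and correction).
    import numpy as np

    def safe_command(pos, vel_cmd, obstacles, horizon=2.0, dt=0.1, margin=0.5, gain=1.5):
        """pos: (3,) current position; vel_cmd: (3,) operator velocity command;
        obstacles: list of (center (3,), radius). Returns a possibly modified command."""
        p = np.asarray(pos, dtype=float)
        v = np.asarray(vel_cmd, dtype=float)
        t = 0.0
        while t < horizon:
            p_pred = p + v * (t + dt)                    # constant-velocity prediction
            for center, radius in obstacles:
                offset = p_pred - np.asarray(center, dtype=float)
                dist = np.linalg.norm(offset)
                if dist < radius + margin:               # predicted violation
                    # Push the command along the obstacle normal, scaled by how
                    # deep the predicted violation would be.
                    push = (radius + margin - dist) * gain
                    return v + push * offset / max(dist, 1e-6)
            t += dt
        return v                                         # no predicted collision: pass through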

    High-Speed Event Camera Tracking

    Event cameras are bio-inspired sensors with reaction times on the order of microseconds. This property makes them appealing for highly dynamic computer vision applications. In this work, we explore the limits of this sensing technology and present an ultra-fast tracking algorithm able to estimate six-degree-of-freedom motion with dynamics over 25.8 g, at a throughput of 10 kHz, processing over a million events per second. Our method is capable of tracking either the camera's own motion or the motion of an object in front of it, using an error-state Kalman filter formulated in a Lie-theoretic sense. The method includes a robust mechanism for matching events with projected line segments, with very fast outlier rejection. Meticulous treatment of sparse matrices is applied to achieve real-time performance. Different motion models of varying complexity are considered for the sake of comparison and performance analysis.
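
    The core data-association step described above (matching each incoming event to a projected line segment, with a fast gate to reject outliers) can be sketched as below. The distance threshold, data layout, and brute-force search are assumptions for illustration; the full method additionally runs the Lie-theoretic error-state Kalman filter on the matched events.

    # Sketch of event-to-projected-segment matching with a distance gate.
    import numpy as np

    def match_event_to_segment(event_xy, segments, max_dist_px=3.0):
        """event_xy: (2,) pixel location; segments: list of ((2,), (2,)) endpoints.
        Returns the index of the matched segment, or None if rejected as an outlier."""
        e = np.asarray(event_xy, dtype=float)
        best_idx, best_dist = None, max_dist_px
        for idx, (p0, p1) in enumerate(segments):
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            d = p1 - p0
            seg_len2 = float(d @ d)
            # Project the event onto the segment and clamp to its endpoints.
            s = 0.0 if seg_len2 == 0.0 else float(np.clip((e - p0) @ d / seg_len2, 0.0, 1.0))
            dist = float(np.linalg.norm(e - (p0 + s * d)))
            if dist < best_dist:                         # keep the closest gated match
                best_idx, best_dist = idx, dist
        return best_idx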