    The Phoenix Drone: An Open-Source Dual-Rotor Tail-Sitter Platform for Research and Education

    In this paper, we introduce the Phoenix drone: the first completely open-source tail-sitter micro aerial vehicle (MAV) platform. The vehicle has a highly versatile dual-rotor design and is engineered to be low-cost and easily extended or modified. Our open-source release includes all of the design documents, software resources, and simulation tools needed to build and fly a high-performance tail-sitter for research and educational purposes. The drone has been developed for precision flight with a high degree of control authority. Our design methodology included extensive testing and characterization of the aerodynamic properties of the vehicle. The platform incorporates many off-the-shelf components and 3D-printed parts to keep the cost down. Nonetheless, the paper includes results from flight trials which demonstrate that the vehicle is capable of very stable hovering and accurate trajectory tracking. Our hope is that the open-source Phoenix reference design will be useful to both researchers and educators. In particular, the details in this paper and the available open-source materials should enable learners to gain an understanding of aerodynamics, flight control, state estimation, software design, and simulation, while experimenting with a unique aerial robot.
    Comment: In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'19), Montreal, Canada, May 20-24, 2019
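
    Since the platform is aimed in part at learners studying flight control, a toy example may help make the control problem concrete. Below is a minimal single-axis PD hover loop with differential-thrust mixing for a dual-rotor vehicle; the gains, axis conventions, mixer, and function names are illustrative assumptions, not the Phoenix flight stack.

    ```python
    # Illustrative only: a minimal single-axis PD hover loop for a
    # dual-rotor tail-sitter. Gains and the mixer are hypothetical,
    # not the Phoenix firmware.

    def pd_hover_step(att_err, rate, thrust_cmd, kp=4.0, kd=0.8):
        """Map a single-axis attitude error (rad) and body rate (rad/s)
        to left/right rotor commands around a common hover thrust."""
        torque_cmd = kp * att_err - kd * rate    # PD torque about one body axis
        left = thrust_cmd + 0.5 * torque_cmd     # differential-thrust mixing
        right = thrust_cmd - 0.5 * torque_cmd
        clamp = lambda u: max(0.0, min(1.0, u))  # keep commands in [0, 1]
        return clamp(left), clamp(right)

    # Example: 0.1 rad attitude error at hover thrust 0.5
    print(pd_hover_step(0.1, 0.02, 0.5))  # -> (0.692, 0.308)
    ```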

    Dynamic Arrival Rate Estimation for Campus Mobility on Demand Network Graphs

    Mobility On Demand (MOD) systems are revolutionizing transportation in urban settings by improving vehicle utilization and reducing parking congestion. A key factor in the success of an MOD system is the ability to measure and respond to real-time customer arrival data. Real-time traffic arrival rate data is traditionally difficult to obtain, because fixed sensors must be installed throughout the MOD network. This paper presents a framework for measuring pedestrian traffic arrival rates using sensors onboard the vehicles that make up the MOD fleet. A novel distributed fusion algorithm is presented which combines onboard LIDAR and camera sensor measurements to detect pedestrian trajectories with a 90% detection hit rate and 1.5 false positives per minute. A novel moving observer method is introduced to estimate pedestrian arrival rates from pedestrian trajectories collected by the mobile sensors. The moving observer method is evaluated in both simulation and hardware and is shown to achieve arrival rate estimates comparable to those that would be obtained with multiple stationary sensors.
    Comment: Appears in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). http://ieeexplore.ieee.org/abstract/document/7759357
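
    The moving observer idea lends itself to a small sketch: count the new pedestrian tracks that begin while a region is inside the moving vehicle's field of view, and normalize by the total time the region was observed. The data layout and function below are hypothetical simplifications, not the paper's estimator.

    ```python
    # A hedged sketch of a moving-observer style arrival-rate estimate.
    # Track and interval formats are assumptions, not the paper's code.

    def arrival_rate(track_start_times, observation_intervals):
        """track_start_times: times (s) at which new pedestrian tracks began.
        observation_intervals: (t_enter, t_exit) spans when the region was
        visible from the moving vehicle."""
        observed = sum(t1 - t0 for t0, t1 in observation_intervals)
        if observed <= 0.0:
            raise ValueError("region was never observed")
        arrivals = sum(
            any(t0 <= t <= t1 for t0, t1 in observation_intervals)
            for t in track_start_times
        )
        return arrivals / observed  # pedestrians per second

    # Example: 6 new tracks over two passes totalling 120 s -> 0.05 ped/s
    print(arrival_rate([5, 12, 30, 70, 80, 95], [(0, 60), (60, 120)]))
    ```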

    Accurate Tracking of Aggressive Quadrotor Trajectories using Incremental Nonlinear Dynamic Inversion and Differential Flatness

    Autonomous unmanned aerial vehicles (UAVs) that can execute aggressive (i.e., high-speed and high-acceleration) maneuvers have attracted significant attention in the past few years. This paper focuses on accurate tracking of aggressive quadcopter trajectories. We propose a novel control law for tracking of position and yaw angle and their derivatives of up to fourth order, specifically, velocity, acceleration, jerk, and snap along with yaw rate and yaw acceleration. Jerk and snap are tracked using feedforward inputs for angular rate and angular acceleration, based on the differential flatness of the quadcopter dynamics. Snap tracking requires direct control of body torque, which we achieve using closed-loop motor speed control based on measurements from optical encoders attached to the motors. The controller utilizes incremental nonlinear dynamic inversion (INDI) for robust tracking of linear and angular accelerations despite external disturbances, such as aerodynamic drag forces. Hence, prior modeling of aerodynamic effects is not required. We rigorously analyze the proposed control law through response analysis, and we demonstrate it in experiments. The controller enables a quadcopter UAV to track complex 3D trajectories, reaching speeds up to 12.9 m/s and accelerations up to 2.1g, while keeping the root-mean-square tracking error down to 6.6 cm, in a flight volume that is roughly 18 m by 7 m and 3 m tall. We also demonstrate the robustness of the controller by attaching a drag plate to the UAV in flight tests and by pulling on the UAV with a rope during hover.
    Comment: To be published in IEEE Transactions on Control Systems Technology. Revision: new set of experiments at increased speed (up to 12.9 m/s), updated controller design using quaternion representation, new video available at https://youtu.be/K15lNBAKDC
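
    The INDI step at the heart of the controller can be sketched compactly: increment the previous actuator command by the effectiveness-inverse of the gap between commanded and measured acceleration. Because the measured acceleration already reflects disturbances such as drag, no aerodynamic model is required, which matches the abstract's claim. The control-effectiveness matrix G and its values below are illustrative assumptions, not the paper's identified model.

    ```python
    # A minimal sketch of the INDI update (not the paper's full controller).
    import numpy as np

    def indi_step(u_prev, acc_cmd, acc_meas, G):
        """u_prev: last actuator command; acc_cmd, acc_meas: commanded and
        measured (angular) acceleration; G: control effectiveness d(acc)/d(u)."""
        return u_prev + np.linalg.solve(G, acc_cmd - acc_meas)

    G = np.diag([20.0, 20.0, 5.0])  # illustrative effectiveness, not identified
    u = np.zeros(3)
    u = indi_step(u, np.array([1.0, 0.0, 0.0]), np.array([0.2, 0.0, 0.0]), G)
    print(u)  # [0.04 0. 0.] -- a small increment toward the commanded acceleration
    ```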

    Leveraging Traffic and Surveillance Video Cameras for Urban Traffic

    The objective of this project was to investigate the use of existing video resources, such as traffic cameras, police cameras, red light cameras, and security cameras, for the long-term, real-time collection of traffic statistics. An additional objective was to gather similar statistics for pedestrians and bicyclists. Throughout the course of the project, we investigated several methods for tracking vehicles under challenging conditions. The initial plan called for tracking based on optical flow. However, it was found that current optical-flow estimation algorithms are not well suited to low-quality video; hence, developing optical flow methods for low-quality video became one aspect of this project. The method eventually adopted combines basic optical flow tracking with a learning detector for each tracked object: the object is tracked both by its apparent movement and by its appearance, should it temporarily disappear from or be obscured in the frame. We have produced prototype software that allows the user to specify the vehicle trajectories of interest by drawing their shapes superimposed on a video frame. The software then tracks each vehicle as it travels through the frame, matches the vehicle's movements to the most closely matching trajectory, and increments the vehicle count for that trajectory. In terms of pedestrian and bicycle counting, the system is capable of tracking these “objects” as well, though at present it is not capable of distinguishing between the three classes automatically. Continuing research by the principal investigator under a different grant will establish this capability as well.
    Illinois Department of Transportation, R27-131
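
    The trajectory-matching step can be sketched briefly: resample the tracked path and each user-drawn template to a fixed number of points, then assign the vehicle to the template with the smallest mean point-to-point distance. The index-based resampling and distance metric below are simplifying assumptions, not the project's actual matcher.

    ```python
    # A hedged sketch of matching a tracked vehicle path to the closest
    # user-drawn template trajectory. Helper names are hypothetical.
    import math

    def resample(path, n=20):
        """Pick n points evenly by index from a list of (x, y) points."""
        idx = [round(i * (len(path) - 1) / (n - 1)) for i in range(n)]
        return [path[i] for i in idx]

    def match_trajectory(track, templates, n=20):
        t = resample(track, n)
        def cost(tpl):
            r = resample(tpl, n)
            return sum(math.dist(a, b) for a, b in zip(t, r)) / n
        return min(templates, key=cost)

    # Example: a roughly straight track matches the straight template.
    left_turn = [(0, 0), (5, 0), (8, 3), (9, 8)]
    straight = [(0, 0), (5, 0), (10, 0)]
    print(match_trajectory([(0, 1), (6, 1), (11, 1)], [left_turn, straight]) is straight)
    ```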

    FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation

    FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in motio by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual reality environment, exteroceptive sensors are rendered synthetically in real time while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main testbed for selecting the nine teams that will advance in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest.
    Comment: Initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. Revision includes description of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API
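
    The vehicle-in-the-loop pattern described above reduces to a simple loop: read the real vehicle's pose from motion capture, render a synthetic exteroceptive frame at that pose, and let the perception stack under test command the vehicle. All helper names below are hypothetical stand-ins, not the FlightGoggles API.

    ```python
    # A hedged sketch of the vehicle-in-the-loop pattern: the real dynamics
    # come from motion capture, while sensing is rendered synthetically.
    import time

    def vehicle_in_the_loop(get_pose, render, perceive, command,
                            steps=3, rate_hz=60.0):
        for _ in range(steps):
            pose = get_pose()         # real vehicle state from motion capture
            image = render(pose)      # synthetic exteroceptive frame at that pose
            command(perceive(image))  # perception/planning under test drives the vehicle
            time.sleep(1.0 / rate_hz)

    # Trivial stubs so the skeleton runs stand-alone:
    vehicle_in_the_loop(lambda: (0.0, 0.0, 1.0),
                        lambda pose: f"frame@{pose}",
                        lambda image: (0.0, 0.0, 1.5),
                        print)
    ```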

    Vision and Learning for Deliberative Monocular Cluttered Flight

    Cameras provide a rich source of information while being passive, cheap, and lightweight for small and medium Unmanned Aerial Vehicles (UAVs). In this work we present the first implementation of receding horizon control, which is widely used in ground vehicles, with monocular vision as the only sensing mode for autonomous UAV flight in dense clutter. We make it feasible on UAVs via a number of contributions: a novel coupling of perception and control via relevant and diverse multiple interpretations of the scene around the robot, leveraging recent advances in machine learning to showcase anytime budgeted cost-sensitive feature selection, and fast non-linear regression for monocular depth prediction. We empirically demonstrate the efficacy of our novel pipeline via real-world experiments of more than 2 km through dense trees with a quadrotor built from off-the-shelf parts. Moreover, our pipeline is designed to also combine information from other modalities, like stereo and lidar, if available.
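
    The receding-horizon pipeline can be sketched as scoring a small library of candidate trajectories against a predicted depth map and flying the cheapest one for a single horizon before re-planning. The depth input and collision cost below are simplified stand-ins for the paper's learned monocular depth regression.

    ```python
    # A hedged sketch of receding-horizon trajectory selection from a library.
    import numpy as np

    def collision_cost(traj_px, depth_map, d_safe=3.0):
        """traj_px: (row, col) pixels the trajectory passes through.
        Cost grows as predicted depth falls below the safety margin d_safe (m)."""
        depths = np.array([depth_map[r, c] for r, c in traj_px])
        return float(np.maximum(0.0, d_safe - depths).sum())

    def pick_trajectory(library, depth_map):
        return min(library, key=lambda t: collision_cost(t, depth_map))

    # Example: the candidate heading into open space wins.
    depth = np.full((10, 10), 10.0)
    depth[:, :3] = 1.0                     # near obstacle on the image's left
    left = [(5, c) for c in range(0, 5)]
    right = [(5, c) for c in range(5, 10)]
    print(pick_trajectory([left, right], depth) == right)  # True
    ```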