
    Search-based Motion Planning for Aggressive Flight in SE(3)

    Quadrotors with large thrust-to-weight ratios are able to track aggressive trajectories with sharp turns and high accelerations. In this work, we develop a search-based trajectory planning approach that exploits the quadrotor's maneuverability to generate sequences of motion primitives in cluttered environments. We model the quadrotor body as an ellipsoid and compute its flight attitude along trajectories in order to check for collisions against obstacles. The ellipsoid model allows the quadrotor to pass through gaps that are smaller than its diameter with non-zero pitch or roll angles. Without any prior information about the location of gaps and the associated attitude constraints, our algorithm is able to find a safe and optimal trajectory that guides the robot to its goal as fast as possible. To accelerate planning, we first perform a lower-dimensional search and use it as a heuristic to guide the generation of a final dynamically feasible trajectory. We analyze critical discretization parameters of motion primitive planning and demonstrate the feasibility of the generated trajectories in various simulations and real-world experiments.
    Comment: 8 pages, submitted to RAL and ICRA 201
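    The abstract's key geometric idea is that the quadrotor's attitude is determined by its commanded acceleration, so an ellipsoid-shaped body can tilt through gaps narrower than its diameter. The following is a minimal sketch of such an attitude-aware collision check, assuming an axisymmetric ellipsoid of radius r and half-height h and a point-cloud obstacle representation; the function names and parameters are illustrative, not the authors' implementation.

        import numpy as np

        def attitude_from_accel(acc, g=9.81):
            # The body z-axis aligns with the total thrust direction (acc + g * e3).
            thrust = np.array([acc[0], acc[1], acc[2] + g])
            z_b = thrust / np.linalg.norm(thrust)
            # Pick body x/y axes for a zero-yaw frame; any yaw works for an
            # axisymmetric ellipsoid (degenerate only if thrust is horizontal).
            y_b = np.cross(z_b, np.array([1.0, 0.0, 0.0]))
            y_b /= np.linalg.norm(y_b)
            x_b = np.cross(y_b, z_b)
            return np.column_stack((x_b, y_b, z_b))   # world <- body rotation

        def ellipsoid_collides(pos, acc, obstacle_points, r=0.3, h=0.1):
            # True if any obstacle point lies inside the tilted ellipsoid.
            R = attitude_from_accel(acc)
            E = np.diag([1.0 / r**2, 1.0 / r**2, 1.0 / h**2])
            d = (np.asarray(obstacle_points) - pos) @ R   # points in the body frame
            return bool(np.any(np.einsum('ij,jk,ik->i', d, E, d) <= 1.0))

    A motion-primitive planner would run a check of this kind along each candidate primitive, pruning those whose induced attitude makes the ellipsoid intersect an obstacle.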

    Sampling-based Motion Planning for Active Multirotor System Identification

    This paper reports on an algorithm for planning trajectories that allow a multirotor micro aerial vehicle (MAV) to quickly identify a set of unknown parameters. In many problems, such as self-calibration or model parameter identification, some states are only observable under specific motions. These motions are often hard to find, especially for inexperienced users. Therefore, we consider system model identification in an active setting, where the vehicle autonomously decides what actions to take in order to quickly identify the model. Our algorithm approximates the belief dynamics of the system around a candidate trajectory using an extended Kalman filter (EKF). It uses sampling-based motion planning to explore the space of possible beliefs and find a maximally informative trajectory within a user-defined budget. We validate our method in simulation and on a real system, showing the feasibility and repeatability of the proposed approach. Our planner creates trajectories which reduce model parameter convergence time and uncertainty by a factor of four.
    Comment: Published at ICRA 2017. Video available at https://www.youtube.com/watch?v=xtqrWbgep5
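    The core of such an approach is scoring candidate trajectories by how much they shrink the uncertainty of the parameters of interest. Below is a minimal sketch of that scoring step, assuming generic process and measurement models f, h with Jacobians f_jac, h_jac and noise covariances Q, Rm; all of these are placeholders for a concrete multirotor model, not the paper's code.

        import numpy as np

        def trajectory_information(x0, P0, controls, f, f_jac, h_jac, Q, Rm, param_idx):
            # Propagate an EKF covariance along a candidate control sequence and
            # return how much the parameter-block uncertainty shrinks. No real
            # measurements exist while planning, so only the covariance is updated.
            x, P = np.copy(x0), np.copy(P0)
            I = np.eye(P.shape[0])
            for u in controls:
                F = f_jac(x, u)                 # prediction with the process model
                x = f(x, u)
                P = F @ P @ F.T + Q
                H = h_jac(x)                    # expected measurement update
                S = H @ P @ H.T + Rm
                K = P @ H.T @ np.linalg.inv(S)
                P = (I - K @ H) @ P
            block = np.ix_(param_idx, param_idx)
            return np.trace(P0[block]) - np.trace(P[block])

    A sampling-based planner would evaluate this score for many dynamically feasible candidates and execute the most informative one found within the planning budget.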

    Real-Time Planning with Multi-Fidelity Models for Agile Flights in Unknown Environments

    Autonomous navigation through unknown environments is a challenging task that entails real-time localization, perception, planning, and control. UAVs with this capability have begun to emerge in the literature with advances in lightweight sensing and computing. Although the planning methodologies vary from platform to platform, many algorithms adopt a hierarchical planning architecture in which a slow, low-fidelity global planner guides a fast, high-fidelity local planner. However, in unknown environments, this approach can lead to erratic or unstable behavior due to the interaction between the global planner, whose solution changes constantly, and the local planner, a consequence of not capturing higher-order dynamics in the global plan. This work proposes a planning framework in which multi-fidelity models are used to reduce the discrepancy between the local and global planners. Our approach uses high-, medium-, and low-fidelity models to compose a path that captures higher-order dynamics while remaining computationally tractable. In addition, we address the interaction between a fast planner and a slower mapper by considering sensor data not yet fused into the map during the collision check. This novel mapping and planning framework for agile flights is validated in simulation and hardware experiments, showing replanning times of 5-40 ms in cluttered environments.
    Comment: ICRA 201
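    One concrete element of the abstract is the collision check that also consults sensor data not yet fused into the map. A minimal sketch of that idea follows; the occupancy_map and unfused_cloud interfaces are assumptions made for illustration, not the paper's data structures.

        import numpy as np

        def is_free(p, occupancy_map, unfused_cloud, robot_radius=0.3):
            # Reject the point if the fused map marks it occupied within the
            # robot's inflation radius.
            if occupancy_map.is_occupied(p, inflate=robot_radius):
                return False
            # Also reject it if any raw, not-yet-fused sensor return is closer
            # than the robot radius, so the planner reacts before the map does.
            cloud = np.asarray(unfused_cloud)
            if cloud.size and np.min(np.linalg.norm(cloud - p, axis=1)) < robot_radius:
                return False
            return True

    The multi-fidelity composition itself would stitch together a short full-dynamics segment near the vehicle, a simplified-dynamics segment beyond it, and a purely geometric path to the goal, each planned with the model just accurate enough for its range.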

    Fast, Autonomous Flight in GPS-Denied and Cluttered Environments

    One of the most challenging tasks for a flying robot is to autonomously navigate between target locations quickly and reliably while avoiding obstacles in its path, with little to no a priori knowledge of the operating environment. This challenge is addressed in the present paper. We describe the system design and software architecture of our proposed solution, and showcase how all the distinct components can be integrated to enable smooth robot operation. We provide critical insight into hardware and software component selection and development, and present results from extensive experimental testing in real-world warehouse environments. Experimental testing reveals that our proposed solution can deliver fast and robust aerial robot autonomous navigation in cluttered, GPS-denied environments.
    Comment: Pre-peer-reviewed version of the article accepted in the Journal of Field Robotics

    Perception-aware time optimal path parameterization for quadrotors

    The increasing popularity of quadrotors has given rise to a class of predominantly vision-driven vehicles. This paper addresses the problem of perception-aware time-optimal path parameterization for quadrotors. Although many different choices of perceptual modalities are available, the low weight and power budgets of quadrotor systems make a camera ideal for on-board navigation and estimation algorithms. However, this does come with a set of challenges. The limited field of view of the camera can restrict the visibility of salient regions in the environment, which makes it necessary to consider perception and planning jointly. The main contribution of this paper is an efficient time-optimal path parameterization algorithm for quadrotors under limited field-of-view constraints. We show in a simulation study that a state-of-the-art controller can track the planned trajectories, and we validate the proposed algorithm on a quadrotor platform in experiments.
    Comment: Accepted to appear at ICRA 202
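    The field-of-view constraint that drives such a formulation reduces to checking whether a feature of interest stays inside the camera frustum along the parameterized path. A minimal sketch of that visibility test follows, assuming a symmetric conical field of view with the optical axis along the camera z-axis; the names and the half-FOV value are illustrative, not the paper's formulation.

        import numpy as np

        def landmark_visible(p_cam, R_wc, landmark, half_fov=np.deg2rad(45.0)):
            # R_wc maps camera-frame vectors to the world frame; the optical axis
            # is the camera z-axis.
            bearing_c = R_wc.T @ (np.asarray(landmark) - np.asarray(p_cam))
            if bearing_c[2] <= 0.0:                 # landmark behind the camera
                return False
            cos_angle = bearing_c[2] / np.linalg.norm(bearing_c)
            return cos_angle >= np.cos(half_fov)    # inside the cone of view

    A perception-aware parameterization then limits speed and acceleration along the path wherever faster motion would tilt the vehicle, and hence the camera, enough to violate this check.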

    FASTER: Fast and Safe Trajectory Planner for Flights in Unknown Environments

    High-speed trajectory planning through unknown environments requires algorithmic techniques that enable fast reaction times while maintaining safety as new information about the operating environment is obtained. The requirement of computational tractability typically leads to optimization problems that either do not include the obstacle constraints (collision checks are done on the solutions) or use a convex decomposition of the free space and then impose an ad-hoc time allocation scheme for each interval of the trajectory. Moreover, safety guarantees are usually obtained by having a local planner that plans a trajectory with a final "stop" condition in the free-known space. However, these two decisions typically lead to slow and conservative trajectories. We propose FASTER (Fast and Safe Trajectory Planner) to overcome these issues. FASTER obtains high-speed trajectories by enabling the local planner to optimize in both the free-known and unknown spaces. Safety guarantees are ensured by always having a feasible, safe back-up trajectory in the free-known space at the start of each replanning step. Furthermore, we present a Mixed Integer Quadratic Program formulation in which the solver can choose the trajectory interval allocation, and where a time allocation heuristic is computed efficiently using the result of the previous replanning iteration. The proposed algorithm is tested extensively both in simulation and on real hardware, showing agile flights in unknown, cluttered environments with velocities up to 3.6 m/s.
    Comment: IROS 201
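    The safety logic the abstract describes can be summarized as: plan aggressively through both known-free and unknown space, but only commit to the portion for which a back-up trajectory stopping inside known-free space also exists. A minimal sketch of one replanning step follows; plan_whole, plan_safe, and first_interval are hypothetical helpers used for illustration, not the FASTER code.

        def replan_step(state, goal, known_free, unknown, committed):
            # Fast plan that is allowed to traverse unknown (possibly occupied) space.
            whole = plan_whole(state, goal, regions=[known_free, unknown])
            if whole is None:
                return committed                 # keep flying the previous safe plan
            handoff = first_interval(whole)      # portion flown before the next replan
            # Back-up trajectory that comes to rest entirely inside known-free space.
            safe = plan_safe(handoff.end_state, known_free)
            if safe is None:
                return committed                 # never commit without a safe escape
            return handoff + safe                # aggressive start, guaranteed safe tail

    Because the back-up is recomputed at every iteration, the vehicle only ever executes states from which a stop inside known-free space remains feasible.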

    FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation

    FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in motio by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual-reality environment, exteroceptive sensors are rendered synthetically in real time while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main test for selecting nine teams that will advance in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest.
    Comment: Initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. Revision includes description of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API
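    The vehicle-in-the-loop idea in the abstract amounts to a tight loop in which the real vehicle's motion-capture pose drives a synthetic render that is fed back to the perception stack. The sketch below illustrates one cycle of that loop under assumed mocap, renderer, and perception interfaces; it is not the FlightGoggles API.

        def vehicle_in_the_loop_step(mocap, renderer, perception):
            # In motio: the real vehicle supplies its dynamics and proprioception.
            pose = mocap.latest_pose()
            # In silico: a photorealistic view is rendered at that pose in real time.
            image = renderer.render(pose)
            # The synthetic image drives the onboard perception stack, closing the
            # loop without modeling aerodynamics, motors, or battery behavior.
            perception.process(image, pose.timestamp)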