
    FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation

    FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in motion by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual-reality environment, exteroceptive sensors are rendered synthetically in real time, while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main test for selecting nine teams that will advance in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest.
    Comment: Initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. Revision includes description of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API.

    Kalman Filter for Noise Reduction in Aerial Vehicles using Echoic Flow

    Echolocation is a natural phenomenon observed in bats that allows them to navigate complex, dim environments with enough precision to capture insects in midair. Echolocation is driven by the underlying process of echoic flow, which can be broken down into the ratio of the distance to a target to the velocity toward it. This ratio produces a parameter τ representing the time to collision, and controlling it allows for highly efficient and consistent movement. When a quadcopter uses echoic flow to descend to a target, measurements from the ultrasonic range sensor exhibit noise. Furthermore, the use of first-order derivatives to calculate the echoic flow parameters results in an even greater magnitude of noise. Implementing an optimal Kalman filter to smooth the measurements allows for more accurate and precise tracking, ultimately recreating the high efficiency and consistency of the echolocation tracking techniques found in nature. Kalman filter parameters were tested in realistic simulations of the quadcopter's descent, and these tests determined an optimal Kalman filter for the system. The Kalman filter's effect on an accurate echoic flow descent was then tested against that of other filtering methods. Of the filtering methods tested, Kalman filtering best allowed the quadcopter to control its echoic flow descent in a precise and consistent manner. This presentation covers the test methodology and the results of the various tests.
    No embargo. Academic Major: Electrical and Computer Engineering
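    The τ-control idea above can be sketched with a small constant-velocity Kalman filter: the state is [distance, velocity], only the noisy ultrasonic range is measured, and τ = d/|v| is read off the smoothed estimate instead of a noisy first-order derivative. This is an illustrative reconstruction, not the presentation's actual filter; the function name `kalman_tau_descent` and the noise parameters `q` and `r` are assumptions.

    ```python
    import numpy as np

    def kalman_tau_descent(ranges, dt, q=0.05, r=0.04):
        """Estimate time-to-collision tau = d / |v| from noisy range readings
        using a constant-velocity Kalman filter with state [distance, velocity]."""
        F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
        H = np.array([[1.0, 0.0]])                 # only distance is measured
        Q = q * np.array([[dt**4 / 4, dt**3 / 2],  # process-noise covariance
                          [dt**3 / 2, dt**2]])
        R = np.array([[r]])                        # range-sensor noise variance
        x = np.array([[ranges[0]], [0.0]])         # initial state estimate
        P = np.eye(2)                              # initial state covariance
        taus = []
        for z in ranges:
            # predict
            x = F @ x
            P = F @ P @ F.T + Q
            # update with the new range measurement
            y = np.array([[z]]) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
            d, v = x[0, 0], x[1, 0]
            taus.append(d / abs(v) if abs(v) > 1e-6 else float("inf"))
        return taus
    ```

    Because velocity is part of the filter state, no explicit differencing of noisy range samples is needed, which is exactly the noise-amplification problem the abstract attributes to first-order derivatives.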

    A follow-me algorithm for AR.Drone using MobileNet-SSD and PID control

    Bachelor's thesis in Computer Engineering (Treballs Finals de Grau d'Enginyeria Informàtica), Facultat de Matemàtiques, Universitat de Barcelona, Year: 2018, Advisor: Lluís Garrido Ostermann. [en] In recent years the quadcopter industry has experienced a boom. The appearance of inexpensive drones has led to the growth of the recreational use of these vehicles, which opens the door to the creation of new applications and technologies. This thesis presents a vision-based autonomous control system for an AR.Drone 2.0. A tracking algorithm is developed using onboard vision systems without relying on additional external inputs. In particular, the tracking algorithm combines a trained MobileNet-SSD object detector with a KCF tracker. The noise induced by the tracker is reduced with a Kalman filter. Furthermore, PID controllers are implemented for the motion control of the quadcopter; they process the output of the tracking algorithm to move the drone to the desired position. The final implementation was tested indoors, and the system yields acceptable results.
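    The last step of the pipeline described above, turning the tracker's bounding-box position into motion commands via PID, can be sketched as follows. The gains, the normalization, and the names `PID` and `follow_command` are illustrative assumptions, not the thesis's actual values or code.

    ```python
    class PID:
        """Textbook PID controller; gains here are illustrative only."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def step(self, error):
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    def follow_command(bbox_center_x, frame_width, pid):
        """Map the horizontal offset of the tracked bounding box to a yaw
        command (sign convention illustrative): zero when the target is
        centered, larger the further it drifts from the image center."""
        error = (frame_width / 2 - bbox_center_x) / (frame_width / 2)  # in [-1, 1]
        return pid.step(error)
    ```

    In the full system one such controller would run per controlled axis (yaw, altitude, forward distance), each fed from the Kalman-smoothed tracker output.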

    Autonomous moving target-tracking for a UAV quadcopter based on fuzzy-PI.

    Moving target-tracking is an attractive application for quadcopters and a very challenging field of research due to the complex dynamics of a quadcopter and the time-varying speed of the moving target. For this reason, various control algorithms have been developed to track a moving target using a camera. In this paper, a Fuzzy-PI controller is developed that adjusts the parameters of the PI controller using the position and the change of position as inputs. The proposed controller is compared to a gain-scheduled PID controller rather than the typical PID controller. To verify the performance of the developed system and determine which controller performs better, several experiments in which a quadcopter tracks a moving target are conducted under varying target speeds, indoors and outdoors, and during both day and night. The obtained results indicate that the proposed controller works well for tracking a moving target under different scenarios, especially at night.
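    The core idea of a Fuzzy-PI controller, fuzzy inference that retunes the PI gains online from the tracking error, can be sketched with two triangular membership functions. The rule base, membership shapes, and gain values below are illustrative assumptions; the paper's actual fuzzy system is not specified in the abstract.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function: 0 outside (a, c), peak 1 at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_pi_gains(error):
        """Tiny fuzzy-inference sketch: blend a 'small error' gain set with a
        'large error' gain set by membership-weighted average (defuzzification)."""
        e = min(abs(error), 1.0)            # normalized error magnitude
        small = tri(e, -0.5, 0.0, 0.6)      # membership: error is small
        large = tri(e, 0.4, 1.0, 1.6)       # membership: error is large
        w = (small + large) or 1.0          # guard against divide-by-zero
        kp = (small * 0.5 + large * 1.2) / w    # more aggressive kp when error is large
        ki = (small * 0.10 + large * 0.02) / w  # less integral action when error is large
        return kp, ki
    ```

    At each control step the scheduled `kp` and `ki` would be fed into an ordinary PI update; a fuller version would also use the change of error as a second fuzzy input, as the abstract describes.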
