FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation
FlightGoggles is a photorealistic sensor simulator for perception-driven
robotic vehicles. The key contributions of FlightGoggles are twofold. First,
FlightGoggles provides photorealistic exteroceptive sensor simulation using
graphics assets generated with photogrammetry. Second, it provides the ability
to combine (i) synthetic exteroceptive measurements generated in silico in real
time and (ii) vehicle dynamics and proprioceptive measurements generated in
motio by vehicle(s) in a motion-capture facility. FlightGoggles is capable of
simulating a virtual-reality environment around autonomous vehicle(s). While a
vehicle is in flight in the FlightGoggles virtual reality environment,
exteroceptive sensors are rendered synthetically in real time while all complex
extrinsic dynamics are generated organically through the natural interactions
of the vehicle. The FlightGoggles framework allows for researchers to
accelerate development by circumventing the need to estimate complex and
hard-to-model interactions such as aerodynamics, motor mechanics, battery
electrochemistry, and behavior of other agents. The ability to perform
vehicle-in-the-loop experiments with photorealistic exteroceptive sensor
simulation facilitates novel research directions involving, e.g., fast and
agile autonomous flight in obstacle-rich environments, safe human interaction,
and flexible sensor selection. FlightGoggles has been utilized as the main test
for selecting nine teams that will advance in the AlphaPilot autonomous drone
racing challenge. We survey approaches and results from the top AlphaPilot
teams, which may be of independent interest.

Comment: Initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. Revision includes description of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API.
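The vehicle-in-the-loop idea above — real flight dynamics and proprioception coming from a vehicle in a motion-capture room, with exteroception rendered synthetically at the measured pose — can be sketched roughly as follows. All class and function names here are hypothetical placeholders, not the actual FlightGoggles Python API.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # position (m) and yaw (rad) of the real vehicle, as reported
    # by an external motion-capture system
    x: float
    y: float
    z: float
    yaw: float

class SyntheticCamera:
    """Hypothetical stand-in for a photorealistic renderer.

    In a FlightGoggles-style setup, the renderer would return an RGB
    frame of the virtual environment as seen from the given pose.
    """
    def render(self, pose: Pose) -> dict:
        # A real renderer would return pixels; this stub returns metadata.
        return {"camera_pose": pose, "width": 640, "height": 480}

def vehicle_in_the_loop_step(mocap_pose: Pose, camera: SyntheticCamera) -> dict:
    # 1. Dynamics and proprioception come "for free" from the real vehicle.
    # 2. Exteroception is rendered synthetically at the measured pose.
    return camera.render(mocap_pose)

frame = vehicle_in_the_loop_step(Pose(1.0, 2.0, 1.5, 0.0), SyntheticCamera())
```

The point of the pattern is that hard-to-model effects (aerodynamics, motor mechanics, battery electrochemistry) never need to be simulated: they happen physically, and only the sensing is virtual.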
A Proposal for a Multi-Drive Heterogeneous Modular Pipe-Inspection Micro-Robot
This paper presents the architecture used to develop a micro-robot for the inspection of narrow pipes. Both the electromechanical design and the control scheme are described. In pipe environments it is very useful to have a method to retrieve information on the state of the inside of the pipes in order to detect damage, breaks, and holes. Because of the different types of pipes that exist, a modular approach with different types of modules has been chosen so the robot can adapt to the shape of the pipe and choose the most appropriate gait. The micro-robot has been designed for narrow pipes, a field in which there are few prototypes. The robot incorporates a camera module for visual inspection and several drive modules for locomotion and turning (helicoidal, inchworm, two-degrees-of-freedom rotation). The control scheme, based on semi-distributed behavior control, is also described, and a simulation environment for prototype testing is presented.
Nonlinear Model Predictive Control for Multi-Micro Aerial Vehicle Robust Collision Avoidance
Multiple multirotor Micro Aerial Vehicles sharing the same airspace require a
reliable and robust collision avoidance technique. In this paper we address the
problem of multi-MAV reactive collision avoidance. A model-based controller is employed to achieve reference trajectory tracking and collision avoidance simultaneously. Moreover, we account for the uncertainty of the state estimator and for the position and velocity uncertainties of the other agents to achieve a higher degree of robustness. The proposed approach is decentralized, does not require a collision-free reference trajectory, and accounts for the full MAV dynamics. We validated our approach both in simulation and experimentally.

Comment: Video available at: https://www.youtube.com/watch?v=Ot76i9p2ZZo&t=40
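As a rough illustration of how such a controller can trade off tracking against robust avoidance (a minimal sketch under assumed symbols and weights, not the paper's actual NMPC formulation), consider a stage cost whose collision penalty uses a safety radius inflated by the position uncertainties of both agents:

```python
import numpy as np

def mpc_stage_cost(p, p_ref, p_other, sigma_self, sigma_other,
                   r_safe=0.5, w_track=1.0, w_coll=100.0):
    """Stage cost: trajectory tracking + soft collision penalty.

    p, p_ref, p_other : 3-D positions (own, reference, other agent)
    sigma_self/other  : position-uncertainty std-devs (m), used to
                        inflate the safety radius (robustness heuristic)
    """
    track = w_track * float(np.sum((p - p_ref) ** 2))
    r_eff = r_safe + 3.0 * (sigma_self + sigma_other)  # 3-sigma inflation
    d = float(np.linalg.norm(p - p_other))
    # hinge-style penalty, active only inside the inflated radius
    coll = w_coll * max(0.0, r_eff - d) ** 2
    return track + coll

# far apart: only the tracking term contributes
c_far = mpc_stage_cost(np.zeros(3), np.zeros(3), np.array([10., 0., 0.]), 0.1, 0.1)
# close: the collision penalty dominates
c_near = mpc_stage_cost(np.zeros(3), np.zeros(3), np.array([0.3, 0., 0.]), 0.1, 0.1)
```

Summing this cost over the prediction horizon and minimizing it subject to the MAV dynamics is the general shape of a nonlinear MPC scheme for reactive avoidance; the uncertainty inflation is what buys the extra robustness margin.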
Rice-obot 1: An intelligent autonomous mobile robot
The Rice-obot I is the first in a series of Intelligent Autonomous Mobile Robots (IAMRs) being developed at Rice University's Cooperative Intelligent Mobile Robots (CIMR) lab. The Rice-obot I is mainly designed to be a testbed for various robotic and AI techniques, and a platform for developing intelligent control systems for exploratory robots. Researchers present the need for a generalized environment capable of combining all of the control, sensory and knowledge systems of an IAMR. They introduce Lisp-Nodes as such a system, and develop the basic concepts of nodes, messages and classes. Furthermore, they show how the control system of the Rice-obot I is implemented as sub-systems in Lisp-Nodes.
Voliro: An Omnidirectional Hexacopter With Tiltable Rotors
Extending the maneuverability of unmanned aerial vehicles promises to yield a
considerable increase in the areas in which these systems can be used. Some
such applications are the performance of more complicated inspection tasks and
the generation of complex uninterrupted movements of an attached camera. In
this paper we address this challenge by presenting Voliro, a novel aerial
platform that combines the advantages of existing multi-rotor systems with the
agility of omnidirectionally controllable platforms. We propose the use of a
hexacopter with tiltable rotors allowing the system to decouple the control of
position and orientation. The contributions of this work involve the mechanical
design as well as a controller with the corresponding allocation scheme. This
work also discusses the design challenges involved when turning the concept of
a hexacopter with tiltable rotors into an actual prototype. The agility of the
system is demonstrated and evaluated in real-world experiments.

Comment: Submitted to Robotics and Automation Magazine
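One common way to make the allocation for tiltable rotors tractable (a sketch under simplifying assumptions — rotor drag torque ignored, planar hexagon geometry assumed — not necessarily the paper's exact scheme) is to decompose each rotor thrust into a vertical and a tangential component, so the 6-D body wrench is linear in the 12 unknowns and position and orientation can be commanded independently:

```python
import numpy as np

def allocation_matrix(n=6, L=0.3):
    """6 x 2n matrix mapping decomposed rotor forces to the body wrench.

    Rotor i sits at angle theta_i on the frame and tilts about its arm
    axis, so its thrust splits into a vertical component v_i and a
    tangential component h_i. The wrench [F; M] is then linear in
    x = [v_1, h_1, ..., v_n, h_n]. Rotor drag torque is ignored here.
    """
    A = np.zeros((6, 2 * n))
    for i in range(n):
        th = 2 * np.pi * i / n
        z = np.array([0.0, 0.0, 1.0])                     # vertical direction
        t = np.array([-np.sin(th), np.cos(th), 0.0])      # tangential direction
        r = L * np.array([np.cos(th), np.sin(th), 0.0])   # rotor position
        A[:3, 2 * i] = z                  # force from v_i
        A[3:, 2 * i] = np.cross(r, z)     # torque from v_i
        A[:3, 2 * i + 1] = t              # force from h_i
        A[3:, 2 * i + 1] = np.cross(r, t) # torque from h_i
    return A

def allocate(wrench):
    """Least-squares rotor thrusts f_i and tilt angles alpha_i for a
    desired 6-D wrench [Fx, Fy, Fz, Mx, My, Mz]."""
    A = allocation_matrix()
    x = np.linalg.pinv(A) @ np.asarray(wrench, dtype=float)
    v, h = x[0::2], x[1::2]
    f = np.hypot(v, h)          # thrust magnitudes (non-negative)
    alpha = np.arctan2(h, v)    # tilt angles about each arm
    return f, alpha

# a pure lateral force: achievable only because the rotors can tilt
f, alpha = allocate([1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
```

Because the 6 x 12 matrix has full row rank, any desired wrench is reachable and the pseudo-inverse picks the minimum-norm decomposition; a conventional flat hexacopter, by contrast, can only produce thrust along its body z-axis.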