17 research outputs found

    Estimation, planning, and mapping for autonomous flight using an RGB-D camera in GPS-denied environments

    RGB-D cameras provide both color images and per-pixel depth estimates. The richness of this data and the recent development of low-cost sensors have combined to present an attractive opportunity for mobile robotics research. In this paper, we describe a system for visual odometry and mapping using an RGB-D camera, and its application to autonomous flight. By leveraging results from recent state-of-the-art algorithms and hardware, our system enables 3D flight in cluttered environments using only onboard sensor data. All computation and sensing required for local position control are performed onboard the vehicle, reducing the dependence on an unreliable wireless link to a ground station. However, even with accurate 3D sensing and position estimation, some parts of the environment have more perceptual structure than others, leading to state estimates that vary in accuracy across the environment. If the vehicle plans a path without regard to how well it can localize itself along that path, it runs the risk of becoming lost or worse. We show how the belief roadmap algorithm (Prentice and Roy, 2009), a belief space extension of the probabilistic roadmap algorithm, can be used to plan vehicle trajectories that incorporate the sensing model of the RGB-D camera. We evaluate the effectiveness of our system for controlling a quadrotor micro air vehicle, demonstrate its use for constructing detailed 3D maps of an indoor environment, and discuss its limitations.
    Funding: United States. Office of Naval Research (Grant MURI N00014-07-1-0749); United States. Office of Naval Research (Science of Autonomy Program N00014-09-1-0641); United States. Army Research Office (MAST CTA); United States. Office of Naval Research. Multidisciplinary University Research Initiative (Grant N00014-09-1-1052); National Science Foundation (U.S.) (Contract IIS-0812671); United States. Army Research Office (Robotics Consortium Agreement W911NF-10-2-0016); National Science Foundation (U.S.). Division of Information, Robotics, and Intelligent Systems (Grant 0546467)
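The belief roadmap idea above — propagating localization uncertainty along candidate roadmap edges so the planner can prefer routes where the sensor keeps the vehicle well localized — can be sketched with a scalar covariance model. This is a toy illustration with hypothetical noise values, not the authors' implementation:

```python
# Toy sketch: each roadmap edge carries a process noise Q (uncertainty
# added by motion) and a measurement noise R that depends on how much
# perceptual structure the RGB-D camera sees along that edge.
def propagate(cov, edges):
    """Propagate a scalar position covariance along a path of edges.

    Each edge is (process_noise_Q, measurement_noise_R); R is large in
    texture-poor regions, so the measurement corrects the estimate less.
    """
    for q, r in edges:
        cov = cov + q              # predict: motion adds uncertainty
        cov = cov * r / (cov + r)  # update: Kalman-style correction
    return cov

def best_path(paths, initial_cov=0.1):
    # Choose the candidate path with the smallest final covariance.
    return min(paths, key=lambda p: propagate(initial_cov, p))

# Two candidates: a short path through a featureless corridor (large R)
# and a longer path past texture-rich walls (small R).
short_path = [(0.05, 10.0)] * 3
long_path = [(0.05, 0.2)] * 5
```

Under these (made-up) noise values, the longer but better-localized path ends with a smaller covariance, so the planner prefers it — exactly the trade-off the abstract describes.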

    Learning Unmanned Aerial Vehicle Control for Autonomous Target Following

    While deep reinforcement learning (RL) methods have achieved unprecedented successes in a range of challenging problems, their applicability has been mainly limited to simulation or game domains due to the high sample complexity of the trial-and-error learning process. However, real-world robotic applications often need a data-efficient learning process with safety-critical constraints. In this paper, we consider the challenging problem of learning unmanned aerial vehicle (UAV) control for tracking a moving target. To acquire a strategy that combines perception and control, we represent the policy by a convolutional neural network. We develop a hierarchical approach that combines a model-free policy gradient method with a conventional feedback proportional-integral-derivative (PID) controller to enable stable learning without catastrophic failure. The neural network is trained by a combination of supervised learning from raw images and reinforcement learning from games of self-play. We show that the proposed approach can learn a target-following policy in a simulator efficiently, and that the learned behavior can be successfully transferred to the DJI quadrotor platform for real-world UAV control.
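The hierarchical combination described above — a learned policy proposing commands that a conventional PID loop tracks — can be illustrated with a minimal discrete PID controller. Gains, time step, and the plant are hypothetical; this is not the paper's actual controller:

```python
class PID:
    """Minimal discrete PID controller (illustrative, not the paper's code)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def hybrid_command(policy_setpoint, state, pid):
    # The learned policy proposes a setpoint (e.g., a desired velocity);
    # the PID loop tracks it, so policy exploration cannot destabilize
    # the low-level control of the vehicle.
    return pid.step(policy_setpoint - state)

pid = PID(kp=1.0, ki=0.1, kd=0.05, dt=0.02)
```

Driving a simple integrator plant with this loop converges to the policy's setpoint, which is the stabilizing role the PID layer plays in the hierarchy.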

    Development and performance testing of a miniaturized multi-sensor system combining MOX and PID for potential UAV application in TIC, VOC and CWA dispersion scenarios

    The development of a tool to reduce the exposure of personnel in intentional or accidental toxic chemical dispersion scenarios opens the field to new operational perspectives in the domains of operator safety and critical infrastructure monitoring. The use of two sensors with different operating principles, a metal-oxide (MOX) sensor and a photo-ionization detector (PID), makes it possible to confirm the presence of specific classes of chemicals in a contaminated area. All instruments are expected to be integrated into the payload of an unmanned aerial vehicle (UAV) and used for different purposes, such as critical infrastructure surveillance focused on the detection of volatile organic compounds (VOC) and chemical warfare agents (CWA), and post-incident monitoring of contamination levels. In this paper, the authors present the hardware set-up implemented and the tests realized with CWA simulants, and discuss the results obtained, presenting the advantages and disadvantages of this system in an application such as a UAV for the detection of chemical substances.
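The two-principle confirmation logic — requiring agreement between sensors with different operating principles before declaring a detection — can be sketched as a simple cross-check. Thresholds and reading scales here are hypothetical:

```python
def confirm_detection(mox_reading, pid_reading,
                      mox_threshold=0.5, pid_threshold=0.5):
    """Cross-confirm a chemical detection with two sensing principles.

    Illustrative logic only (thresholds and units are hypothetical): a
    metal-oxide (MOX) sensor responds broadly to reducing gases, while a
    photo-ionization detector (PID) responds to ionizable compounds;
    requiring both to fire reduces single-sensor false alarms.
    """
    mox_hit = mox_reading > mox_threshold
    pid_hit = pid_reading > pid_threshold
    if mox_hit and pid_hit:
        return "confirmed"
    if mox_hit or pid_hit:
        return "unconfirmed"
    return "clear"
```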

    Obstacle avoidance-based visual navigation for micro aerial vehicles

    This paper describes an obstacle avoidance system for low-cost Unmanned Aerial Vehicles (UAVs) that uses vision, through a monocular onboard camera, as its principal source of information. For obstacle detection, the proposed system compares the image obtained in real time from the UAV with a database of obstacles that must be avoided. Our proposal includes the feature point detector Speeded Up Robust Features (SURF) for fast obstacle detection, together with a control law to avoid the detected obstacles. Furthermore, our research includes a path recovery algorithm. Our method is attractive for compact MAVs in which other sensors cannot be implemented. The system was tested in real time on a Micro Aerial Vehicle (MAV) to detect and avoid obstacles in an unknown controlled environment, and we compared our approach with related works.
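The database-comparison step rests on descriptor matching. A generic sketch of the standard ratio test used to accept feature matches follows; it is a stand-in for the SURF matching stage, with tiny hypothetical descriptor vectors in place of real 64-dimensional SURF descriptors:

```python
import math

def ratio_test_matches(query_desc, db_desc, ratio=0.75):
    """Match query descriptors to a database with Lowe's ratio test.

    A query descriptor is accepted only if its nearest database
    descriptor is clearly closer than the second nearest (distance
    ratio below `ratio`); ambiguous matches are discarded.
    """
    matches = []
    for qi, q in enumerate(query_desc):
        dists = sorted((math.dist(q, d), di) for di, d in enumerate(db_desc))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:
            matches.append((qi, best[1]))  # (query index, database index)
    return matches
```

In a system like the one described, a high count of accepted matches against a database entry would signal that the corresponding obstacle is in view.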

    Perception-aware Path Planning

    In this paper, we give a double twist to the problem of planning under uncertainty. State-of-the-art planners seek to minimize the localization uncertainty by only considering the geometric structure of the scene. In this paper, we argue that motion planning for vision-controlled robots should be perception aware in that the robot should also favor texture-rich areas to minimize the localization uncertainty during a goal-reaching task. Thus, we describe how to optimally incorporate the photometric information (i.e., texture) of the scene, in addition to the geometric one, to compute the uncertainty of vision-based localization during path planning. To avoid the caveats of feature-based localization systems (i.e., dependence on feature type and user-defined thresholds), we use dense, direct methods. This allows us to compute the localization uncertainty directly from the intensity values of every pixel in the image. We also describe how to compute trajectories online, considering also scenarios with no prior knowledge about the map. The proposed framework is general and can easily be adapted to different robotic platforms and scenarios. The effectiveness of our approach is demonstrated with extensive experiments in both simulated and real-world environments using a vision-controlled micro aerial vehicle.
    Comment: 16 pages, 20 figures, revised version. Conditionally accepted for IEEE Transactions on Robotics.
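The intuition that texture-rich areas carry more localization information can be illustrated with a simple gradient-energy score computed directly from intensity values. This is a simplified proxy, not the paper's dense uncertainty propagation:

```python
def texture_score(img):
    """Sum of squared intensity gradients over an image patch.

    A crude proxy for how much photometric information a view offers to
    direct (intensity-based) localization: flat, textureless regions
    score near zero, while high-contrast texture scores high. `img` is
    a 2D list of grayscale intensities; forward differences are used.
    """
    h, w = len(img), len(img[0])
    score = 0.0
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]  # horizontal gradient
            gy = img[y + 1][x] - img[y][x]  # vertical gradient
            score += gx * gx + gy * gy
    return score

flat = [[100] * 4 for _ in range(4)]  # textureless wall
checker = [[0 if (x + y) % 2 else 255 for x in range(4)] for y in range(4)]
```

A perception-aware planner in the spirit of the abstract would bias trajectories toward viewpoints where such a score (and hence expected localization accuracy) is high.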

    Design of Miniaturized Sensors for a Mission-Oriented UAV Application: A New Pathway for Early Warning

    In recent decades, the increasing threats associated with Chemical and Radiological (CR) agents have prompted the development of new tools to detect and collect samples without endangering first responders inside contaminated areas. A particularly promising branch of these technological developments relates to the integration of different detectors and sampling systems with Unmanned Aerial Vehicles (UAV). The adoption of this equipment may bring significant benefits for both military and civilian implementations. For instance, instrumented UAVs could be used in support of specialist military teams, such as the Sampling and Identification of Biological, Chemical and Radiological Agents (SIBCRA) teams, tasked with sampling in contaminated areas, detecting the presence of CR substances in the field, and then confirming, collecting, and evaluating the actual threats. Furthermore, instrumented UAVs may find dual-use application in the civil world in support of emergency teams during industrial accidents and in the monitoring of critical infrastructures. Small-size drones equipped with different instruments for detection and sample collection may indeed enable several applications, becoming a versatile, easy-to-use tool in different fields, even carrying equipment normally used in manual operations. The authors hereby present the design of miniaturized sensors for a mission-oriented UAV application and the preliminary results from an experimental campaign performed in 2020.

    Robot Trajectories Comparison: A Statistical Approach

    The task of planning a collision-free trajectory from a start to a goal position is fundamental for an autonomous mobile robot. Although path planning has been extensively investigated since the beginning of robotics, there is no agreement on how to measure the performance of a motion planning algorithm. This paper presents a new approach to robot trajectory comparison that can be applied to any kind of trajectory, in both simulated and real environments. Given an initial set of features, it automatically selects the most significant ones and performs a statistical comparison using them. Additionally, a graphical data visualization, named polygraph, is provided to help interpret the obtained results. As an example, the proposed method has been applied to compare two different motion planners, FM2 and WaveFront, using different environments, robots, and local planners.
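A minimal sketch of such a statistical comparison on a single trajectory feature — path length, compared across planners with Welch's t statistic — might look like the following. The feature choice and any data fed to it are hypothetical; the paper's method selects significant features automatically:

```python
import math
from statistics import mean, variance

def path_length(traj):
    """Total Euclidean length of a 2D trajectory given as (x, y) points."""
    return sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two feature samples with unequal variances.

    Applied, e.g., to path lengths collected over repeated runs of two
    planners; a large |t| suggests the planners differ on that feature.
    """
    va, vb = variance(sample_a), variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(va / na + vb / nb)
```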

    3D MAPPING OF INDOOR ENVIRONMENTS USING RGB-D DATA

    This work introduces a workflow for 3D mapping of indoor environments using RGB-D data. The method exploits the integration of RGB images and depth values from the Kinect device. Five main steps involved in the development of the proposed method are discussed. The first step deals with point detection in the RGB image pair and the automatic establishment of correspondences. In the second step, a normalization of the RGB and IR images is proposed to associate the homologous points found in the RGB image pair with their correspondents in the depth image. In the third step, the XYZ coordinates of each point are computed. Next, the transformation parameters between pairs of 3D point clouds are estimated. Finally, a linear model is proposed for the global consistency analysis. To evaluate the efficiency and potential of the proposed method, four experiments were carried out in indoor environments. An evaluation of the relative accuracy of the sensor trajectory showed registration errors between point cloud pairs of around 3.0 cm.
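The step of computing XYZ coordinates for each point is typically a pinhole back-projection of pixel coordinates and depth. A sketch with hypothetical Kinect-like intrinsics (not the paper's calibrated parameters):

```python
def depth_to_xyz(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth z into camera-frame XYZ.

    Standard pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy,
    with (fx, fy) the focal lengths in pixels and (cx, cy) the principal
    point. Intrinsics used below are hypothetical Kinect-like values.
    """
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Hypothetical intrinsics for a 640x480 depth image.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5
```

Applying this to every pixel with a valid depth yields the per-frame point cloud that the subsequent pairwise registration step aligns.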