
    AWARE: Platform for Autonomous self-deploying and operation of Wireless sensor-actuator networks cooperating with unmanned AeRial vehiclEs

    This paper presents the AWARE platform, which seeks to enable the cooperation of autonomous aerial vehicles with ground wireless sensor-actuator networks comprising both static nodes and mobile nodes carried by vehicles or people. In particular, the paper presents the middleware, the wireless sensor network, node deployment by means of an autonomous helicopter, and the surveillance and tracking functionalities of the platform. Furthermore, the paper presents the first general experiments of the AWARE project, which took place in March 2007 with the assistance of the Seville fire brigades.

    Cooperative Virtual Sensor for Fault Detection and Identification in Multi-UAV Applications

    This paper considers the problem of fault detection and identification (FDI) in applications carried out by a group of unmanned aerial vehicles (UAVs) with visual cameras. In many cases, the UAVs have cameras mounted onboard for other applications, and these cameras can be used as bearing-only sensors to estimate the relative orientation of another UAV. The idea is to exploit the redundant information provided by these sensors onboard each of the UAVs to increase safety and reliability, detecting faults in UAV internal sensors that cannot be detected by the UAVs themselves. Fault detection is based on the generation of residuals that compare the expected position of a UAV, considered as the target, with the measurements taken by one or more UAVs acting as observers that track the target UAV with their cameras. Depending on the number of available observers and the way they are used, a set of strategies and policies for fault detection is defined. When the target UAV is being visually tracked by two or more observers, it is possible to obtain an estimate of its 3D position that could replace damaged sensors. The accuracy and reliability of this vision-based cooperative virtual sensor (CVS) have been evaluated experimentally in a multivehicle indoor testbed with quadrotors, injecting faults into the data to validate the proposed fault detection methods.
    Funding: Comisión Europea H2020 644271; Comisión Europea FP7 288082; Ministerio de Economia, Industria y Competitividad DPI2015-71524-R; Ministerio de Economia, Industria y Competitividad DPI2014-5983-C2-1-R; Ministerio de Educación, Cultura y Deporte FP
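    As an illustration of the residual idea described above, the sketch below triangulates a target position from two observers' bearing measurements and compares it with the target's self-reported position. The function names, the least-squares triangulation step, and the fault threshold are assumptions chosen for this example, not the paper's implementation.

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Least-squares intersection of two bearing rays, each given by an observer
    position p and a unit direction d toward the target (illustrative only)."""
    def proj(d):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        return np.eye(3) - np.outer(d, d)
    P1, P2 = proj(d1), proj(d2)
    A = P1 + P2
    b = P1 @ np.asarray(p1, dtype=float) + P2 @ np.asarray(p2, dtype=float)
    return np.linalg.solve(A, b)  # ill-conditioned if the two bearings are near parallel

def position_residual(reported_pos, p1, d1, p2, d2):
    """Distance between the target's self-reported position and the vision-based estimate."""
    return np.linalg.norm(np.asarray(reported_pos, dtype=float) - triangulate(p1, d1, p2, d2))

FAULT_THRESHOLD_M = 0.5  # illustrative threshold, not from the paper

def is_faulty(reported_pos, p1, d1, p2, d2):
    """Flag a fault when the residual exceeds the threshold."""
    return position_residual(reported_pos, p1, d1, p2, d2) > FAULT_THRESHOLD_M
```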

    Evaluating the accuracy of vehicle tracking data obtained from Unmanned Aerial Vehicles

    This paper presents a methodology for tracking moving vehicles that integrates Unmanned Aerial Vehicles with video processing techniques. The authors investigated the usefulness of Unmanned Aerial Vehicles for capturing reliable individual vehicle data, using GPS technology as a benchmark. A video processing algorithm for vehicle trajectory acquisition, based on OpenCV libraries, is introduced. In order to assess the accuracy of the proposed video processing algorithm, an instrumented vehicle was equipped with a high-precision GPS. The video capture experiments were performed in two case studies, from which about 24,000 positioning data points were acquired for the analysis. The results of these experiments highlight the versatility of Unmanned Aerial Vehicle technology combined with video processing techniques in monitoring real traffic data.
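    A minimal sketch of an OpenCV-based trajectory-acquisition loop of the kind described above is given below, assuming a standard background-subtraction pipeline. The input file name, blob-area threshold, and subtractor parameters are placeholders; the paper's actual algorithm is not reproduced here.

```python
import cv2

# Background subtraction plus contour centroids as a simple stand-in tracker.
cap = cv2.VideoCapture("aerial_traffic.mp4")  # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

trajectories = []  # (frame_index, cx, cy) centroids in image coordinates
frame_index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) < 300:  # ignore small blobs (illustrative threshold)
            continue
        m = cv2.moments(contour)
        if m["m00"] > 0:
            trajectories.append((frame_index, m["m10"] / m["m00"], m["m01"] / m["m00"]))
    frame_index += 1
cap.release()
```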

    PAMPC: Perception-Aware Model Predictive Control for Quadrotors

    We present the first perception-aware model predictive control framework for quadrotors that unifies control and planning with respect to action and perception objectives. Our framework leverages numerical optimization to compute trajectories that satisfy the system dynamics and require control inputs within the limits of the platform. Simultaneously, it optimizes perception objectives for robust and reliable sensing by maximizing the visibility of a point of interest and minimizing its velocity in the image plane. Considering both perception and action objectives for motion planning and control is challenging due to the possible conflicts arising from their respective requirements. For example, for a quadrotor to track a reference trajectory, it needs to rotate to align its thrust with the direction of the desired acceleration. However, the perception objective might require minimizing such rotation to maximize the visibility of a point of interest. A model-based optimization framework, able to consider both perception and action objectives and couple them through the system dynamics, is therefore necessary. Our perception-aware model predictive control framework works in a receding-horizon fashion by iteratively solving a non-linear optimization problem. It is capable of running in real time, fully onboard our lightweight, small-scale quadrotor using a low-power ARM computer, together with a visual-inertial odometry pipeline. We validate our approach in experiments demonstrating (I) the contradiction between perception and action objectives, and (II) improved behavior in extremely challenging lighting conditions.
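    A rough sketch of how perception and action objectives can be combined into a single receding-horizon cost is shown below. The state layout (position plus a unit camera axis), the weights, and the finite-difference proxy for the point of interest's image-plane velocity are assumptions made for illustration; the paper couples these terms through the full quadrotor dynamics inside a nonlinear solver rather than evaluating a fixed trajectory.

```python
import numpy as np

def perception_aware_cost(state_traj, input_traj, ref_traj, poi_world,
                          w_track=1.0, w_vis=0.5, w_flow=0.1, w_input=0.01):
    """Stage-wise cost combining reference tracking, visibility of a point of
    interest (POI), the POI's apparent motion between steps, and input effort.
    Assumed state layout: [x, y, z, camera-axis unit vector (3)]."""
    poi_world = np.asarray(poi_world, dtype=float)
    cost = 0.0
    for k, u in enumerate(input_traj):
        x = np.asarray(state_traj[k], dtype=float)
        pos, cam_axis = x[:3], x[3:6]
        # Action objective: track the reference position.
        cost += w_track * np.sum((pos - np.asarray(ref_traj[k], dtype=float)) ** 2)
        # Perception objective: keep the POI close to the camera's optical axis.
        to_poi = poi_world - pos
        to_poi /= np.linalg.norm(to_poi)
        cost += w_vis * (1.0 - np.dot(cam_axis, to_poi))
        # Perception objective: penalize apparent motion of the POI between steps
        # (a crude finite-difference proxy for its velocity in the image plane).
        if k > 0:
            prev_dir = poi_world - np.asarray(state_traj[k - 1], dtype=float)[:3]
            prev_dir /= np.linalg.norm(prev_dir)
            cost += w_flow * np.sum((to_poi - prev_dir) ** 2)
        # Action objective: penalize control effort.
        cost += w_input * np.sum(np.asarray(u, dtype=float) ** 2)
    return cost
```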