
    Smart environment monitoring through micro unmanned aerial vehicles

    In recent years, improvements in small-scale Unmanned Aerial Vehicles (UAVs) in terms of flight time, automatic control, and remote transmission have promoted the development of a wide range of practical applications. In aerial video surveillance, monitoring broad areas still poses many challenges because several tasks, including mosaicking, change detection, and object detection, must be performed in real time. In this thesis work, a vision system based on a small-scale UAV is proposed to maintain regular surveillance over target areas. The system works in two modes. The first mode monitors an area of interest across several flights. During the first flight, it creates an incremental geo-referenced mosaic of the area and classifies all the known elements (e.g., persons) found on the ground using a previously trained, improved Faster R-CNN architecture. In subsequent reconnaissance flights, the system searches the mosaic for any changes (e.g., the disappearance of persons) using an algorithm based on histogram equalization and RGB Local Binary Patterns (RGB-LBP); if changes are present, the mosaic is updated. The second mode performs real-time classification with the same improved Faster R-CNN model, which is useful for time-critical operations. Thanks to several design features, the system works in real time and performs mosaicking and change detection at low altitude, allowing even small objects to be classified. The proposed system was tested on the whole set of challenging video sequences in the UAV Mosaicking and Change Detection (UMCD) dataset and on other public datasets. Evaluation with well-known performance metrics has shown remarkable results in mosaic creation and updating, as well as in change detection and object detection.
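    A minimal sketch of the change-detection idea described above, assuming a chi-square comparison of per-channel uniform-LBP histograms between a mosaic patch and a newly acquired patch; the LBP parameters, block handling, and threshold are illustrative assumptions, not the thesis's exact algorithm.

# Illustrative change detection via histogram equalization + per-channel LBP.
# Parameters and the chi-square threshold are assumptions for this sketch.
import numpy as np
from skimage.exposure import equalize_hist
from skimage.feature import local_binary_pattern

P, R = 8, 1           # assumed LBP neighbourhood (8 samples, radius 1)
N_BINS = P + 2        # number of codes for the "uniform" LBP variant

def rgb_lbp_histogram(patch):
    """Concatenated uniform-LBP histograms of the three RGB channels."""
    hists = []
    for c in range(3):
        # Histogram equalization reduces illumination differences between flights.
        channel = (equalize_hist(patch[..., c]) * 255).astype(np.uint8)
        codes = local_binary_pattern(channel, P, R, method="uniform")
        h, _ = np.histogram(codes, bins=N_BINS, range=(0, N_BINS), density=True)
        hists.append(h)
    return np.concatenate(hists)

def has_changed(mosaic_patch, new_patch, threshold=0.25):
    """Flag a change when the chi-square distance between descriptors is large."""
    h1, h2 = rgb_lbp_histogram(mosaic_patch), rgb_lbp_histogram(new_patch)
    chi2 = 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + 1e-12))
    return chi2 > threshold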

    MusA: Using Indoor Positioning and Navigation to Enhance Cultural Experiences in a museum

    In recent years there has been growing interest in the use of multimedia mobile guides in museum environments. Mobile devices can detect the user's context and provide information that helps visitors discover and follow the logical and emotional connections that develop during a visit. In this scenario, location-based services (LBS) currently represent an asset, and the choice of the technology used to determine users' positions, combined with the definition of methods that can effectively convey information, becomes a key issue in the design process. In this work, we present MusA (Museum Assistant), a general framework for the development of multimedia interactive guides for mobile devices. Its main feature is a vision-based indoor positioning system that enables several LBS, from way-finding to the contextualized communication of cultural content, aimed at providing a meaningful exploration of exhibits according to visitors' personal interests and curiosity. Starting from a thorough description of the system architecture, the article presents the implementation of two mobile guides, developed to address adults and children respectively, and discusses the evaluation of the user experience and the visitors' appreciation of these applications.
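    The abstract does not detail the vision technique behind the positioning system; purely as a hypothetical illustration, a fiducial-marker lookup along these lines could drive such location-based services (the marker dictionary, IDs, and room map below are invented for the example).

# Hypothetical marker-based indoor positioning lookup (illustration only; MusA's
# actual vision-based positioning method is not described in this abstract).
# Uses the OpenCV >= 4.7 ArUco API.
import cv2

# Invented mapping from marker IDs to museum locations.
MARKER_TO_ROOM = {7: "Renaissance Hall", 12: "Egyptian Gallery"}

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

def locate_visitor(frame_bgr):
    """Return the room associated with the first detected marker, if any."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is None:
        return None
    return MARKER_TO_ROOM.get(int(ids.flatten()[0]))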

    Novel Image Mosaicking of UAV’s Imagery using Collinearity Condition

    This paper presents preliminary results of ongoing research on the use of unmanned aerial vehicles (UAVs) for cooperative mapping to support large-scale urban mapping in Malang, Indonesia. A small UAV can carry an embedded camera that continuously takes pictures of the landscape, and a convenient way of monitoring landscape changes is to examine such a sequence of images. However, since the camera's field of view is much smaller than the human eye's, the aerial pictures need to be combined into a single mosaic; a mosaic provides a more complete view of the scene that can be accessed and analyzed. Semi-automated generation of mosaics is investigated using a photogrammetric approach, namely a perspective projection based on the collinearity condition. The paper reviews the general projection model based on the collinearity condition and uses it to determine a common projective plane for the images. The overlapping points for each RGB channel are interpolated onto an orthographic plane to generate the mosaics. An initial attempt shows promising results.
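    For reference, the collinearity condition on which such a projection model is based takes the standard photogrammetric form, relating an image point (x, y), the principal point (x_0, y_0) and the focal length f to a ground point (X, Y, Z) through the camera position (X_S, Y_S, Z_S) and the rotation-matrix elements r_ij:

x - x_0 = -f \, \frac{r_{11}(X - X_S) + r_{12}(Y - Y_S) + r_{13}(Z - Z_S)}
                     {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)}

y - y_0 = -f \, \frac{r_{21}(X - X_S) + r_{22}(Y - Y_S) + r_{23}(Z - Z_S)}
                     {r_{31}(X - X_S) + r_{32}(Y - Y_S) + r_{33}(Z - Z_S)}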

    An evaluation of imagery from an unmanned aerial vehicle (UAV) for the mapping of intertidal macroalgae on Seal Sands, Tees Estuary, UK

    The Seal Sands area of Teesmouth is designated a Special Protection Area under the Habitats Directive because guideline concentrations of nutrients in coastal waters are exceeded. This may be responsible for the extensive growth of the green filamentous macroalga Enteromorpha sp., and the literature suggests that algal cover in the intertidal zone is detrimental to the feeding behaviour of wading bird species. Although numerous studies have highlighted the causes and consequences of macroalgal cover, the complex spatial and temporal dynamics of macroalgal bloom growth are not as well understood, and hence there is a need for a precise and cost-effective method for mapping and quantifying algal biomass. Previous studies have highlighted several image processing techniques that could be applied to high-resolution airborne imagery in order to predict algal biomass. To test these methods, high-resolution imagery was acquired over the Seal Sands area using a lightweight SmartPlanes SmartOne unmanned aerial vehicle (UAV) equipped with a near-infrared-sensitive 5-megapixel Canon IXUS compact camera, a standard 6-megapixel Canon IXUS compact camera and a Garmin Geko 201 handheld GPS device. Imagery was acquired in November 2006 and June 2007 in order to examine the spectral response of Enteromorpha sp. at different stages of a macroalgal growth cycle. Images were mosaicked and georeferenced using ground control points located with a Leica 1200 differential GPS and processed to allow analysis of their spectral and textural properties. Samples of macroalgal cover were collected, georeferenced and their dry biomass content obtained for ground truthing. Although textural entropy and inertia did not correlate significantly with macroalgal biomass, the normalised green-red difference index (NGRDI), the normalised difference vegetation index (NDVI) and colour saturation computed on the imagery showed a good degree of linear correlation with Enteromorpha sp. dry weight, achieving coefficients of determination in excess of r^2 = 0.6 for both the November 2006 and June 2007 image sets. Linear regression was used to establish predictive models to estimate macroalgal biomass from image spectral properties. Enteromorpha sp. biomass estimates of 71.4 g DW m^-2 and 7.9 g DW m^-2 were established for the November 2006 and June 2007 data acquisition sessions respectively. Despite the lack of previous biomass quantification for Seal Sands, the favourable performance of the UAV in terms of operating cost and the man hours required for image acquisition suggests that unmanned aerial vehicles may provide a viable method for mapping intertidal algal biomass on an annual basis.
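    A minimal sketch of the index computation and linear biomass model described above, assuming float reflectance bands of equal shape; the regression coefficients would be fitted to the ground-truth dry-weight samples.

# Sketch: spectral indices and a linear biomass model of the kind described above.
# Band arrays are assumed to be float reflectance values of equal shape.
import numpy as np

def ngrdi(green, red):
    """Normalised green-red difference index."""
    return (green - red) / (green + red + 1e-9)

def ndvi(nir, red):
    """Normalised difference vegetation index."""
    return (nir - red) / (nir + red + 1e-9)

def fit_biomass_model(index_values, dry_weights):
    """Least-squares line predicting dry biomass (g DW m^-2) from an index."""
    slope, intercept = np.polyfit(index_values, dry_weights, deg=1)
    return lambda idx: slope * idx + intercept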

    Monitorización 3D de cultivos y cartografía de malas hierbas mediante vehículos aéreos no tripulados para un uso sostenible de fitosanitarios

    In this doctoral thesis, images acquired from a UAV were used to address the sustainable application of plant protection products by generating maps that enable site-specific application. Two different and complementary approaches were developed to achieve this objective: 1) reducing early post-emergence herbicide applications by designing treatments targeted at the weed-infested zones of several herbaceous crops; and 2) three-dimensional characterization (architecture and volume) of woody crops for the design of site-specific plant protection treatments directed at their canopy. To address site-specific herbicide control, the configuration and technical specifications of a UAV and its on-board sensors were studied for early weed detection, contributing to the generation of maps for site-specific control in three herbaceous crops: maize, wheat and sunflower. Next, the most accurate spectral indices for discriminating bare soil from vegetation (crop and weeds) were evaluated on UAV images taken over these crops at an early growth stage. To automate this discrimination, a threshold-calculation method was implemented in an OBIA (object-based image analysis) environment. Finally, an automatic and robust OBIA methodology was developed to discriminate crop, bare soil and weeds in the three crops studied, and the influence of different UAV image-acquisition parameters (overlap, sensor type, flight altitude and flight scheduling, among others) on its performance was evaluated. In addition, to facilitate the design of plant protection treatments adjusted to the needs of woody crops, an automatic and robust OBIA methodology was developed for the three-dimensional characterization (architecture and volume) of woody crops using images and digital surface models generated from UAV imagery. The influence of different image-acquisition parameters (overlap, sensor type, flight altitude) on the performance of the designed OBIA algorithm was likewise evaluated.
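    As a rough illustration of the automated vegetation/bare-soil discrimination step (the thesis evaluates several spectral indices within an OBIA workflow; the excess-green index and Otsu thresholding below are stand-in assumptions, not the thesis's rule set):

# Illustrative pixel-level vegetation/bare-soil separation with an excess-green
# index and an automatically computed (Otsu) threshold. The thesis's OBIA
# methodology operates on image objects and is more elaborate than this sketch.
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_mask(rgb):
    """Boolean vegetation mask from a float RGB image scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b              # excess-green index
    return exg > threshold_otsu(exg)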

    Autonomous Service Drones for Multimodal Detection and Monitoring of Archaeological Sites

    Constant detection and monitoring of archaeological sites and objects have always been an important national goal for many countries. Early identification of changes is crucial to preventive conservation. Archaeologists have long considered using service drones to automate the collection of data on and below the ground surface of archaeological sites, with cost and technical barriers being the main hurdles against wide-scale deployment. Advances in thermal imaging, depth imaging, drones, and artificial intelligence have driven costs down and improved the quality and volume of data collected and processed. This paper proposes an end-to-end framework for archaeological site detection and monitoring using autonomous service drones. We mount RGB, depth, and thermal cameras on an autonomous drone for low-altitude data acquisition. To align and aggregate the collected images, we propose two-stage multimodal depth-to-RGB and thermal-to-RGB mosaicking algorithms. We then apply detection algorithms to the stitched images to identify change regions and design a user interface to monitor these regions over time. Our results show that we can create overlays of aligned thermal and depth data on RGB mosaics of archaeological sites. We tested our change detection algorithm and found that it has a root mean square error of 0.04. To validate the proposed framework, we tested our thermal image stitching pipeline against state-of-the-art commercial software: we cost-effectively replicated its functionality while adding a new depth-based modality, and we created a user interface for temporally monitoring changes in multimodal views of archaeological sites.
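    The paper's two-stage thermal-to-RGB mosaicking algorithm is not reproduced here; as a minimal sketch of the final overlay step only, a thermal frame can be warped onto the RGB mosaic from matched control points and alpha-blended (8-bit images and manually matched point pairs are assumed).

# Sketch of overlaying a thermal frame on an RGB mosaic given matched control
# points; assumes 8-bit inputs and is not the paper's two-stage algorithm.
import cv2
import numpy as np

def overlay_thermal(rgb_mosaic, thermal, pts_thermal, pts_rgb, alpha=0.4):
    """Warp the thermal image onto the mosaic and alpha-blend the two."""
    H, _mask = cv2.findHomography(np.float32(pts_thermal), np.float32(pts_rgb),
                                  cv2.RANSAC)
    h, w = rgb_mosaic.shape[:2]
    warped = cv2.warpPerspective(thermal, H, (w, h))
    heat = cv2.applyColorMap(warped, cv2.COLORMAP_JET)   # false-colour thermal
    return cv2.addWeighted(rgb_mosaic, 1 - alpha, heat, alpha, 0)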

    Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research

    Advances in automation and data science have led agriculturists to seek real-time, high-quality, high-volume crop data to accelerate crop improvement through breeding and to optimize agronomic practices. Breeders have recently gained massive data-collection capability through genome sequencing of plants, so collecting and analyzing phenotypic trait data faster, relative to genetic data, leads to faster and better selections in crop improvement. Furthermore, faster and higher-resolution crop data collection gives scientists and growers greater capability to improve precision-agriculture practices on increasingly larger farms, e.g., site-specific application of water and nutrients. Unmanned aerial vehicles (UAVs) have recently gained traction as agricultural data collection systems. Using UAVs for agricultural remote sensing is an innovative technology that differs from traditional remote sensing in more ways than strictly higher-resolution images; it provides many new and unique possibilities, as well as new and unique challenges. Herein we report on processes and lessons learned from year 1 (the summer 2015 and winter 2016 growing seasons) of a large multidisciplinary project evaluating UAV images across a range of breeding and agronomic research trials on a large research farm. Topics include team and project planning, UAV and sensor selection and integration, and the data collection and analysis workflow. The study involved many crops, spanning both breeding plots and agronomic fields. The project's goal was to develop methods for UAVs to collect high-quality, high-volume crop data with fast turnaround time to field scientists. The project included five teams: Administration, Flight Operations, Sensors, Data Management, and Field Research. Four case studies involving multiple crops in breeding and agronomic applications add practical descriptive detail. Lessons learned include critical information on sensors, air vehicles, and configuration parameters for both. Because this is the first and most comprehensive project of its kind to date, these lessons are particularly salient to researchers embarking on agricultural research with UAVs.