479 research outputs found

    Unmanned Aerial Vehicles (UAVs) in environmental biology: A Review

    Acquiring information about the environment is a key step in any study in the field of environmental biology, at every level from an individual species to communities and biomes. However, obtaining information about the environment is frequently difficult because of, for example, phenological timing, the spatial distribution of a species or the limited accessibility of a particular area for field survey. Moreover, remote sensing technology, which enables observation of the Earth's surface and is currently very common in environmental research, has many limitations, such as insufficient spatial, spectral and temporal resolution and a high cost of data acquisition. Since the 1990s, researchers have been exploring the potential of different types of unmanned aerial vehicles (UAVs) for monitoring the Earth's surface. The present study reviews recent scientific literature on the use of UAVs in environmental biology. From numerous papers, short communications and conference abstracts, we selected 110 original studies of how UAVs can be used in environmental biology and which organisms can be studied in this manner. Most of these studies concerned the use of UAVs to measure vegetation parameters such as crown height, volume and number of individuals (14 studies) and to quantify the spatio-temporal dynamics of vegetation changes (12 studies). UAVs were also frequently applied to count birds and mammals, especially those living in the water. The analytical part of the present study is divided into the following sections: (1) detecting, assessing and predicting threats to vegetation, (2) measuring the biophysical parameters of vegetation, (3) quantifying the dynamics of changes in plants and habitats and (4) population and behaviour studies of animals. Finally, we synthesise all this information to show, among other things, the advances in environmental biology made possible by UAV applications.
Considering that 33% of the studies found and included in this review were published in 2017 and 2018, the number and variety of applications of UAVs in environmental biology are expected to increase in the future.

    Detection of irrigation inhomogeneities in an olive grove using the NDRE vegetation index obtained from UAV images

    We have developed a simple photogrammetric method to identify heterogeneous areas of irrigated olive groves and vineyards using a commercial multispectral camera mounted on an unmanned aerial vehicle (UAV). By comparing the NDVI, GNDVI, SAVI and NDRE vegetation indices, we find that the latter reveals irrigation irregularities in an olive grove that are not discernible with the other indices. This may make NDRE particularly useful for identifying growth inhomogeneities in crops. Given that few satellite detectors are sensitive in the red-edge (RE) band, and none with the spatial resolution offered by UAVs, this finding has the potential to turn UAVs into a local farmer's favourite aid tool.
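The indices compared above are simple normalized band ratios. As an illustration (not the authors' code; the band arrays and the small epsilon guarding against division by zero are our own choices), NDRE and NDVI can be computed per pixel from the camera's reflectance bands:

```python
import numpy as np

def ndre(nir, red_edge):
    """Normalized Difference Red-Edge index: (NIR - RE) / (NIR + RE)."""
    nir = np.asarray(nir, dtype=float)
    red_edge = np.asarray(red_edge, dtype=float)
    return (nir - red_edge) / (nir + red_edge + 1e-10)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-10)
```

Both return values in [-1, 1]; the red-edge band sits between red and near-infrared, which is what makes NDRE sensitive to subtle canopy differences that saturate NDVI.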

    Advancements in Multi-temporal Remote Sensing Data Analysis Techniques for Precision Agriculture

    The abstract is provided in the attachment.

    Integrating uavs and canopy height models in vineyard management: A time-space approach

    The present study illustrates an operational approach for estimating individual and aggregate vineyard canopy volumes using three years of Tree-Row-Volume (TRV) measurements and remotely sensed imagery acquired with an unmanned aerial vehicle (UAV) Red-Green-Blue (RGB) digital camera, processed with MATLAB scripts and validated through ArcGIS tools. The TRV methodology was applied by sampling a different number of rows and plants (per row) each year, with the aim of evaluating the reliability and accuracy of this technique compared with a remote approach. The empirical results indicate that the tree-row volumes estimated from a UAV Canopy Height Model (CHM) differ by up to 50% from those measured in the field using the routine TRV technique in 2019; the difference is even higher on the two 2016 dates. These findings underline the importance of integrating techniques that mix proximal and remote sensing into routine vineyard agronomic practices, helping to reduce management costs and increase the environmental sustainability of traditional cultivation systems.
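For context, the classical ground-based TRV figure extrudes the measured canopy cross-section along the total row length contained in one hectare. A minimal sketch of that textbook formula follows (the study's exact sampling protocol may differ):

```python
def tree_row_volume(canopy_height_m, canopy_width_m, row_spacing_m):
    """Tree-Row-Volume in cubic metres per hectare.

    TRV = height * width * (10000 / row spacing): the canopy
    cross-section extruded along the metres of row per hectare.
    """
    row_length_per_ha = 10000.0 / row_spacing_m  # metres of row in one hectare
    return canopy_height_m * canopy_width_m * row_length_per_ha
```

For example, a 2 m tall, 1 m wide canopy on 2.5 m row spacing gives 8000 m³/ha, a figure directly comparable with the volume integrated under a UAV canopy height model.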

    Classification of 3D Point Clouds Using Color Vegetation Indices for Precision Viticulture and Digitizing Applications

    Remote sensing applied in the digital transformation of agriculture and, more particularly, in precision viticulture offers methods to map field spatial variability to support site-specific management strategies; these can be based on crop canopy characteristics such as row height or vegetation cover fraction, which require accurate three-dimensional (3D) information. To derive canopy information, a set of dense 3D point clouds was generated using photogrammetric techniques on images acquired by an RGB sensor onboard an unmanned aerial vehicle (UAV) in two testing vineyards on two different dates. In addition to the geometry, each point also stores information from the RGB color model, which was used to discriminate between vegetation and bare soil. To the best of our knowledge, the new methodology presented herein, consisting of linking point clouds with their spectral information, had not previously been applied to automatically estimate vine height. The novelty of this work therefore lies in applying color vegetation indices to point clouds for the automatic detection and classification of points representing vegetation, and in the subsequent determination of vine height using the heights of the points classified as soil as a reference. On-ground measurements of the heights of individual grapevines were compared with the heights estimated from the UAV point cloud, showing high determination coefficients (R² > 0.87) and a low root-mean-square error (0.070 m). This methodology offers new capabilities for the use of RGB sensors onboard UAV platforms as a tool for precision viticulture and digitizing applications.
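The general idea of classifying colored points with a color vegetation index and referencing vine height to the soil points can be sketched as follows. This is an illustrative simplification: the choice of the Excess Green index, the 0.05 threshold, and the single aggregate height are our assumptions, not the paper's exact pipeline:

```python
import numpy as np

def classify_and_height(points):
    """points: (N, 6) array of x, y, z, R, G, B (RGB in 0-255).

    Returns a boolean vegetation mask and a canopy height estimated
    against the median elevation of the soil-classified points.
    """
    rgb = points[:, 3:6].astype(float)
    total = rgb.sum(axis=1) + 1e-10
    r, g, b = (rgb / total[:, None]).T           # chromatic coordinates
    exg = 2.0 * g - r - b                        # Excess Green color index
    veg = exg > 0.05                             # illustrative threshold
    ground_z = np.median(points[~veg, 2])        # soil points as ground reference
    height = points[veg, 2].max() - ground_z if veg.any() else 0.0
    return veg, height
```

Using the soil points themselves as the ground reference avoids needing a separate digital terrain model, which is the key convenience of carrying spectral information on every point.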

    3D crop monitoring and weed mapping using unmanned aerial vehicles for the sustainable use of plant protection products

    In this doctoral thesis, images acquired from a UAV were used to address the sustainability of plant protection product application by generating maps that enable site-specific treatment. Two different and complementary approaches were developed to achieve this objective: 1) reducing early post-emergence herbicide application by designing treatments targeted at weed-infested zones in several herbaceous crops; and 2) three-dimensional characterization (architecture and volume) of woody crops to design site-specific plant protection treatments aimed at their canopy. To address site-specific herbicide control, the configuration and technical specifications of a UAV and its onboard sensors were studied for early weed detection, contributing to the generation of maps for site-specific control in three herbaceous crops: maize, wheat and sunflower. Next, the most accurate spectral indices were evaluated for discriminating bare soil from vegetation (crop and weeds) in UAV images taken over these crops at an early growth stage. To automate this discrimination, a threshold-calculation method was implemented in an OBIA environment. Finally, an automatic and robust OBIA methodology was developed for discriminating crop, bare soil and weeds in the three crops studied, and the influence of different UAV image acquisition parameters (overlap, sensor type, flight altitude and flight scheduling, among others) on its performance was evaluated.
In addition, to facilitate the design of plant protection treatments tailored to the needs of woody crops, an automatic and robust OBIA methodology was developed for the three-dimensional characterization (architecture and volume) of woody crops using images and digital surface models generated from UAV imagery. Likewise, the influence of different image acquisition parameters (overlap, sensor type, flight altitude) on the performance of the designed OBIA algorithm was evaluated.

    Detection and segmentation of vine canopy in ultra-high spatial resolution RGB imagery obtained from unmanned aerial vehicle (UAV): a case study in a commercial vineyard

    The use of Unmanned Aerial Vehicles (UAVs) in viticulture permits the capture of aerial Red-Green-Blue (RGB) images with ultra-high spatial resolution. Recent studies have demonstrated that RGB images can be used to monitor the spatial variability of vine biophysical parameters. However, estimating these parameters requires accurate and automated segmentation methods to extract the relevant information from RGB images, and manual segmentation of aerial images is a laborious and time-consuming process. Traditional classification methods have shown satisfactory results in the segmentation of RGB images for diverse applications and surfaces. In commercial vineyards, however, it is necessary to consider some particularities inherent to canopy size in vertical shoot-positioned (VSP) trellis systems, such as the shadow effect and the differing soil conditions in the inter-rows (mixed information from soil and weeds). Therefore, the objective of this study was to compare the performance of four classification methods (K-means, Artificial Neural Networks (ANN), Random Forest (RForest) and Spectral Indices (SI)) for detecting the canopy in a vineyard trained on VSP. Six flights were carried out from post-flowering to harvest in a commercial cv. Carménère vineyard using a low-cost UAV equipped with a conventional RGB camera. The results show that the ANN and the simple SI method, complemented with the Otsu method for thresholding, presented the best performance for detecting the vine canopy, with high overall accuracy values on all study days. Spectral indices performed best in detecting the Plant class (vine canopy), with an overall accuracy of around 0.99. However, considering the performance pixel by pixel, the spectral indices were not able to discriminate between the Soil and Shadow classes.
The best performance in classifying the three classes (Plant, Soil and Shadow) in vineyard RGB images was obtained when the SI values were used as input data to the trained methods (ANN and RForest), reaching overall accuracy values of around 0.98 with high sensitivity values for all three classes.
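The SI + Otsu pipeline mentioned above reduces to thresholding a per-pixel vegetation-index image. Otsu's method, which picks the threshold maximizing the between-class variance of the index histogram, is short enough to sketch (a generic NumPy implementation, not the authors' code):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()          # bin probabilities
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                            # weight of class below each bin
    w1 = 1.0 - w0                                # weight of class above
    cum_mean = np.cumsum(p * centers)
    mu0 = cum_mean / np.where(w0 > 0, w0, 1)     # mean of lower class
    mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 > 0, w1, 1)  # mean of upper class
    between = w0 * w1 * (mu0 - mu1) ** 2         # between-class variance
    return centers[np.argmax(between)]
```

Applied to a vegetation-index image, pixels above the returned threshold are labeled as canopy; this is what makes the SI approach fully automatic, with no training data required.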

    Local Motion Planner for Autonomous Navigation in Vineyards with a RGB-D Camera-Based Algorithm and Deep Learning Synergy

    With the advent of agriculture 3.0 and 4.0, researchers are increasingly focusing on the development of innovative smart farming and precision agriculture technologies by introducing automation and robotics into agricultural processes. Autonomous agricultural field machines have been gaining significant attention from farmers and industry as a way to reduce costs, human workload and required resources. Nevertheless, achieving sufficient autonomous navigation capabilities requires the cooperation of several processes: localization, mapping and path planning are just some of the steps that aim to provide the machine with the right set of skills to operate in semi-structured and unstructured environments. In this context, this study presents a low-cost local motion planner for autonomous navigation in vineyards based only on an RGB-D camera, low-end hardware and a dual-layer control algorithm. The first algorithm exploits the disparity map and its depth representation to generate a proportional control command for the robotic platform. Concurrently, a second back-up algorithm, based on representation learning and resilient to illumination variations, can take control of the machine in case of a momentary failure of the first block. Moreover, owing to the dual nature of the system, after initial training of the deep learning model on an initial dataset, the strict synergy between the two algorithms opens up the possibility of exploiting new, automatically labeled data coming from the field to extend the existing model's knowledge. The machine learning algorithm was trained and tested, using transfer learning, on images acquired during different field surveys in northern Italy and then optimized for on-device inference with model pruning and quantization. Finally, the overall system was validated with a customized robot platform in the relevant environment.
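As a deliberately simplified illustration of the first layer's idea (a proportional command derived from a disparity map; the paper's actual controller is certainly more elaborate), one can steer away from the image half with the higher mean disparity, i.e. the side with nearer obstacles:

```python
import numpy as np

def steer_from_disparity(disparity, gain=0.5):
    """Toy proportional controller over a stereo disparity map.

    Compares mean disparity in the left and right image halves and
    steers away from the nearer (higher-disparity) side. Returns an
    angular command in arbitrary units; gain is a tuning parameter.
    """
    h, w = disparity.shape
    left = np.nanmean(disparity[:, : w // 2])
    right = np.nanmean(disparity[:, w // 2 :])
    # positive command = turn left (obstacles closer on the right side)
    return gain * (right - left)
```

In a vineyard row, the two vine walls produce roughly symmetric disparity, so the command stays near zero while the robot is centred, which is exactly the behaviour a row-following proportional controller needs.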

    Remote sensing image fusion on 3D scenarios: A review of applications for agriculture and forestry

    Three-dimensional (3D) image mapping of real-world scenarios has great potential to provide the user with a more accurate scene understanding. This will enable, among other things, unsupervised automatic sampling of meaningful material classes from the target area for adaptive semi-supervised deep learning techniques. This path is already being taken by recent and fast-developing research in computational fields; however, some issues related to computationally expensive processes in the integration of multi-source sensing data remain. Recent studies focused on Earth observation and characterization are enhanced by the proliferation of Unmanned Aerial Vehicles (UAVs) and sensors able to capture massive datasets with high spatial resolution. In this scope, many approaches have been presented for 3D modeling, remote sensing, image processing and mapping, and multi-source data fusion. This survey aims to summarize previous work through the most relevant contributions to the reconstruction and analysis of 3D models of real scenarios using multispectral, thermal and hyperspectral imagery. The surveyed applications are focused on agriculture and forestry, since these fields concentrate most applications and are widely studied. Many challenges are currently being overcome by recent methods based on the reconstruction of multi-sensorial 3D scenarios. In parallel, the processing of large image datasets has recently been accelerated by General-Purpose Graphics Processing Unit (GPGPU) approaches, which are also summarized in this work. Finally, as a conclusion, some open issues and future research directions are presented.
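A recurring primitive in this kind of multi-sensor fusion is assigning values from a 2D image (e.g. a thermal frame) to a 3D point cloud by pinhole projection. A minimal sketch follows, assuming known camera intrinsics and points already expressed in the camera frame (the function name and nearest-neighbour sampling are our own simplifications):

```python
import numpy as np

def fuse_thermal(points_cam, thermal, fx, fy, cx, cy):
    """Assign a thermal value to each 3D point via pinhole projection.

    points_cam: (N, 3) points in the camera frame (z forward).
    thermal: (H, W) thermal image. fx, fy, cx, cy: intrinsics.
    Points projecting outside the image (or behind the camera) get NaN.
    """
    x, y, z = points_cam.T
    u = np.round(fx * x / z + cx).astype(int)    # column in the image
    v = np.round(fy * y / z + cy).astype(int)    # row in the image
    h, w = thermal.shape
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    out = np.full(len(points_cam), np.nan)
    out[valid] = thermal[v[valid], u[valid]]     # nearest-pixel sampling
    return out
```

Real pipelines add lens distortion correction, occlusion handling and sub-pixel interpolation, which is where much of the computational cost discussed in the survey comes from.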