413 research outputs found

    A Tracked Mobile Robotic Lab for Monitoring the Plants Volume and Health

    Precision agriculture has been increasingly recognized for its potential to improve agricultural productivity, reduce production costs, and minimize damage to the environment. This work presents the current stage of our research into a mobile platform equipped with different sensors for orchard monitoring and sensing. In particular, the mobile platform is conceived to monitor and assess both the geometric and volumetric conditions and the health state of the canopy. To this end, different sensors have been integrated and efficient data-processing algorithms implemented for reliable crop monitoring. Experimental tests were performed, yielding both a precise volume reconstruction of several plants and an NDVI map suitable for evaluating vegetation state.
    Bietresato, M.; Carabin, G.; D'Auria, D.; Gallo, R.; Gasparetto, A.; Ristorto, G.; Mazzetto, F.; Vidoni, R.; Scalera, L.
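    The NDVI map mentioned above follows, in its standard formulation, from near-infrared and red reflectance. As a minimal sketch (assuming co-registered NIR and red reflectance images; the abstract does not describe the authors' actual sensor pipeline), the per-pixel index can be computed as follows:

```python
import numpy as np

def ndvi_map(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red), clipped to [-1, 1].

    `nir` and `red` are assumed to be co-registered reflectance images.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    ndvi = (nir - red) / (nir + red + eps)  # eps guards against division by zero
    return np.clip(ndvi, -1.0, 1.0)

# A vigorous canopy pixel (high NIR, low red) yields an NDVI close to 1.
print(ndvi_map(np.array([[0.55]]), np.array([[0.08]])))  # ~[[0.746]]
```

    Thresholding such a map (healthy vegetation typically sits well above 0.3) is one common way to turn it into a vegetation-state evaluation.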

    Augmented Perception for Agricultural Robots Navigation

    Producing food in a sustainable way is becoming very challenging today due to the lack of skilled labor, the unaffordable cost of labor when available, and the limited returns for growers, a result of the low produce prices demanded by big supermarket chains in contrast to the ever-increasing costs of inputs such as fuel, chemicals, seeds, or water. Robotics emerges as a technological advance that can counterbalance some of these challenges, mainly in industrialized countries. However, the deployment of autonomous machines in open environments exposed to uncertainty and harsh ambient conditions poses an important challenge to reliability and safety. Consequently, a deep parametrization of the working environment in real time is necessary to achieve autonomous navigation. This article proposes a navigation strategy for guiding a robot along vineyard rows for field monitoring. Given that global positioning cannot be guaranteed permanently in any vineyard, the strategy is based on local perception and results from fusing three complementary technologies: 3D vision, lidar, and ultrasonics. Several perception-based navigation algorithms were developed between 2015 and 2019. After their comparison in real environments and conditions, results showed that the augmented perception derived from combining these three technologies provides a consistent basis for outlining the intelligent behavior of agricultural robots operating within orchards.
    This work was supported by the European Union Research and Innovation Programmes under Grant No. 737669 and Grant No. 610953.
    Rovira Más, F.; Sáiz Rubio, V.; Cuenca-Cuenca, A. (2021). Augmented Perception for Agricultural Robots Navigation. IEEE Sensors Journal, 21(10), 11712-11727. https://doi.org/10.1109/JSEN.2020.3016081
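    The article compares several perception-based algorithms; the sketch below is only a generic illustration of the underlying idea of redundant local perception, not the authors' method. Each modality (3D vision, lidar, ultrasonics) yields an estimate of the lateral offset from the row centreline, the valid estimates are fused, and a proportional steering correction is derived. All names, weights, and gains here are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RangeEstimate:
    """Lateral offset (m) from the row centreline, with a confidence weight."""
    offset: Optional[float]   # None when the sensor returns no valid reading
    weight: float             # relative trust in this modality

def fused_offset(estimates: list[RangeEstimate]) -> Optional[float]:
    """Weighted average of the valid per-sensor offsets (a generic fusion rule)."""
    valid = [e for e in estimates if e.offset is not None]
    if not valid:
        return None  # no perception available: caller should stop or slow down
    total_w = sum(e.weight for e in valid)
    return sum(e.offset * e.weight for e in valid) / total_w

def steering_command(offset: Optional[float], gain: float = 0.8) -> float:
    """Proportional steering correction (rad); zero when perception is lost."""
    return 0.0 if offset is None else -gain * offset

# Example: lidar and stereo vision roughly agree, the ultrasonic reading dropped out.
readings = [RangeEstimate(0.12, 0.5), RangeEstimate(0.10, 0.3), RangeEstimate(None, 0.2)]
print(steering_command(fused_offset(readings)))
```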

    Development and Validation of a LiDAR Scanner for 3D Evaluation of Soil Vegetal Coverage

    Water and wind erosion are serious problems because they cause a loss of soil productivity. Covering the soil, by means of cover crops or crop residues, is an effective tool to prevent wind and water erosion: soil coverage can slow wind at the surface, limit water runoff, and reduce direct soil evaporation. The spatial distribution of the residue is the main factor in successful soil protection. This work presents the design and validation of a prototype instrument that senses the height of vegetal crop residues using a short-range laser distance sensor (LiDAR) moved by a computer numerical control (CNC) mechanism. The results show that the height and composition of the vegetal soil coverage can be estimated with a high level of confidence.
    Micheletto, Matías Javier; Zubiaga, Luciano; Santos, Rodrigo Martín; Galantini, Juan Alberto; Cantamutto, Miguel Ángel; Orozco, Javier Darío (CONICET, Universidad Nacional del Sur and INTA, Argentina).
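    A downward-looking range sensor moved by a CNC mechanism essentially samples the distance to the first surface below it, so residue height follows from a known mounting height over the bare-soil reference. The sketch below illustrates that conversion; the 0.60 m mounting height and the sample readings are assumed values for illustration, not figures from the paper.

```python
import numpy as np

# Hypothetical setup: the laser distance sensor is carried by a CNC gantry at a
# known height above bare soil and reports the range to the first surface hit.
SENSOR_HEIGHT_M = 0.60  # assumed mounting height over the bare-soil reference

def residue_height(distances_m: np.ndarray) -> np.ndarray:
    """Convert downward-looking range readings into residue heights.

    height = mounting height - measured distance; readings below the soil
    reference (negative heights) are clipped to zero.
    """
    return np.clip(SENSOR_HEIGHT_M - distances_m, 0.0, None)

# Example scan line: 0.60 m means bare soil, shorter ranges mean standing residue.
scan = np.array([0.60, 0.58, 0.45, 0.44, 0.59])
print(residue_height(scan))          # per-point heights (m)
print(residue_height(scan).mean())   # mean coverage height along the scan line
```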

    Fruit detection in an apple orchard using a mobile terrestrial laser scanner

    The development of reliable fruit detection and localization systems provides an opportunity to improve crop value and management by limiting fruit spoilage and optimising harvesting practices. Most proposed systems for fruit detection are based on RGB cameras and are thus affected by intrinsic constraints, such as variable lighting conditions. This work presents a new technique that uses a mobile terrestrial laser scanner (MTLS) to detect and localise Fuji apples. An experimental test focused on Fuji apple trees (Malus domestica Borkh. cv. Fuji) was carried out. A 3D point cloud of the scene was generated using an MTLS composed of a Velodyne VLP-16 LiDAR sensor synchronised with an RTK-GNSS satellite navigation receiver. A reflectance analysis of tree elements was performed, obtaining mean apparent reflectance values of 28.9%, 29.1%, and 44.3% for leaves, branches and trunks, and apples, respectively. These results suggest that the apparent reflectance parameter (at 905 nm wavelength) can be useful for detecting apples. For that purpose, a four-step fruit detection algorithm was developed. By applying this algorithm, a localization success of 87.5%, an identification success of 82.4%, and an F1-score of 0.858 were obtained relative to the total number of fruits. These detection rates are similar to those obtained by RGB-based systems, but with the additional advantage of providing direct 3D fruit location information that is not affected by sunlight variations. From the experimental results, it can be concluded that LiDAR-based technology and, particularly, its reflectance information have potential for remote apple detection and 3D location.
    This work was partly funded by the Secretaria d'Universitats i Recerca del Departament d'Empresa i Coneixement de la Generalitat de Catalunya (grant 2017 SGR 646), the Spanish Ministry of Economy and Competitiveness (projects AGL2013-48297-C2-2-R and MALEGRA, TEC2016-75976-R), and the Spanish Ministry of Science, Innovation and Universities (project RTI2018-094222-B-I00). The Spanish Ministry of Education is thanked for Mr. J. Gené's pre-doctoral fellowship (FPU15/03355). The work of Jordi Llorens was supported by the Spanish Ministry of Economy, Industry and Competitiveness through a Juan de la Cierva Incorporación postdoctoral position (JDCI-2016-29464_N18003). We would also like to thank CONICYT/FONDECYT (grant 1171431) and CONICYT FB0008. Nufri (especially Santiago Salamero and Oriol Morreres) and Vicens Maquinària Agrícola S.A. are also thanked for their support during the data acquisition.
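    The paper's own four-step algorithm is not reproduced here; the sketch below only illustrates the general idea suggested by the reflectance analysis: keep points whose apparent reflectance exceeds a threshold chosen between the foliage (~29%) and apple (~44%) means, then group the surviving points into candidate fruits. The 0.40 threshold and clustering parameters are assumptions, and scikit-learn's DBSCAN merely stands in for whatever clustering step the authors used.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_apple_candidates(points_xyz: np.ndarray,
                            reflectance: np.ndarray,
                            refl_threshold: float = 0.40,
                            eps: float = 0.05,
                            min_samples: int = 10) -> list:
    """Return candidate fruit centroids from a LiDAR point cloud.

    Step 1: keep points whose apparent reflectance exceeds the threshold
    (apples reflect more at 905 nm than leaves or wood in the reported data).
    Step 2: cluster the surviving points and report one centroid per cluster.
    """
    bright = points_xyz[reflectance > refl_threshold]
    if len(bright) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(bright).labels_
    return [bright[labels == k].mean(axis=0) for k in set(labels) if k != -1]

def f1_score(true_pos: int, false_pos: int, false_neg: int) -> float:
    """F1 = 2 * precision * recall / (precision + recall)."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return 2 * precision * recall / (precision + recall)
```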

    Assessment of the Accuracy of a Multi-Beam LED Scanner Sensor for Measuring Olive Canopies

    Canopy characterization has become important when trying to optimize any kind of agricultural operation in tall crops, such as olive. Many sensors and techniques have reported satisfactory results for this purpose, and in this work a 2D laser scanner was explored for measuring tree canopies under real-time conditions. The sensor was tested in both laboratory and field conditions to check its accuracy, its cone width, and its ability to characterize olive canopies in situ. In the laboratory, the sensor was mounted on a mast to check: (i) its accuracy at different measurement distances; (ii) its measurement cone width with targets of different reflectivity; and (iii) the influence of target density on its accuracy. The field tests involved both isolated and hedgerow orchards, in which measurements were taken manually and with the sensor. Canopy volume was estimated with a methodology consisting of revolving or extruding the canopy contour. The sensor showed high accuracy in the laboratory tests, except for measurements performed at a distance of 1.0 m, which had a 60 mm error (6%); otherwise, the error remained below 20 mm (1% relative error). The cone width depended on the target reflectivity, and the accuracy decreased with target density.
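    The volume methodology described above (revolving the contour for isolated trees, extruding it for hedgerows) can be sketched as follows; the height/width profile in the example is invented for illustration, and the trapezoidal rule is only one possible discretisation of the contour integral.

```python
import numpy as np

def _trapezoid(y: np.ndarray, x: np.ndarray) -> float:
    """Trapezoidal-rule integral of y(x) over the sampled profile."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

def volume_by_revolution(heights_m: np.ndarray, radii_m: np.ndarray) -> float:
    """Isolated-tree canopy volume: revolve the half-contour, V = pi * integral of r(h)^2 dh."""
    return np.pi * _trapezoid(radii_m ** 2, heights_m)

def volume_by_extrusion(heights_m: np.ndarray, half_widths_m: np.ndarray,
                        row_length_m: float) -> float:
    """Hedgerow canopy volume: cross-section area (integral of 2*w(h) dh) times row length."""
    return _trapezoid(2.0 * half_widths_m, heights_m) * row_length_m

# Illustrative contour sampled every 0.5 m of height (not measured data).
h = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
r = np.array([0.2, 0.8, 1.1, 0.9, 0.3])
print(volume_by_revolution(h, r))                     # m^3, isolated tree
print(volume_by_extrusion(h, r, row_length_m=10.0))   # m^3, 10 m hedgerow segment
```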

    Advanced technologies for the improvement of spray application techniques in Spanish viticulture: an overview

    Spraying techniques have undergone continuous evolution in recent decades. This paper presents part of the research work carried out in Spain in the field of sensors for characterizing vineyard canopies and monitoring spray drift, with the aim of improving vineyard spraying and making it more sustainable. Some methods and geostatistical procedures for mapping vineyard parameters are proposed, and the development of a variable-rate sprayer is described. All these technologies are of interest for adjusting the amount of pesticide applied to the target canopy.

    Classification of 3D Point Clouds Using Color Vegetation Indices for Precision Viticulture and Digitizing Applications

    Remote sensing applied to the digital transformation of agriculture and, more particularly, to precision viticulture offers methods to map field spatial variability in support of site-specific management strategies; these can be based on crop canopy characteristics such as row height or vegetation cover fraction, which require accurate three-dimensional (3D) information. To derive canopy information, a set of dense 3D point clouds was generated using photogrammetric techniques on images acquired by an RGB sensor onboard an unmanned aerial vehicle (UAV) in two testing vineyards on two different dates. In addition to the geometry, each point also stores information from the RGB color model, which was used to discriminate between vegetation and bare soil. To the best of our knowledge, the methodology presented here, which links point clouds with their spectral information, had not previously been applied to automatically estimate vine height. The novelty of this work therefore lies in applying color vegetation indices to point clouds for the automatic detection and classification of points representing vegetation, and in the subsequent ability to determine the height of vines using the heights of the points classified as soil as a reference. Results from on-ground measurements of the heights of individual grapevines were compared with the heights estimated from the UAV point cloud, showing high determination coefficients (R² > 0.87) and a low root-mean-square error (0.070 m). This methodology offers new capabilities for the use of RGB sensors onboard UAV platforms as a tool for precision viticulture and digitizing applications.
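    As a rough sketch of the kind of processing described (classify points as vegetation via a colour vegetation index, then measure heights against the points classified as soil), the example below uses the Excess Green index with a fixed threshold and a single global soil reference. The 0.10 threshold and the global soil median are simplifications assumed here, not the paper's implementation, which the abstract does not detail.

```python
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """ExG = 2g - r - b on chromatic (sum-normalised) coordinates, per point."""
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=1, keepdims=True) + 1e-9
    r, g, b = (rgb / s).T
    return 2.0 * g - r - b

def vine_heights(points_xyz: np.ndarray, rgb: np.ndarray,
                 exg_threshold: float = 0.10) -> np.ndarray:
    """Height of vegetation points above the soil reference.

    Points with ExG above the threshold are treated as vegetation, the rest as
    bare soil; each vegetation point's height is its elevation minus the median
    elevation of the soil points (a stand-in for a per-cell soil model).
    """
    veg_mask = excess_green(rgb) > exg_threshold
    soil_z = np.median(points_xyz[~veg_mask, 2])
    return points_xyz[veg_mask, 2] - soil_z

# Example: two green points roughly 2 m above three brownish soil points.
xyz = np.array([[0, 0, 2.0], [1, 0, 1.9], [0, 1, 0.0], [1, 1, 0.05], [2, 1, 0.0]])
cols = np.array([[60, 160, 70], [55, 150, 65], [120, 100, 80], [125, 105, 85], [118, 98, 82]])
print(vine_heights(xyz, cols))  # ~[2.0, 1.9]
```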