167 research outputs found

    Evaluation of the UAV-Based Multispectral Imagery and Its Application for Crop Intra-Field Nitrogen Monitoring and Yield Prediction in Ontario

    Unmanned aerial vehicles (UAVs) can acquire images with high spatial and temporal resolution, filling the data gap between satellite observation and ground survey in agriculture. At the same time, UAV-based crop monitoring methods pose new challenges for agricultural remote sensing. First, this thesis investigates the potential of UAV-based imagery to monitor spatial and temporal variation in crop status in comparison with RapidEye. The correlations between red-edge indices and both LAI and biomass were higher for the UAV-based imagery than for RapidEye. Second, nitrogen weight and yield in wheat were predicted from the UAV-based imagery. The intra-field nitrogen prediction model performed well at early wheat growth stages, and the best data collection time for yield prediction was at the end of the booting stage. The results demonstrate that UAV-based data could be an effective and affordable alternative for intra-field management by farmers.
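    The red-edge comparison above lends itself to a short worked example. The sketch below is a minimal illustration, not the thesis workflow: it computes a normalized difference red-edge index (NDRE) from per-plot mean reflectances and correlates it with field-measured LAI. All arrays and values are hypothetical.

```python
import numpy as np

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index from reflectance arrays."""
    nir, red_edge = np.asarray(nir, float), np.asarray(red_edge, float)
    return (nir - red_edge) / (nir + red_edge + 1e-10)

# Hypothetical per-plot mean reflectances and field-measured LAI
plot_nir      = np.array([0.52, 0.58, 0.66, 0.61, 0.70])
plot_red_edge = np.array([0.28, 0.26, 0.22, 0.24, 0.20])
plot_lai      = np.array([1.2, 2.1, 3.4, 2.8, 3.9])

# Pearson correlation between the plot-level index and LAI
r = np.corrcoef(ndre(plot_nir, plot_red_edge), plot_lai)[0, 1]
print(f"Pearson r between NDRE and LAI: {r:.2f}")
```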

    Fusion of multispectral imagery and spectrometer data in UAV remote sensing

    High-spatial-resolution hyperspectral data, often used in precision farming applications, are not available from current satellite sensors and are difficult or expensive to acquire from standard aircraft. Unmanned aerial vehicles (UAVs) are instead emerging as a lower-cost and more flexible means of acquiring very high resolution imagery for precision farming. Miniaturized hyperspectral sensors have been developed for UAVs, but th

    Use of unmanned aircraft systems (UAS) and multispectral imagery for quantifying agricultural areas damaged by wild pigs

    Wild pigs (Sus scrofa) cause extensive damage to agricultural crops, resulting in lost production and income. A major challenge in assessing damage to crops is locating and quantifying damaged areas within agricultural fields. We evaluated a novel method using multispectral high-resolution aerial imagery, collected from sensors mounted on unmanned aircraft systems (UAS), and feature extraction techniques to detect and map areas of corn fields damaged by wild pigs in southern Missouri, USA. Damaged areas were extracted from orthomosaics using visible and near-infrared band combinations, an object-based classification approach, and hierarchical learning cycles. To validate estimates, we also collected ground reference data immediately following flights. Overall accuracy of damage estimates for corn fields was similar among the band combinations evaluated, ranging from 74% to 98% when using visible and near-infrared information, compared to 72%–94% with visible information alone. By including near-infrared with visible information, though, we found higher average kappa values (0.76) than with visible information alone (0.60). We demonstrated that UAS are an appropriate platform for collecting high-resolution multispectral imagery of corn fields and that object-oriented classifiers can be used effectively to delineate areas damaged by wild pigs. The proposed approach outlines a new monitoring technique that can efficiently estimate damage to entire corn fields caused by wild pigs and has the potential to be applied to other crop types.
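    The accuracy and kappa figures reported above follow from standard confusion-matrix statistics. The sketch below is not the study's code and the counts are invented; it only shows how overall accuracy and Cohen's kappa are derived from a reference-versus-mapped confusion matrix.

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    cm = cm.astype(float)
    n = cm.sum()
    observed = np.trace(cm) / n                                # agreement on the diagonal
    expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Illustrative 2x2 matrix: damaged vs. undamaged corn (made-up counts)
cm = np.array([[180, 20],
               [15, 285]])
oa, kappa = accuracy_and_kappa(cm)
print(f"Overall accuracy: {oa:.2%}, kappa: {kappa:.2f}")
```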

    Validation of Digital Surface Models (DSMs) Retrieved From Unmanned Aerial Vehicle (UAV) Point Clouds Using Geometrical Information From Shadows

    Theoretically, the appearance of shadows in aerial imagery is not desirable for researchers because it leads to errors in object classification and bias in the calculation of indices. In contrast, shadows contain useful geometrical information about the objects blocking the light. Several studies have focused on estimating building heights in urban areas using the length of shadows. This type of information can be used to predict the population of a region, water demand, and other quantities in urban areas. With the emergence of unmanned aerial vehicles (UAVs) and the availability of high- to super-high-resolution imagery, questions relating to shadows have received more attention. Three-dimensional imagery generated using UAV-based photogrammetric techniques can be very useful, particularly in agricultural applications such as the development of an empirical equation between biomass or yield and the geometrical information of canopies or crops. However, evaluating the accuracy of the canopy or crop height requires labor-intensive effort. In contrast, the geometrical relationship between the length of the shadows and the crop or canopy height can be inverted using the measured shadow length. In this study, object heights retrieved from UAV point clouds are validated using geometrical shadow information retrieved from three sets of high-resolution imagery captured by Utah State University's AggieAir UAV system. These flights were conducted in 2014 and 2015 over a commercial vineyard located in California for the USDA Agricultural Research Service Grape Remote Sensing Atmospheric Profile and Evapotranspiration Experiment (GRAPEX) Program. The results showed that, although this approach can be computationally expensive, it is faster than fieldwork and does not require an expensive and accurate instrument such as a real-time kinematic (RTK) GPS.
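    On flat ground, the shadow-to-height inversion described above reduces to simple trigonometry with the solar elevation angle. The sketch below illustrates only that relationship; the shadow length and sun angle are invented, not taken from the GRAPEX flights.

```python
import math

def height_from_shadow(shadow_length_m, solar_elevation_deg):
    """On flat ground an object of height h casts a shadow of length
    L = h / tan(elevation), so the height can be recovered as
    h = L * tan(elevation)."""
    return shadow_length_m * math.tan(math.radians(solar_elevation_deg))

# Illustrative values only
print(f"{height_from_shadow(2.4, 42.0):.2f} m")
```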

    Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries

    Citation: Haghighattalab, A., Perez, L. G., Mondal, S., Singh, D., Schinstock, D., Rutkoski, J., . . . Poland, J. (2016). Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries. Plant Methods, 12, 15. https://doi.org/10.1186/s13007-016-0134-6
    Background: Low-cost unmanned aerial systems (UAS) have great potential for rapid proximal measurements of plants in agriculture. In the context of plant breeding and genetics, current approaches for phenotyping a large number of breeding lines under field conditions require substantial investments in time, cost, and labor. For field-based high-throughput phenotyping (HTP), UAS platforms can provide high-resolution measurements for small-plot research while enabling the rapid assessment of tens of thousands of field plots. The objective of this study was to complete a baseline assessment of the utility of UAS in assessing field trials as commonly implemented in wheat breeding programs. We developed a semi-automated image-processing pipeline to extract plot-level data from UAS imagery. The image dataset was processed using a photogrammetric pipeline based on image orientation and radiometric calibration to produce orthomosaic images. We also examined the relationships between vegetation indices (VIs) extracted from high-spatial-resolution multispectral imagery collected with two different UAS (an eBee Ag carrying a MultiSpec 4C camera, and an IRIS+ quadcopter carrying a modified NIR Canon S100) and ground-truth spectral data from a hand-held spectroradiometer. Results: We found good correlation between the VIs obtained from the UAS platforms and ground-truth measurements and observed high broad-sense heritability for VIs. We determined that radiometric calibration methods developed for satellite imagery significantly improved the precision of VIs from the UAS. We observed that VIs extracted from calibrated images of the Canon S100 had a significantly higher correlation to the spectroradiometer (r = 0.76) than VIs from the MultiSpec 4C camera (r = 0.64). Their correlation to spectroradiometer readings was as high as or higher than repeated measurements with the spectroradiometer per se. Conclusion: The approaches described here for UAS imaging and extraction of proximal sensing data enable collection of HTP measurements on the scale and with the precision needed for powerful selection tools in plant breeding. Low-cost UAS platforms have great potential for use as a selection tool in plant breeding programs. In the scope of tool development, the pipeline developed in this study can be effectively employed for other UAS and for other crops planted in breeding nurseries.
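    The broad-sense heritability reported for the VIs is an entry-mean estimate from replicated plots. The sketch below is a generic one-way ANOVA version of that calculation, not the authors' code; the genotype-by-replicate NDVI values are invented.

```python
import numpy as np

def broad_sense_heritability(values):
    """Entry-mean broad-sense heritability H^2 = Vg / (Vg + Ve / r) from a
    genotype-by-replicate matrix of a vegetation index (one-way ANOVA)."""
    values = np.asarray(values, dtype=float)
    n_geno, n_rep = values.shape
    grand = values.mean()
    geno_means = values.mean(axis=1)
    ms_geno = n_rep * ((geno_means - grand) ** 2).sum() / (n_geno - 1)
    ms_error = ((values - geno_means[:, None]) ** 2).sum() / (n_geno * (n_rep - 1))
    var_g = max((ms_geno - ms_error) / n_rep, 0.0)   # genotypic variance component
    return var_g / (var_g + ms_error / n_rep)

# Illustrative NDVI values: 5 breeding lines x 3 replicate plots (made up)
vi = np.array([[0.61, 0.63, 0.60],
               [0.55, 0.54, 0.57],
               [0.70, 0.68, 0.71],
               [0.48, 0.50, 0.47],
               [0.66, 0.64, 0.65]])
print(f"H^2 = {broad_sense_heritability(vi):.2f}")
```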

    Framework for semi-automated object-based image classification of invasive alien plant species in South Africa: Harrisia pomanensis as a case study

    Invasive alien plants (IAPs) not only pose a serious threat to biodiversity and water resources but also affect human and animal wellbeing. An important step in IAP management is mapping their location, as there is a strong correlation between the spatial extent of an invaded area and the effort required for clearing the plant invasion. However, traditional GPS-based IAP mapping field campaigns are costly, time consuming and labour intensive. Developments in unmanned aerial vehicle (UAV) technology have afforded the remote sensing (RS) community the opportunity to map IAPs at enhanced temporal and spatial resolutions. This framework therefore synthesises a UAV-RS approach for mapping invasive alien plants in South African semi-arid woodlands using Harrisia pomanensis (the midnight lady) as a case study. In particular, the framework outlines procedures for geometric and radiometric calibration of UAV-derived orthomosaics as well as a semi-automated object-based image classification technique for mapping IAPs. The geometric calibration was conducted in the Agisoft Lens software package to determine the camera interior orientation parameters. Since sample photos of the LCD screen were taken from a short range, there were more radial than tangential distortions. In addition, a scene illumination uniformity statistical inference allowed for radiometric calibration of the entire scene using parameters derived from radiometric calibration targets placed in only one spot within the study area, using the empirical line method (ELM). Accuracy assessment of the radiometric calibration gave a correlation coefficient (r) of 0.977 between in situ measured reflectance and the reflectance values derived from the calibrated image wavebands. This strong correlation validated the proposed UAV-RS ELM-based radiometric calibration method for applications in semi-arid woodlands. Furthermore, out of the five evaluated image classifiers, the case study demonstrated that the object-based supervised Bhattacharya classifier, which gave producer and user accuracies of 90% and 95.7% respectively, produced the most accurate results for mapping Harrisia pomanensis. An area-based accuracy assessment also showed that the Bhattacharya classifier mapped Harrisia pomanensis better than the Maxver classifier (the second-best algorithm), with mapping accuracy averages of 86.1% and 65.2% respectively across the different polygon area sizes. Future research should ascertain whether radiometric calibration increases mapping accuracy in large-scale (>100 ha) UAV-RS applications. Dissertation (MSc), University of Pretoria, 2018.
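    The empirical line method mentioned above fits a per-band linear mapping from image digital numbers to surface reflectance using targets of known reflectance. The sketch below is a generic illustration of that fit, not the dissertation's workflow; the panel digital numbers and reflectances are invented.

```python
import numpy as np

def empirical_line_fit(panel_dn, panel_reflectance):
    """Fit the empirical line method for one band: reflectance = gain * DN + offset,
    using image digital numbers over calibration targets of known reflectance."""
    gain, offset = np.polyfit(panel_dn, panel_reflectance, deg=1)
    return gain, offset

# Illustrative targets: dark, grey and bright panels (made-up values)
panel_dn = np.array([2100.0, 14500.0, 30500.0])
panel_reflectance = np.array([0.03, 0.23, 0.51])

gain, offset = empirical_line_fit(panel_dn, panel_reflectance)

# Apply the fitted line to a hypothetical single-band DN raster
band_dn = np.array([[5000, 12000], [22000, 31000]], dtype=float)
band_reflectance = gain * band_dn + offset
print(band_reflectance)
```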

    Vegetation Mapping of a Coastal Dune Complex Using Multispectral Imagery Acquired from an Unmanned Aerial System

    Vegetation mapping, identifying the distribution of plant species, is important for analysing vegetation dynamics, quantifying spatial patterns of vegetation evolution, analysing the effects of environmental changes on vegetation, and predicting spatial patterns of species diversity. Such analysis can contribute to the development of targeted land management actions that maintain biodiversity and ecological functions. This paper presents a methodology for 3D vegetation mapping of a coastal dune complex using a multispectral camera mounted on an Unmanned Aerial System (UAS), with particular reference to the Buckroney dune complex in Co. Wicklow, Ireland. UAS, also known as Unmanned Aerial Vehicles (UAVs) or drones, have enabled high-resolution and high-accuracy ground-based data to be gathered quickly and easily on-site. The Sequoia multispectral camera used in this study has green, red, red-edge and near-infrared wavebands, and a normal RGB camera, to capture both visible and NIR images of the land surface. The workflow of 3D vegetation mapping of the study site included establishing ground control points, planning the flight mission and camera parameters, acquiring the imagery, processing the image data and performing feature classification. The data processing outcomes include an orthomosaic model, a 3D surface model and multispectral images of the study site, in the Irish Transverse Mercator coordinate system, with a planimetric resolution of 0.024 m and a georeferenced root-mean-square (RMS) error of 0.111 m. There were 235 sample areas (1 m × 1 m) used for the accuracy assessment of the classification of the vegetation mapping. Feature classification was conducted using three different classification strategies to examine the efficiency of multispectral sensor data for vegetation mapping. Vegetation type classification accuracies ranged from 60% to 70%. This research illustrates the efficiency of data collection at the Buckroney dune complex and the high accuracy and high resolution of the vegetation mapping of the site using a multispectral sensor mounted on a UAS.
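    The georeferenced RMS error quoted above is the planimetric root-mean-square difference between surveyed check points and their positions in the processed outputs. The sketch below shows that calculation in generic form; the coordinates are invented, not the study's control points.

```python
import numpy as np

def georef_rms_error(mapped_xy, surveyed_xy):
    """Planimetric RMS error from check points: mapped positions in the
    orthomosaic versus surveyed ground coordinates (both in metres)."""
    diffs = np.asarray(mapped_xy, dtype=float) - np.asarray(surveyed_xy, dtype=float)
    return np.sqrt((diffs ** 2).sum(axis=1).mean())

# Illustrative planar coordinates in metres (made-up values)
surveyed = [(729810.32, 684020.11), (729920.80, 684115.47), (730030.05, 683950.88)]
mapped   = [(729810.41, 684020.02), (729920.69, 684115.59), (730030.12, 683950.80)]
print(f"RMS error: {georef_rms_error(mapped, surveyed):.3f} m")
```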

    Unmanned Aerial Vehicle (UAV) for monitoring soil erosion in Morocco

    This article presents an environmental remote sensing application using a UAV that is specifically aimed at reducing the data gap between the field scale and the satellite scale in soil erosion monitoring in Morocco. A fixed-wing aircraft of type Sirius I (MAVinci, Germany) equipped with a digital system camera (Panasonic) is employed. UAV surveys are conducted over different study sites with varying extents and flying heights in order to provide both very high resolution site-specific data and lower-resolution overviews, thus fully exploiting the large potential of the chosen UAV for multi-scale mapping purposes. Depending on the scale and area coverage, two different approaches to georeferencing are used, based on high-precision GCPs or on the UAV's log file with exterior orientation values, respectively. The photogrammetric image processing enables the creation of Digital Terrain Models (DTMs) and ortho-image mosaics with very high resolution at a sub-decimetre level. The created data products were used for quantifying gully and badland erosion in 2D and 3D, as well as for the analysis of the surrounding areas and landscape development over larger extents.
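    Quantifying erosion in 3D from repeat DTMs typically reduces to differencing co-registered elevation models and summing the negative change over the cell area. The sketch below illustrates that idea only; the patches, cell size and values are invented and this is not the article's processing chain.

```python
import numpy as np

def erosion_volume(dtm_before, dtm_after, cell_size_m):
    """Volume of material lost between two co-registered DTMs (metres):
    sum of negative elevation change times the cell area."""
    dz = np.asarray(dtm_after, dtype=float) - np.asarray(dtm_before, dtype=float)
    lost = np.where(dz < 0, -dz, 0.0)   # keep only lowering of the surface
    return lost.sum() * cell_size_m ** 2

# Illustrative 3x3 DTM patches, elevations in metres (made-up values)
before = np.array([[101.2, 101.1, 101.0],
                   [101.3, 101.2, 101.1],
                   [101.4, 101.3, 101.2]])
after = np.array([[101.2, 101.0, 100.9],
                  [101.3, 101.0, 101.0],
                  [101.4, 101.2, 101.1]])
print(f"Eroded volume: {erosion_volume(before, after, cell_size_m=0.05):.4f} m^3")
```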

    Coastal Dune Vegetation Mapping Using a Multispectral Sensor Mounted on a UAS

    Vegetation mapping, identifying the type and distribution of plant species, is important for analysing vegetation dynamics, quantifying spatial patterns of vegetation evolution, analysing the effects of environmental changes and predicting spatial patterns of species diversity. Such analysis can contribute to the development of targeted land management actions that maintain biodiversity and ecological functions. This paper presents a methodology for 3D vegetation mapping of a coastal dune complex using a multispectral camera mounted on an unmanned aerial system, with particular reference to the Buckroney dune complex in Co. Wicklow, Ireland. Unmanned aerial systems (UAS), also known as unmanned aerial vehicles (UAV) or drones, have enabled high-resolution and high-accuracy ground-based data to be gathered quickly and easily on-site. The Sequoia multispectral sensor used in this study has green, red, red-edge and near-infrared wavebands, and a regular camera with red, green and blue wavebands (RGB camera), to capture both visible and near-infrared (NIR) imagery of the land surface. The workflow of 3D vegetation mapping of the study site included establishing coordinated ground control points, planning the flight mission and camera parameters, acquiring the imagery, processing the image data and performing feature classification. The data processing outcomes included an orthomosaic model, a 3D surface model and multispectral imagery of the study site, in the Irish Transverse Mercator (ITM) coordinate system. The planimetric resolution of the RGB sensor-based outcomes was 0.024 m, while the multispectral sensor-based outcomes had a planimetric resolution of 0.096 m. High-resolution vegetation mapping was successfully generated from these data processing outcomes. There were 235 sample areas (1 m × 1 m) used for the accuracy assessment of the classification of the vegetation mapping. Feature classification was conducted using nine different classification strategies to examine the efficiency of multispectral sensor data for vegetation and contiguous land cover mapping. The nine classification strategies included combinations of spectral bands and vegetation indices. Results show classification accuracies, based on the nine different classification strategies, ranging from 52% to 75%.
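    A classification strategy in the sense used above is essentially a choice of feature set (spectral bands, with or without derived vegetation indices) fed to a supervised classifier. The sketch below shows the shape of such a pipeline with scikit-learn; the reflectances and class labels are random placeholders, so the printed accuracy is meaningless, and this is not the classifier or software the study used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical per-sample reflectances: green, red, red edge, NIR
n = 300
bands = rng.uniform(0.02, 0.6, size=(n, 4))
ndvi = (bands[:, 3] - bands[:, 1]) / (bands[:, 3] + bands[:, 1])

# One "strategy": spectral bands plus a vegetation index as features
features = np.column_stack([bands, ndvi])
labels = rng.integers(0, 3, size=n)  # placeholder vegetation/land-cover classes

x_tr, x_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(x_tr, y_tr)
print(f"Accuracy: {accuracy_score(y_te, clf.predict(x_te)):.2f}")
```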

    Using uncrewed aerial vehicles for identifying the extent of invasive Phragmites australis in treatment areas enrolled in an adaptive management program

    Higher spatial and temporal resolutions of remote sensing data are likely to be useful for ecological monitoring efforts. There are many different treatment approaches for the introduced European genotype of Phragmites australis, and adaptive management principles are being integrated into at least some long-term monitoring efforts. In this paper, we investigated how natural color and a smaller set of near-infrared (NIR) images collected with low-cost uncrewed aerial vehicles (UAVs) could help quantify the aboveground effects of management efforts at 20 sites enrolled in the Phragmites Adaptive Management Framework (PAMF) spanning the coastal Laurentian Great Lakes region. We used object-based image analysis and field ground-truth data to classify the Phragmites and other cover types present at each of the sites and to calculate the percent cover of Phragmites, including whether it was alive or dead, in the UAV images. The mean overall accuracy for our analysis with natural color data was 91.7% using four standardized classes (Live Phragmites, Dead Phragmites, Other Vegetation, Other Non-vegetation). The Live Phragmites class had a mean user's accuracy of 90.3% and a mean producer's accuracy of 90.1%, and the Dead Phragmites class had a mean user's accuracy of 76.5% and a mean producer's accuracy of 85.2% (not all classes existed at all sites). These results show that UAV-based imaging and object-based classification can be a useful tool to measure the extent of dead and live Phragmites at a series of sites undergoing management, and indicate that UAV sensing is a practical way to identify the extent of Phragmites at management sites.
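    Percent cover per class follows directly from the classified raster: the share of valid pixels assigned to each class. The sketch below illustrates that tally with the four standardized classes named above; the classified patch and the nodata convention are invented, and this is not the study's object-based analysis code.

```python
import numpy as np

# Class codes used in this hypothetical classified raster
CLASSES = {1: "Live Phragmites", 2: "Dead Phragmites",
           3: "Other Vegetation", 4: "Other Non-vegetation"}

def percent_cover(classified, class_code):
    """Percent of valid pixels assigned to one class in a classified raster."""
    valid = classified > 0  # 0 = nodata in this sketch
    return 100.0 * np.count_nonzero(classified[valid] == class_code) / np.count_nonzero(valid)

# Illustrative 4x5 classified patch (made-up values)
classified = np.array([[1, 1, 2, 3, 3],
                       [1, 2, 2, 3, 4],
                       [1, 1, 3, 3, 4],
                       [0, 1, 3, 4, 4]])
for code, name in CLASSES.items():
    print(f"{name}: {percent_cover(classified, code):.1f}%")
```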