
    A multi-temporal phenology based classification approach for Crop Monitoring in Kenya

    The SBAM (Satellite Based Agricultural Monitoring) project, funded by the Italian Space Agency, aims at developing a validated satellite-imagery-based method for estimating and updating the agricultural areas of the Central-Africa region, and at implementing an automated processing chain capable of providing periodic agricultural land cover maps of the area of interest and, possibly, an estimate of crop yield. The project aims to fill the existing gap in the availability of high-spatial-resolution maps of the agricultural areas of Kenya. A high-spatial-resolution land cover map of Central-Eastern Africa, including Kenya, was compiled in 2000 in the framework of the Africover project, using Landsat images acquired mostly in 1995. We investigated the use of phenological information to support crop classification and monitoring from remotely sensed images, based on Landsat 8 and, in the near future, Sentinel-2 imagery. Phenological information on crop condition was collected using time series of NDVI (Normalized Difference Vegetation Index) derived from Landsat 8 images. The Kenyan countryside is mainly characterized by a large number of fragmented small- and medium-size farmlands, which dramatically increases the difficulty of classification; 30 m spatial resolution images are not sufficient for a proper classification of such areas. A pan-sharpening FIHS (Fast Intensity Hue Saturation) technique was therefore implemented to increase image resolution from 30 m to 15 m. Ground test sites were selected by searching for vegetated agricultural areas, from which phenological information was extracted. The classification of agricultural areas is thus based on crop phenology, on vegetation index behaviour retrieved from a time series of satellite images, and on AEZ (Agro Ecological Zones) information made available by FAO (FAO, 1996) for the area of interest.
This paper presents the results of the proposed classification procedure in comparison with land cover maps produced in past years by other projects. The results refer to Nakuru County and were validated using field campaign data. The procedure achieved a satisfactory overall accuracy of 92.66 %, a significant improvement with respect to previous land cover maps.
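The two building blocks named in the abstract, NDVI time series and FIHS pan-sharpening, can be sketched as follows. This is an illustrative numpy sketch, not the project's implementation: the NDVI formula is the standard (NIR − Red)/(NIR + Red), and the FIHS step follows the common fast-IHS fusion formulation (each band shifted by the panchromatic-minus-intensity difference), assuming the multispectral bands have already been resampled to the panchromatic grid.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    # Small epsilon guards against division by zero over water/shadow pixels.
    return (nir - red) / (nir + red + 1e-10)

def fihs_pansharpen(ms_bands, pan):
    """Fast Intensity-Hue-Saturation fusion: every multispectral band is
    shifted by (Pan - I), where I is the mean multispectral intensity.
    ms_bands: array of shape (bands, rows, cols), already upsampled to
    the panchromatic grid; pan: array of shape (rows, cols)."""
    intensity = ms_bands.mean(axis=0)
    return ms_bands + (pan - intensity)  # broadcasts over the band axis

# Toy 2x2 scene: NIR and Red reflectance values.
nir = np.array([[0.5, 0.6], [0.4, 0.3]])
red = np.array([[0.1, 0.2], [0.2, 0.1]])
print(ndvi(nir, red))
```

In a classification pipeline like the one described, `ndvi` would be evaluated on each image of the Landsat 8 time series to build per-pixel phenological profiles, while `fihs_pansharpen` would be applied beforehand to bring the 30 m bands to the 15 m panchromatic resolution.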

    TaLAM: Mapping Land Cover in Lowlands and Uplands with Satellite Imagery

    End-of-Project Report. The Towards Land Cover Accounting and Monitoring (TaLAM) project is part of Ireland’s response to creating a national land cover mapping programme. Its aim is to demonstrate how the new digital map of Ireland, Prime2, from Ordnance Survey Ireland (OSI), can be combined with satellite imagery to produce land cover maps.

    A systematic review of the use of Deep Learning in Satellite Imagery for Agriculture

    Agricultural research is essential for increasing food production to meet the requirements of a growing population in the coming decades. Satellite technology has been improving rapidly, and deep learning has seen much success in generic computer vision tasks and many application areas, which presents an important opportunity to improve the analysis of agricultural land. Here we present a systematic review of 150 studies to find the current uses of deep learning on satellite imagery for agricultural research. Although we identify 5 categories of agricultural monitoring tasks, the majority of the research interest is in crop segmentation and yield prediction. We found that, when used, modern deep learning methods consistently outperformed traditional machine learning across most tasks; the only exception was that Long Short-Term Memory (LSTM) Recurrent Neural Networks did not consistently outperform Random Forests (RF) for yield prediction. The reviewed studies have largely adopted methodologies from generic computer vision, except for one major omission: benchmark datasets are not utilised to evaluate models across studies, making it difficult to compare results. Additionally, some studies have specifically utilised the extra spectral resolution available in satellite imagery, but other divergent properties of satellite images, such as the hugely different scales of spatial patterns, are not being taken advantage of in the reviewed studies.
    Comment: 25 pages, 2 figures and lots of large tables. Supplementary materials section included in the main pdf.

    Effect of the Red-Edge Band from Drone Altum Multispectral Camera in Mapping the Canopy Cover of Winter Wheat, Chickweed, and Hairy Buttercup

    The detection and mapping of winter wheat and the canopy cover of associated weeds, such as chickweed and hairy buttercup, are essential for crop and weed management. With emerging drone technologies, a multispectral camera with a red-edge band, such as the Altum, is commonly used for crop and weed mapping. However, little is understood about the contribution of the red-edge band to mapping. The aim of this study was to examine whether adding the red-edge band from a drone-mounted Altum multispectral camera improves the detection and mapping of the canopy cover of winter wheat, chickweed, and hairy buttercup. The canopy cover of winter wheat, chickweed, and hairy buttercup was classified and mapped both with and without the red-edge band using a random forest classification algorithm. Results showed that the addition of the red-edge band increased the overall mapping accuracy by about 7%. Furthermore, the red-edge wavelength was found to better detect winter wheat relative to chickweed and hairy buttercup. This study demonstrated the usefulness of the red-edge band in improving the detection and mapping of winter wheat and associated weeds (chickweed and hairy buttercup) in agricultural fields.
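The with/without comparison described above can be sketched with scikit-learn. The sketch below uses synthetic per-pixel "reflectance" samples, not the study's drone data: the class-dependent shifts, sample sizes, and seeds are all illustrative assumptions, constructed so that the extra red-edge feature carries most of the class signal.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 600
# 0: winter wheat, 1: chickweed, 2: hairy buttercup (labels are illustrative)
y = rng.integers(0, 3, n)

# Four conventional bands: weakly class-dependent, noisy.
conventional = rng.normal(0.0, 1.0, (n, 4)) + 0.3 * y[:, None]
# Red-edge band: strongly class-dependent (the hypothesis under test).
red_edge = rng.normal(0.0, 1.0, n) + 1.5 * y

X_without = conventional
X_with = np.column_stack([conventional, red_edge])

def rf_accuracy(X, y):
    """Train a random forest on a 70/30 split and return test accuracy."""
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
    return accuracy_score(yte, clf.predict(Xte))

acc_without = rf_accuracy(X_without, y)
acc_with = rf_accuracy(X_with, y)
print(f"without red-edge: {acc_without:.2f}, with red-edge: {acc_with:.2f}")
```

Running the same classifier on the two feature sets, with everything else held fixed, isolates the contribution of the added band, which mirrors the inclusive/exclusive design of the study.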

    OBSUM: An object-based spatial unmixing model for spatiotemporal fusion of remote sensing images

    Spatiotemporal fusion aims to improve both the spatial and the temporal resolution of remote sensing images, thus facilitating time-series analysis at a fine spatial scale. However, several important issues limit the application of current spatiotemporal fusion methods. First, most spatiotemporal fusion methods are based on pixel-level computation, which neglects the valuable object-level information of the land surface. Moreover, many existing methods cannot accurately retrieve strong temporal changes between the available high-resolution image at the base date and the predicted one. This study proposes an Object-Based Spatial Unmixing Model (OBSUM), which incorporates object-based image analysis and spatial unmixing to overcome the two abovementioned problems. OBSUM consists of one preprocessing step and three fusion steps, i.e., object-level unmixing, object-level residual compensation, and pixel-level residual compensation. OBSUM can be applied using only one fine image at the base date and one coarse image at the prediction date, without the need for a coarse image at the base date. The performance of OBSUM was compared with five representative spatiotemporal fusion methods. The experimental results demonstrated that OBSUM outperformed the other methods in terms of both accuracy indices and visual effects over time series. Furthermore, OBSUM also achieved satisfactory results in two typical remote sensing applications. Therefore, it has great potential to generate accurate and high-resolution time-series observations for supporting various remote sensing applications.
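The spatial-unmixing core that this family of methods builds on can be illustrated in a few lines: each coarse pixel's value is modelled as the area-weighted mixture of the (unknown) per-class fine-scale values, and the class values are recovered by least squares. This is a generic, noiseless toy sketch of spatial unmixing, not OBSUM itself; OBSUM additionally works at the object level and adds the residual-compensation steps listed above.

```python
import numpy as np

def spatial_unmix(coarse_vals, fractions):
    """Solve coarse_vals ≈ fractions @ class_vals by least squares.
    fractions[i, c] is the area fraction of class c inside coarse pixel i
    (rows sum to 1); class_vals[c] is the recovered fine-scale value
    of class c, which is then assigned to fine pixels of that class."""
    class_vals, *_ = np.linalg.lstsq(fractions, coarse_vals, rcond=None)
    return class_vals

# Toy example: 4 coarse pixels mixing 2 classes whose true fine-scale
# reflectances are 0.2 and 0.6.
fractions = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [0.5, 0.5],
                      [0.25, 0.75]])
true_vals = np.array([0.2, 0.6])
coarse = fractions @ true_vals  # simulated coarse observations
est = spatial_unmix(coarse, fractions)
print(est)  # ≈ [0.2, 0.6]
```

In the full method, the class-fraction matrix would come from classifying (or segmenting) the fine image at the base date, and the residual between the unmixed prediction and the coarse observation would then be distributed back at the object and pixel levels.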