    Supervised / unsupervised change detection

    The aim of this deliverable is to provide an overview of the state of the art in change detection techniques and a critique of which of them could be implemented to derive SENSUM products. It is the product of a collaboration between UCAM and EUCENTRE. The document also includes, as a necessary prerequisite, a discussion of a proposed co-registration technique. Since change detection requires assessing a series of images, comparing and contrasting their similarities and differences to spot changes, co-registration is the first step: it ensures that the user is comparing like for like. The developed programs would then be applied to remotely sensed images for vulnerability assessment and for post-disaster recovery assessment and monitoring. One key criterion is to develop semi-automated and automated techniques. A series of available techniques is presented, along with the advantages and disadvantages of each method. Descriptions of the implemented methods are included in deliverable D2.7 "Software Package SW2.3". In reviewing the available change detection techniques, the focus was on exploiting medium-resolution imagery such as Landsat, given its free-to-use license and the rich historical coverage provided by this satellite series. Change detection with high-resolution images was also examined, and a recovery-specific change detection index is discussed in the report.
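    As a rough illustration of the two-step workflow this abstract describes (co-registration followed by comparison of images), the minimal Python sketch below estimates a translational offset between two acquisitions, aligns them, and flags changed pixels by simple differencing. The file names, the use of phase correlation for co-registration, and the threshold rule are illustrative assumptions, not the SENSUM implementation.

```python
import numpy as np
from scipy import ndimage
from skimage import io
from skimage.registration import phase_cross_correlation

# Hypothetical single-band acquisitions from two dates (placeholder file names).
before = io.imread("landsat_t1_band4.tif").astype(float)
after = io.imread("landsat_t2_band4.tif").astype(float)

# Step 1: co-registration -- estimate the shift between the two images so that
# corresponding pixels line up ("comparing like for like").
shift, error, _ = phase_cross_correlation(before, after)
after_aligned = ndimage.shift(after, shift)

# Step 2: change detection by simple image differencing.
diff = np.abs(after_aligned - before)

# Step 3: flag pixels whose difference exceeds a threshold (mean + 2*std here,
# an arbitrary choice; an operational product would use a calibrated rule).
change_mask = diff > diff.mean() + 2 * diff.std()
print(f"{change_mask.mean():.1%} of pixels flagged as changed")
```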

    A MODIS/ASTER airborne simulator (MASTER) imagery for urban heat island research

    Thermal imagery is widely used to quantify land surface temperature and to monitor the spatial extent and thermal intensity of the urban heat island (UHI) effect. Previous research has applied Landsat images, Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) images, Moderate Resolution Imaging Spectroradiometer (MODIS) images, and other coarse- to medium-resolution remotely sensed imagery to estimate surface temperature. These temperature data are frequently correlated with vegetation and impervious surfaces to quantify the drivers of the UHI effect. Because of the coarse to medium resolution of the thermal imagery, researchers are unable to correlate these temperature data with the more generally available high-resolution land cover classifications, which are derived from high-resolution multispectral imagery. The development of advanced thermal sensors providing very high-resolution thermal imagery, such as the MODIS/ASTER airborne simulator (MASTER), has enabled investigators to quantify the relationship between detailed land cover and land surface temperature. While this is an obvious next step, in the published literature MASTER data are most often used to discriminate burned areas, assess fire severity, and classify urban land cover; considerably less attention has been given to using MASTER data in UHI research. We demonstrate here that MASTER data, in combination with high-resolution multispectral data, make it possible to monitor and model the relationship between temperature and detailed land cover such as building rooftops, residential street pavements, and parcel-based landscaping. Here, we report on data sources for conducting this type of UHI research and aim to encourage researchers and scientists to use high-resolution airborne thermal imagery to further explore the UHI effect.
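    To make the kind of analysis this abstract describes concrete, the sketch below computes zonal temperature statistics per detailed land cover class and a simple correlation between land surface temperature and impervious cover. The file names, class codes, and the impervious-surface grouping are hypothetical placeholders; the sketch only assumes a land surface temperature raster and a co-registered high-resolution classification of the same scene.

```python
import numpy as np
from skimage import io

# Hypothetical co-registered rasters: a land surface temperature image (e.g.
# derived from MASTER thermal bands) and a high-resolution land cover map.
lst = io.imread("master_lst_kelvin.tif").astype(float)
landcover = io.imread("landcover_classes.tif")

# Placeholder class codes for the detailed covers mentioned in the abstract.
classes = {1: "building rooftop", 2: "residential street pavement", 3: "parcel landscaping"}

# Zonal statistics: mean temperature within each detailed land cover class.
for code, name in classes.items():
    mask = landcover == code
    if mask.any():
        print(f"{name}: mean LST = {lst[mask].mean():.1f} K ({mask.sum()} pixels)")

# Simple driver analysis: correlate LST with an impervious-surface indicator.
impervious = np.isin(landcover, [1, 2]).astype(float)
r = np.corrcoef(lst.ravel(), impervious.ravel())[0, 1]
print(f"Correlation between LST and impervious cover: r = {r:.2f}")
```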

    Texture Extraction Techniques for the Classification of Vegetation Species in Hyperspectral Imagery: Bag of Words Approach Based on Superpixels

    Texture information allows characterizing the regions of interest in a scene. It refers to the spatial organization of the fundamental microstructures in natural images. Texture extraction has been a challenging problem in the field of image processing for decades. In this paper, different techniques based on the classic Bag of Words (BoW) approach are proposed for solving the texture extraction problem for hyperspectral images of the Earth's surface. In all cases the texture extraction is performed inside regions of the scene called superpixels, and the algorithms exploit the information available in all the bands of the image. The main contribution is the use of superpixel segmentation to obtain irregular patches from the images prior to texture extraction. Texture descriptors are extracted from each superpixel. Three schemes for texture extraction are proposed: codebook-based, descriptor-based, and spectral-enhanced descriptor-based. The first is based on a codebook generator algorithm, while the other two include additional stages of keypoint detection and description. The evaluation is performed by analyzing the results of a supervised classification using Support Vector Machines (SVM), Random Forest (RF), and Extreme Learning Machines (ELM) after the texture extraction. The results show that extracting textures inside superpixels increases the accuracy of the obtained classification map. The proposed techniques are analyzed over different multi- and hyperspectral datasets, focusing on vegetation species identification. The best classification results for each image in terms of Overall Accuracy (OA) range from 81.07% to 93.77% for images taken over a river area in Galicia (Spain), and from 79.63% to 95.79% for a vast rural region in China, with reasonable computation times. This work was supported in part by the Civil Program UAVs Initiative, promoted by the Xunta de Galicia and developed in partnership with the Babcock company to promote the use of unmanned technologies in civil services. We also acknowledge the support of the Ministerio de Ciencia e Innovación, Government of Spain (grant number PID2019-104834GB-I00), and the Consellería de Educación, Universidade e Formación Profesional (ED431C 2018/19 and accreditation 2019-2022 ED431G-2019/04). All are co-funded by the European Regional Development Fund (ERDF).
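    As a rough sketch of the codebook-based scheme this abstract outlines (not the authors' implementation), the Python example below segments a hyperspectral cube into superpixels, builds a spectral codebook by clustering per-pixel spectra, forms a bag-of-words histogram per superpixel, and trains an SVM on those descriptors. The file names, codebook size, number of superpixels, and availability of per-superpixel labels are assumptions for illustration.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import SVC

# Hypothetical hyperspectral cube (rows x cols x bands) and per-superpixel labels.
cube = np.load("hyperspectral_cube.npy").astype(float)   # shape (H, W, B)
rows, cols, bands = cube.shape

# 1) Superpixel segmentation using all bands of the cube.
segments = slic(cube, n_segments=500, compactness=0.1, channel_axis=-1)

# 2) Codebook generation: cluster per-pixel spectra into "visual words".
pixels = cube.reshape(-1, bands)
codebook = MiniBatchKMeans(n_clusters=64, random_state=0).fit(pixels)
words = codebook.predict(pixels).reshape(rows, cols)

# 3) BoW descriptor per superpixel: normalized histogram of word occurrences.
descriptors = []
for seg_id in np.unique(segments):
    hist = np.bincount(words[segments == seg_id], minlength=64).astype(float)
    descriptors.append(hist / hist.sum())
descriptors = np.array(descriptors)

# 4) Supervised classification of superpixels (labels assumed to be available,
# one class per superpixel, e.g. derived from a ground-truth map).
labels = np.load("superpixel_labels.npy")
clf = SVC(kernel="rbf").fit(descriptors, labels)
print("Training accuracy:", clf.score(descriptors, labels))
```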