
    Evaluation of optical and synthetic aperture radar image fusion methods: a case study applied to Sentinel imagery

    This paper evaluates different optical and synthetic aperture radar (SAR) image fusion methods applied to open-access Sentinel images with global coverage. The objective of this research was to evaluate the potential of image fusion methods to achieve greater visual differentiation of land cover, especially between oil palm crops and natural forest areas that are difficult to distinguish visually. Four image fusion methods, Brovey (BR), high-frequency modulation (HFM), Gram-Schmidt (GS), and principal components (PC), were evaluated on Sentinel-2 optical and Sentinel-1 SAR images in a cloud computing environment. The results show that the implemented optical/SAR image fusion methods allow the creation of a synthetic image with the characteristics of both data sources. The multispectral information provided by the optical image, combined with the information on the geometry and texture/roughness of the land covers provided by the SAR image, allows greater differentiation in the visualization of the various land covers and a better understanding of the study area. The fusion methods that visually presented the strongest SAR characteristics were the BR and GS methods. The HFM method reached the best statistical indicators; however, it did not show significant visual changes attributable to the SAR contribution.
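
    As a concrete illustration of the simplest of these methods, the following is a minimal sketch of Brovey (BR) fusion in which a co-registered Sentinel-1 backscatter band takes the role of the panchromatic band; the array names, image size, and epsilon guard are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of Brovey (BR) optical/SAR fusion, assuming co-registered
# Sentinel-2 reflectance bands and a Sentinel-1 backscatter band loaded as
# numpy arrays. Names and the epsilon guard are illustrative assumptions.
import numpy as np

def brovey_fusion(red, green, blue, sar_intensity, eps=1e-6):
    """Scale each optical band by the ratio of SAR intensity to the band sum."""
    band_sum = red + green + blue + eps          # avoid division by zero
    ratio = sar_intensity / band_sum             # SAR plays the "panchromatic" role
    return red * ratio, green * ratio, blue * ratio

# Example with random data standing in for co-registered reflectance/backscatter
rng = np.random.default_rng(0)
r, g, b = (rng.random((256, 256)) for _ in range(3))
sar = rng.random((256, 256))
fused_r, fused_g, fused_b = brovey_fusion(r, g, b, sar)
```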

    Fusion of Multisource Images for Update of Urban GIS


    The role of multispectral image transformations in change detection

    In recent decades, remote sensing techniques have been applied as a powerful tool to capture the temporal variation of Earth-related phenomena. To understand the impact of climate change and human activities on Earth's water resources, monitoring the variation of water storage over a long period is a primary issue. This variation is also fundamental for estimating changes in hydroelectric power generation and freshwater recreation. Among spaceborne sensors, optical and SAR satellite imagery provide the opportunity to monitor spatial change in the coastline, which can serve as a way to determine the water extent repeatedly at an appropriate time interval. Because water absorbs nearly all sunlight at near-infrared wavelengths, water bodies appear very dark in this band in an optical image, so applying a threshold to the image histogram is a common way to build a water mask. Despite its straightforward procedure, a precise delineation of water bodies may not be possible in some regions or seasons because of the complicated relationship between water and land and the effect of vegetation. Besides thresholding, other change detection methods are widely used to monitor the extent of water bodies. Multispectral transformations such as PCA and CCA are able to highlight the important change information in all spectral bands and to reduce the dimension of the data; their potential to improve the quality of satellite images and to reduce the noise level must therefore be assessed. This thesis has two general objectives. The first is improving the quality of the multispectral image by applying PCA to the spectral bands and then reconstructing the image from only a certain number of PCs; the number of PCs and the strategy for selecting appropriate PCs are the main challenges of this procedure, as sketched below. The second is highlighting the change between two multispectral images by applying transformations such as PCA and CCA. Interpreting the transformed images is not straightforward, and in most cases comparison with the original images can be a solution. We examine different scenarios to assess the performance of the spectral transformations in change detection.
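
    The first objective, reconstructing a multispectral image from a limited number of principal components, can be sketched as follows; the band count, image size, and choice of k are illustrative assumptions rather than values from the thesis.

```python
# Minimal sketch of reconstructing a multispectral image from its first k
# principal components. Band count, image size, and k are assumptions.
import numpy as np
from sklearn.decomposition import PCA

def pca_reconstruct(image, k):
    """image: (rows, cols, bands) array; returns the reconstruction from k PCs."""
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, bands)            # one row per pixel
    pca = PCA(n_components=k)
    scores = pca.fit_transform(pixels)           # project onto the first k components
    restored = pca.inverse_transform(scores)     # map back to the original band space
    return restored.reshape(rows, cols, bands)

# Example: keep 3 components of a 6-band image
img = np.random.rand(100, 100, 6)
denoised = pca_reconstruct(img, k=3)
```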

    Multisource and Multitemporal Data Fusion in Remote Sensing

    The sharp and recent increase in the availability of data captured by different sensors, combined with their considerably heterogeneous natures, poses a serious challenge for the effective and efficient processing of remotely sensed data. Such an increase in remote sensing and ancillary datasets, however, opens up the possibility of utilizing multimodal datasets in a joint manner to further improve the performance of the processing approaches with respect to the application at hand. Multisource data fusion has therefore received enormous attention from researchers worldwide for a wide variety of applications. Moreover, thanks to the revisit capability of several spaceborne sensors, the temporal information can be integrated with the spatial and/or spectral/backscattering information of the remotely sensed data, moving from a representation of 2D/3D data to 4D data structures, where the time variable adds new information as well as challenges for the information extraction algorithms. A huge number of research works are dedicated to multisource and multitemporal data fusion, but the methods for fusing different modalities have expanded along different paths within each research community. This paper brings together the advances of multisource and multitemporal data fusion approaches across these research communities and provides a thorough, discipline-specific starting point for researchers at different levels (i.e., students, researchers, and senior researchers) willing to conduct novel investigations on this challenging topic by supplying sufficient detail and references.

    Deep Learning based data-fusion methods for remote sensing applications

    In recent years, an increasing number of remote sensing sensors have been launched into orbit around the Earth, producing a continuously growing volume of data that is useful for a large number of monitoring applications. Although modern optical sensors provide rich spectral information about the Earth's surface at very high resolution, they are weather-sensitive. SAR images, on the other hand, are available even in the presence of clouds, are almost weather-insensitive and usable day and night, but they do not provide rich spectral information and are severely affected by speckle "noise", which makes information extraction difficult. For these reasons it is worthwhile, and challenging, to fuse data provided by different sources and/or acquired at different times, in order to leverage their diversity and complementarity to retrieve the target information. Motivated by the success of deep learning methods in many image processing tasks, this thesis addresses several typical remote sensing data-fusion problems by means of suitably designed convolutional neural networks.
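
    A minimal sketch of the kind of two-branch convolutional network that fuses optical and SAR patches by feature concatenation is given below; the layer sizes, patch dimensions, and class count are illustrative assumptions and not the architectures designed in the thesis.

```python
# Minimal sketch of a two-branch CNN fusing optical and SAR patches by late
# feature concatenation. All sizes and the class count are assumptions.
import torch
import torch.nn as nn

class TwoBranchFusionNet(nn.Module):
    def __init__(self, optical_bands=4, sar_bands=2, n_classes=5):
        super().__init__()
        def branch(in_ch):
            return nn.Sequential(
                nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
        self.optical = branch(optical_bands)
        self.sar = branch(sar_bands)
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, optical_patch, sar_patch):
        f_opt = self.optical(optical_patch).flatten(1)   # (N, 32) optical features
        f_sar = self.sar(sar_patch).flatten(1)           # (N, 32) SAR features
        fused = torch.cat([f_opt, f_sar], dim=1)         # late feature fusion
        return self.classifier(fused)

# Example forward pass on random patches
net = TwoBranchFusionNet()
logits = net(torch.rand(8, 4, 32, 32), torch.rand(8, 2, 32, 32))
```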

    Novel pattern recognition methods for classification and detection in remote sensing and power generation applications


    Mapping And Monitoring Wetland Environment By Analysis Different Satellite Images And Field Spectroscopy

    Thesis (PhD) -- İstanbul Technical University, Institute of Science and Technology, 2010. In this study, satellite images with different spectral and spatial resolutions and in-situ spectroradiometer measurements were used to analyze hydrophytic vegetation and the surrounding land cover of the Terkos Basin wetland, with the aim of supporting its sustainable conservation and management. By validating the performance of the processed images against field-collected reflectance values, the feasibility of building a remote sensing based reference guide for the sustainable preservation and management of natural lands was investigated. According to the results, land cover changes in this heterogeneous study area were detected most accurately with a PCA-based change detection method. The performance of spaceborne Hyperion EO-1 hyperspectral data was then analyzed, and wetland vegetation could be accurately discriminated from the other vegetation types. At the last stage of the study, field-collected reflectance values of different wetland flora types were compared with a statistical ANOVA and the reflectance differences between vegetation types were quantified; discriminating individual wetland plant species from one another proved possible only with field spectroscopy.
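
    The final comparison step, a statistical ANOVA on field-collected reflectance values, might look like the following sketch; the vegetation type names and reflectance samples are hypothetical and only stand in for the field spectroscopy data.

```python
# Minimal sketch of a one-way ANOVA comparing field-collected reflectance
# across wetland vegetation types. Type names and samples are hypothetical.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
reed = rng.normal(0.42, 0.03, 30)        # reflectance samples, hypothetical type 1
water_lily = rng.normal(0.38, 0.03, 30)  # hypothetical type 2
sedge = rng.normal(0.45, 0.03, 30)       # hypothetical type 3

f_stat, p_value = f_oneway(reed, water_lily, sedge)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p => reflectance differs by type
```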

    A hierarchical clustering method for land cover change detection and identification

    A method to detect abrupt land cover changes using hierarchical clustering of multi-temporal satellite imagery was developed. The Autochange method outputs the pre-change land cover class, the change magnitude, and the change type. Pre-change land cover information is transferred to the post-change imagery based on classes derived by unsupervised clustering, which makes it possible to use data from different instruments for the pre- and post-change dates. The change magnitude and change types are computed by unsupervised clustering of the post-change image within each cluster, and by comparing the mean intensity values of the lower-level clusters with their parent cluster means. A computational approach to determine the change magnitude threshold for abrupt change was developed. The method was demonstrated with three summer image pairs, Sentinel-2/Sentinel-2, Landsat 8/Sentinel-2, and Sentinel-2/ALOS-2 PALSAR, in a study area of 12,372 km² in southern Finland for the detection of forest clear cuts, and was tested with independent data. The Sentinel-2 classification produced an omission error of 5.6% for the cut class and 0.4% for the uncut class; commission errors were 4.9% for the cut class and 0.4% for the uncut class. For the Landsat 8/Sentinel-2 classification the equivalent figures were 20.8%, 0.2%, 3.4%, and 1.6%, and for the Sentinel-2/ALOS-2 PALSAR classification 16.7%, 1.4%, 17.8%, and 1.3%, respectively. The Autochange algorithm and its software implementation were considered applicable for mapping abrupt land cover changes using multi-temporal satellite data; the approach even allowed mixing images from optical and synthetic aperture radar (SAR) sensors in the same change analysis.
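
    One possible reading of this two-level clustering idea, clustering the pre-change image into parent classes, re-clustering the post-change pixels within each parent, and flagging child clusters whose means deviate strongly from the parent mean, is sketched below; the cluster counts, the threshold, and the use of the post-change image for the parent mean are assumptions, not the published Autochange parameters.

```python
# Minimal sketch of an Autochange-like two-level clustering change detector.
# Cluster counts, threshold, and the parent-mean convention are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def autochange_like(pre, post, n_parent=8, n_child=4, threshold=0.25):
    """pre, post: (n_pixels, bands) arrays from co-registered image dates."""
    parent_labels = KMeans(n_clusters=n_parent, n_init=10).fit_predict(pre)
    change_mask = np.zeros(len(pre), dtype=bool)
    for p in range(n_parent):
        idx = np.where(parent_labels == p)[0]
        if len(idx) < n_child:
            continue
        parent_mean = post[idx].mean(axis=0)     # parent mean taken in the post image (assumption)
        child_labels = KMeans(n_clusters=n_child, n_init=10).fit_predict(post[idx])
        for c in range(n_child):
            members = idx[child_labels == c]
            magnitude = np.linalg.norm(post[members].mean(axis=0) - parent_mean)
            change_mask[members] = magnitude > threshold   # flag abrupt change
    return change_mask

# Example on random pixel spectra standing in for two image dates
pre = np.random.rand(5000, 4)
post = np.random.rand(5000, 4)
mask = autochange_like(pre, post)
```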