4,105 research outputs found

    Recent Advances in Image Restoration with Applications to Real World Problems

    In the past few decades, imaging hardware has improved tremendously in terms of resolution, enabling the widespread use of images in many diverse applications in Earth observation and planetary missions. However, practical issues associated with image acquisition still affect image quality. Issues such as blurring, measurement noise, mosaicing artifacts, and low spatial or spectral resolution can seriously degrade the accuracy of these applications. This book provides the reader with a glimpse of the latest developments and recent advances in image restoration, including image super-resolution, image fusion to enhance spatial, spectral, and temporal resolutions, and the generation of synthetic images using deep learning techniques. Some practical applications are also included.

    Spatially Enhanced Spectral Unmixing Through Data Fusion of Spectral and Visible Images from Different Sensors

    Publisher's version (útgefin grein). We propose an unmixing framework for enhancing endmember fraction maps using a combination of spectral and visible images. The new method, data fusion through spatial information-aided learning (DFuSIAL), is based on a learning process for the fusion of a multispectral image of low spatial resolution and a visible RGB image of high spatial resolution. Unlike commonly used methods, DFuSIAL allows for fusing data from different sensors. To achieve this objective, we apply a learning process using automatically extracted invariant points, which are assumed to have the same land-cover type in both images. First, we estimate the fraction maps of a set of endmembers for the spectral image. Then, we train a spatial-features-aided neural network (SFFAN) to learn the relationship between the fractions, the visible bands, and rotation-invariant spatial features for learning (RISFLs) that we extract from the RGB image. Our experiments show that the proposed DFuSIAL method obtains fraction maps with significantly enhanced spatial resolution and an average mean absolute error between 2% and 4% compared to the reference ground truth. Furthermore, the proposed method is shown to be preferable to other examined state-of-the-art methods, especially when data are obtained from different instruments and in cases with missing-data pixels. This research was partially funded by the Icelandic Research Fund through the EMMIRS project, and by the Israel Science Ministry and Space Agency through the Venus project. Peer Reviewed
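The abstract describes a learning step in which endmember fractions known at automatically extracted invariant points are regressed from RGB bands plus spatial features, and the learned mapping is then applied per high-resolution pixel. The following is a toy sketch of that idea only: a linear least-squares regressor stands in for the SFFAN, and all array names and sizes are made up for illustration.

```python
import numpy as np

# Toy DFuSIAL-style learning step (not the authors' code): fit a mapping from
# high-resolution features to endmember fractions at invariant points, then
# apply it to every high-resolution pixel. Linear least squares stands in for
# the spatial-features-aided neural network (SFFAN).
rng = np.random.default_rng(0)

n_points, n_features, n_endmembers = 200, 5, 3   # e.g., RGB + 2 spatial features
X = rng.random((n_points, n_features))           # features at invariant points
W_true = rng.random((n_features, n_endmembers))
F = X @ W_true                                   # "fractions" at those points

# Learning step: fit the feature -> fraction mapping.
W_fit, *_ = np.linalg.lstsq(X, F, rcond=None)

def predict_fractions(features):
    """Predict endmember fractions for high-resolution pixels."""
    raw = features @ W_fit
    raw = np.clip(raw, 0.0, None)                 # fractions are non-negative
    return raw / raw.sum(axis=-1, keepdims=True)  # and sum to one per pixel

hi_res_pixels = rng.random((1000, n_features))
fractions = predict_fractions(hi_res_pixels)      # enhanced fraction map, flattened
```

In the actual method a neural network replaces the linear model, but the training data layout (features at invariant points paired with low-resolution fractions) is the same.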

    Deep learning in remote sensing: a review

    Standing at the paradigm shift towards data-intensive science, machine learning techniques are becoming increasingly important. In particular, deep learning, as a major breakthrough in the field, has proven to be an extremely powerful tool in many areas. Shall we embrace deep learning as the key to everything? Or should we resist a "black-box" solution? There are controversial opinions in the remote sensing community. In this article, we analyze the challenges of using deep learning for remote sensing data analysis, review the recent advances, and provide resources that make deep learning in remote sensing ridiculously simple to start with. More importantly, we advocate that remote sensing scientists bring their expertise into deep learning and use it as an implicit general model to tackle unprecedented, large-scale, influential challenges such as climate change and urbanization. Comment: Accepted for publication in the IEEE Geoscience and Remote Sensing Magazine.

    Multisource and Multitemporal Data Fusion in Remote Sensing

    The sharp and recent increase in the availability of data captured by different sensors, combined with their considerably heterogeneous natures, poses a serious challenge for the effective and efficient processing of remotely sensed data. Such an increase in remote sensing and ancillary datasets, however, opens up the possibility of utilizing multimodal datasets jointly to further improve the performance of processing approaches with respect to the application at hand. Multisource data fusion has therefore received enormous attention from researchers worldwide for a wide variety of applications. Moreover, thanks to the revisit capability of several spaceborne sensors, the integration of temporal information with the spatial and/or spectral/backscattering information of the remotely sensed data is possible and helps to move from a representation of 2D/3D data to 4D data structures, where the time variable adds new information as well as new challenges for information extraction algorithms. There is a huge number of research works dedicated to multisource and multitemporal data fusion, but the methods for fusing different modalities have expanded along different paths within each research community. This paper brings together the advances of multisource and multitemporal data fusion approaches across research communities and provides a thorough, discipline-specific starting point for researchers at different levels (i.e., students, researchers, and senior researchers) willing to conduct novel investigations on this challenging topic, supplying sufficient detail and references.
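The move from 2D/3D data to 4D structures described above amounts to stacking revisits of a multiband scene along a leading time axis. A minimal numpy sketch (all shapes are hypothetical, chosen only for illustration):

```python
import numpy as np

# Each acquisition is a (bands, rows, cols) cube; stacking revisits along a
# time axis yields a (time, band, row, col) 4D structure that fusion
# algorithms can index jointly in space, spectrum, and time.
rng = np.random.default_rng(1)
n_times, n_bands, n_rows, n_cols = 6, 4, 32, 32

scenes = [rng.random((n_bands, n_rows, n_cols)) for _ in range(n_times)]
datacube = np.stack(scenes, axis=0)      # shape: (time, band, row, col)

# Example joint query: per-pixel temporal mean of the first band.
band0_temporal_mean = datacube[:, 0].mean(axis=0)   # shape: (row, col)
```

The same layout accommodates multisource stacks as well, provided the sources are co-registered to a common grid before stacking.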

    A Deep Learning Framework in Selected Remote Sensing Applications

    The main research topic of this dissertation is the design and implementation of a deep learning framework for remote sensing. Remote sensing techniques and applications play a crucial role in observing the Earth's evolution, especially nowadays, when the effects of climate change on our lives are more and more evident. A considerable amount of data is acquired daily all over the Earth, and effective exploitation of this information requires the robustness, speed, and accuracy of deep learning. This emerging need inspired the choice of this topic. The conducted studies mainly focus on two European Space Agency (ESA) missions: Sentinel-1 and Sentinel-2. Images provided by the ESA Sentinel-2 mission are rapidly becoming the main source of information for the entire remote sensing community, thanks to their unprecedented combination of spatial, spectral, and temporal resolution, as well as their open-access policy. The increasing interest gained by these satellites in research laboratories and applicative scenarios pushed us to use them in the considered framework. The combined use of Sentinel-1 and Sentinel-2 is crucial and very prominent in different kinds of monitoring, especially when the growing (or changing) dynamics are very rapid. Starting from this general framework, two specific research activities were identified and investigated, leading to the results presented in this dissertation; both can be placed in the context of data fusion. The first activity deals with a super-resolution framework to improve Sentinel-2 bands supplied at 20 m up to 10 m. Increasing the spatial resolution of these bands is of great interest in many remote sensing applications, particularly in monitoring vegetation, rivers, and forests. The second activity applies the deep learning framework to multispectral Normalized Difference Vegetation Index (NDVI) extraction and to semantic segmentation obtained by fusing Sentinel-1 and Sentinel-2 data.
Sentinel-1 SAR data is of great importance for the quantity of information it adds in the context of monitoring wetlands, rivers, forests, and many other settings. In both cases, the problem was addressed with deep learning techniques, and in both cases, very lean architectures were used, demonstrating that even without large computing power it is possible to obtain high-level results. The core of this framework is a Convolutional Neural Network (CNN). CNNs have been successfully applied to many image processing problems, such as super-resolution, pansharpening, and classification, because of several advantages: (i) the capability to approximate complex non-linear functions, (ii) the ease of training, which avoids time-consuming handcrafted filter design, and (iii) the parallel computational architecture. Even though a large amount of labelled data is required for training, the performance of CNNs motivated this architectural choice. In our Sentinel-1 and Sentinel-2 integration task, we faced and overcame the problem of manually labelled data with an approach based on integrating these two different sensors. Therefore, apart from the investigation of Sentinel-1 and Sentinel-2 integration, the main contribution of both works is the design of a CNN-based solution distinguished by its computational lightness, with a consequent substantial saving of time compared to more complex state-of-the-art deep learning solutions.
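The 20 m to 10 m super-resolution task described above is usually benchmarked against a plain interpolation baseline, which a lightweight CNN is then expected to beat. The sketch below is only that baseline (nearest-neighbour 2x upsampling on a synthetic band), not the dissertation's network:

```python
import numpy as np

# Baseline for 20 m -> 10 m Sentinel-2 band super-resolution: 2x nearest-
# neighbour upsampling, where each 20 m pixel becomes a 2x2 block of 10 m
# pixels. A CNN-based method refines this with learned spatial detail.
def upsample_2x(band_20m):
    """Nearest-neighbour 2x upsampling of a 2D band."""
    return np.repeat(np.repeat(band_20m, 2, axis=0), 2, axis=1)

band_20m = np.arange(9, dtype=float).reshape(3, 3)   # synthetic 3x3 band
band_10m = upsample_2x(band_20m)                     # 6x6 on the 10 m grid
```

Note that nearest-neighbour upsampling preserves the scene mean exactly, a property often checked when validating learned super-resolution outputs as well.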

    Monitoring vegetation changes through spatiotemporal resolution enhancement

    Ph.D. dissertation -- Graduate School of Environmental Studies, Seoul National University, Interdisciplinary Program in Landscape Architecture, February 2023. Advisor: Youngryel Ryu. Monitoring changes in terrestrial vegetation is essential to understanding interactions between the atmosphere and the biosphere in terrestrial ecosystems. To this end, satellite remote sensing offers maps for examining the land surface at different scales. However, detailed information on surface changes has been hindered by clouds or limited by the spatial resolution of satellite imagery. Moreover, the impacts of spatial and temporal resolution on photosynthesis monitoring have not been fully revealed. In this dissertation, I aimed to enhance the spatial and temporal resolution of satellite imagery towards daily gap-free vegetation maps with high spatial resolution.
In order to expand vegetation change monitoring in time and space using high-resolution satellite images, I 1) improved the temporal resolution of satellite datasets through image fusion using geostationary satellites, 2) improved the spatial resolution of satellite datasets using generative adversarial networks, and 3) demonstrated the use of high spatiotemporal resolution maps for monitoring plant photosynthesis, especially over heterogeneous landscapes. With the advent of new techniques in satellite remote sensing, current and past datasets can be fully utilized for monitoring vegetation changes with respect to spatial and temporal resolution. In Chapter 2, I developed an integrated system that implements geostationary satellite products in a spatiotemporal image fusion method for monitoring canopy photosynthesis. The integrated system comprises a series of processing steps (cloud masking, nadir bidirectional reflectance distribution function adjustment, spatial registration, spatiotemporal image fusion, spatial gap-filling, and temporal gap-filling). I evaluated the integrated system over a heterogeneous rice paddy landscape, where drastic land cover changes were caused by cultivation management, and a deciduous forest, where consecutive changes occurred in time. The results showed that the integrated system predicted in situ measurements well and without data gaps (R2 = 0.71, relative bias = 5.64% at the rice paddy site; R2 = 0.79, relative bias = -13.8% at the deciduous forest site). The integrated system gradually improved the spatiotemporal resolution of vegetation maps, reducing the underestimation of in situ measurements, especially during the peak growing season. Since the integrated system generates daily canopy photosynthesis maps with high spatial resolution for monitoring dynamics among regions of interest worldwide, I anticipate future efforts to reveal information hidden by the limited spatial and temporal resolution of satellite imagery.
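One step in the processing chain above, temporal gap-filling, can be illustrated very simply: cloud-masked days in a per-pixel time series are filled by interpolating between the nearest clear observations. This is my own minimal sketch of that step, not the integrated system's implementation, and linear interpolation is an assumption:

```python
import numpy as np

# Temporal gap-filling sketch: NaN entries (cloud-masked days) in a daily
# NDVI time series are replaced by linear interpolation between the nearest
# clear-sky observations.
def fill_temporal_gaps(days, values):
    """Linearly interpolate NaN entries in a daily time series."""
    days = np.asarray(days, dtype=float)
    values = np.asarray(values, dtype=float)
    clear = ~np.isnan(values)
    return np.interp(days, days[clear], values[clear])

days = np.arange(6)
ndvi = [0.2, np.nan, np.nan, 0.5, np.nan, 0.7]   # two cloud gaps
filled = fill_temporal_gaps(days, ndvi)
```

Operational systems typically use more sophisticated smoothers, but the input/output contract (a gappy series in, a daily gap-free series out) is the same.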
Detailed spatial representations of terrestrial vegetation are essential for precision agricultural applications and for monitoring land cover changes in heterogeneous landscapes. The advent of satellite-based remote sensing has facilitated daily observations of the Earth's surface with high spatial resolution. In particular, a data fusion product such as Planet Fusion has realized the delivery of daily, gap-free surface reflectance data with 3-m pixel resolution through full utilization of relatively recent (i.e., 2018-) CubeSat constellation data. However, the spatial resolution of past satellite sensors (i.e., 30-60 m for Landsat) has restricted the detailed spatial analysis of past changes in vegetation. In Chapter 3, to overcome the spatial resolution constraint of Landsat data for long-term vegetation monitoring, we propose a dual remote-sensing super-resolution generative adversarial network (dual RSS-GAN) combining Planet Fusion and Landsat 8 data to simulate spatially enhanced long-term time series of the normalized difference vegetation index (NDVI) and near-infrared reflectance from vegetation (NIRv). We evaluated the performance of the dual RSS-GAN against in situ tower-based continuous measurements (up to 8 years) and remotely piloted aerial system-based maps of cropland and deciduous forest in the Republic of Korea. The dual RSS-GAN enhanced spatial representations in Landsat 8 images and captured seasonal variation in vegetation indices (R2 > 0.95, for the dual RSS-GAN maps vs. in situ data from all sites). Overall, the dual RSS-GAN reduced Landsat 8 vegetation index underestimation compared with in situ measurements; relative bias values of NDVI ranged from -3.2% to 1.2% and from -12.4% to -3.7% for the dual RSS-GAN and Landsat 8, respectively. This improvement was achieved by spatial enhancement through the dual RSS-GAN, which captured fine-scale information from Planet Fusion.
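The relative bias values quoted above compare satellite-derived indices against in situ measurements. A common definition (an assumption on my part; the dissertation may differ in detail) is the mean error normalized by the mean observation, in percent:

```python
import numpy as np

# Relative bias: mean(predicted - observed) / mean(observed) * 100.
# Negative values indicate that the satellite product underestimates the
# in situ measurements, as reported for the Landsat 8 vegetation indices.
def relative_bias_percent(predicted, observed):
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return 100.0 * (predicted - observed).mean() / observed.mean()

obs = np.array([0.40, 0.50, 0.60])       # synthetic in situ NDVI
pred = np.array([0.38, 0.48, 0.58])      # satellite NDVI, uniformly 0.02 low
bias = relative_bias_percent(pred, obs)  # negative => underestimation
```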
This study presents a new approach for the restoration of hidden sub-pixel spatial information in Landsat images. Mapping canopy photosynthesis at both high spatial and high temporal resolution is essential for carbon cycle monitoring in heterogeneous areas. However, well-established satellites in sun-synchronous orbits, such as Sentinel-2, Landsat, and MODIS, can provide either high spatial or high temporal resolution, but not both. Recently established CubeSat satellite constellations have created an opportunity to overcome this resolution trade-off. In particular, Planet Fusion allows full utilization of the CubeSat data resolution and coverage while maintaining high radiometric quality. In Chapter 4, I used the Planet Fusion surface reflectance product to calculate daily, 3-m resolution, gap-free maps of the near-infrared radiation reflected from vegetation (NIRvP). I then evaluated the performance of these NIRvP maps for estimating canopy photosynthesis by comparing them with data from a flux tower network in the Sacramento-San Joaquin Delta, California, USA. Overall, the NIRvP maps captured temporal variations in canopy photosynthesis at individual sites, despite changes in water extent in the wetlands and frequent mowing in the crop fields. When combining data from all sites, however, I found that robust agreement between NIRvP maps and canopy photosynthesis could only be achieved when matching the NIRvP maps to the flux tower footprints. In this case of matched footprints, the NIRvP maps showed considerably better performance than in situ NIRvP in estimating canopy photosynthesis, both for daily sums and for data around the time of satellite overpass (R2 = 0.78 vs. 0.60 for maps vs. in situ, in the satellite overpass case). This difference in performance was mostly due to the higher degree of consistency in the slopes of the NIRvP-canopy photosynthesis relationships across the study sites for the footprint-matched maps.
Our results show the importance of matching satellite observations to the flux tower footprint and demonstrate the potential of CubeSat constellation imagery to monitor canopy photosynthesis remotely at high spatio-temporal resolution.
Table of contents:
Chapter 1. Introduction
  1. Background: 1.1 Daily gap-free surface reflectance using geostationary satellite products; 1.2 Monitoring past vegetation changes with high spatial resolution; 1.3 High spatiotemporal resolution vegetation photosynthesis maps
  2. Purpose of Research
Chapter 2. Generating daily gap-filled BRDF-adjusted surface reflectance product at 10 m resolution using geostationary satellite product for monitoring daily canopy photosynthesis
  1. Introduction
  2. Methods: 2.1 Study sites; 2.2 In situ measurements; 2.3 Satellite products; 2.4 Integrated system; 2.5 Canopy photosynthesis; 2.6 Evaluation
  3. Results and discussion: 3.1 Comparison of STIF NDVI and NIRv with in situ NDVI and NIRv; 3.2 Comparison of STIF NIRvP with in situ NIRvP
  4. Conclusion
Chapter 3. Super-resolution of historic Landsat imagery using a dual Generative Adversarial Network (GAN) model with CubeSat constellation imagery for monitoring vegetation changes
  1. Introduction
  2. Methods: 2.1 Real-ESRGAN model; 2.2 Study sites; 2.3 In situ measurements; 2.4 Vegetation index; 2.5 Satellite data; 2.6 Planet Fusion; 2.7 Dual RSS-GAN via fine-tuned Real-ESRGAN; 2.8 Evaluation
  3. Results: 3.1 Comparison of NDVI and NIRv maps from Planet Fusion, Sentinel-2 NBAR, and Landsat 8 NBAR data with in situ NDVI and NIRv; 3.2 Comparison of dual RSS-GAN model results with Landsat 8 NDVI and NIRv; 3.3 Comparison of dual RSS-GAN model results with respect to in situ time-series NDVI and NIRv; 3.4 Comparison of the dual RSS-GAN model with NDVI and NIRv maps derived from RPAS
  4. Discussion: 4.1 Monitoring changes in terrestrial vegetation using the dual RSS-GAN model; 4.2 CubeSat data in the dual RSS-GAN model; 4.3 Perspectives and limitations
  5. Conclusion
  Appendices; Supplementary material
Chapter 4. Matching high resolution satellite data and flux tower footprints improves their agreement in photosynthesis estimates
  1. Introduction
  2. Methods: 2.1 Study sites; 2.2 In situ measurements; 2.3 Planet Fusion NIRvP; 2.4 Flux footprint model; 2.5 Evaluation
  3. Results: 3.1 Comparison of Planet Fusion NIRv and NIRvP with in situ NIRv and NIRvP; 3.2 Comparison of instantaneous Planet Fusion NIRv and NIRvP against tower GPP estimates; 3.3 Daily GPP estimation from Planet Fusion-derived NIRvP
  4. Discussion: 4.1 Flux tower footprint matching and effects of spatial and temporal resolution on GPP estimation; 4.2 Roles of the radiation component in GPP mapping; 4.3 Limitations and perspectives
  5. Conclusion
  Appendix; Supplementary Materials
Chapter 5. Conclusion
Bibliography
Abstract in Korean
Acknowledgements
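The NIRvP maps of Chapter 4 combine standard band math: NIRv is NDVI multiplied by NIR reflectance, and NIRvP multiplies NIRv by incoming photosynthetically active radiation (PAR). A per-pixel sketch with synthetic reflectance values (my illustration; the dissertation's exact processing of Planet Fusion bands may differ):

```python
import numpy as np

# NIRvP band math: NDVI = (NIR - red) / (NIR + red);
# NIRv = NDVI * NIR reflectance; NIRvP = NIRv * PAR.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def nirvp(nir, red, par):
    nirv = ndvi(nir, red) * nir   # near-infrared reflectance of vegetation
    return nirv * par             # scaled by incoming PAR (e.g., W m-2)

nir = np.array([[0.4, 0.5]])      # synthetic NIR reflectance pixels
red = np.array([[0.1, 0.1]])      # synthetic red reflectance pixels
par = 400.0                       # assumed PAR value for illustration
nirvp_map = nirvp(nir, red, par)
```

Applied to daily 3-m reflectance maps, this yields the daily gap-free NIRvP maps that are then compared against flux tower estimates of canopy photosynthesis.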

    Deep Learning based data-fusion methods for remote sensing applications

    In recent years, an increasing number of remote sensing sensors have been launched into orbit around the Earth, continuously producing massive amounts of data that are useful for a large number of monitoring applications. Although modern optical sensors provide rich spectral information about the Earth's surface at very high resolution, they are weather-sensitive. SAR images, on the other hand, are available even in the presence of clouds, are almost weather-insensitive, and can be acquired day and night, but they do not provide rich spectral information and are severely affected by speckle "noise", which makes information extraction difficult. For these reasons, it is both worthwhile and challenging to fuse data provided by different sources and/or acquired at different times, in order to leverage their diversity and complementarity to retrieve the target information. Motivated by the success of deep learning methods in many image processing tasks, this thesis addresses several typical remote sensing data-fusion problems by means of suitably designed convolutional neural networks.
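A common first step in SAR-optical fusion of the kind described above is preparing a joint multi-channel input for the network: converting SAR backscatter to decibels to compress its speckle-heavy dynamic range, standardizing each channel, and stacking everything on one grid. This is my own illustrative sketch with synthetic data, not the thesis code:

```python
import numpy as np

# Prepare a co-registered SAR + optical stack as CNN fusion input:
# SAR linear power -> dB, per-channel standardization, channel concatenation.
rng = np.random.default_rng(2)
h, w = 64, 64
optical = rng.random((4, h, w))                                 # e.g., B, G, R, NIR
sar_linear = rng.gamma(shape=4.0, scale=0.05, size=(2, h, w))   # e.g., VV, VH power

sar_db = 10.0 * np.log10(sar_linear)      # compress speckle-heavy dynamics

def standardize(channels):
    """Zero-mean, unit-variance scaling per channel."""
    mean = channels.mean(axis=(1, 2), keepdims=True)
    std = channels.std(axis=(1, 2), keepdims=True)
    return (channels - mean) / std

# Six-channel input tensor for a fusion CNN.
fusion_input = np.concatenate([standardize(optical), standardize(sar_db)], axis=0)
```

Standardizing SAR and optical channels separately keeps their very different value ranges from dominating the early convolutional layers.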