Deep Learning Solutions for TanDEM-X-based Forest Classification
In the last few years, deep learning (DL) has been successfully and massively
employed in computer vision for discriminative tasks, such as image
classification or object detection. These kinds of problems are core to many
remote sensing (RS) applications as well, though with domain-specific
peculiarities. Therefore, there is growing interest in the use of DL methods
for RS tasks. Here, we consider the forest/non-forest classification problem
with TanDEM-X data and test two state-of-the-art DL models, suitably adapting
them to the specific task. Our experiments confirm the great potential of DL
methods for RS applications.
Mapping a Brazilian deforestation frontier using multi-temporal TerraSAR-X data and supervised machine learning
Satellite remote sensing enables repeated surveys of the Earth's surface. With machine
learning it is possible to recognize complex patterns in extensive data sets. Using methods
from machine learning, remote sensing images are utilized to derive large-scale land use
and land cover (LULC) maps, carrying discrete information on the human management of
land and on intact primary forests, as well as on change processes. Such information is particularly
relevant in less-developed regions and in areas undergoing transformation. Therefore,
satellite remote sensing is generally the preferred method for generating LULC products
within tropical regions, and it is particularly useful for tracking change processes related
to deforestation and land management. The Amazon is the largest area of continuous
tropical forest in the world and is of substantial importance with regard to biodiversity and its
influence on the global climate, as well as providing living space for a large number of indigenous
tribes. As a tropical region, the Amazon is particularly affected by cloudy conditions, which
pose a serious challenge to many remote sensing efforts. The use of synthetic aperture
radar (SAR) is therefore promoted, as it guarantees data availability at fixed intervals.
The aim of this thesis is to evaluate the latest concepts from machine learning and SAR
remote sensing in light of their real-world applicability, by performing land cover mapping
at the deforestation frontier in the Brazilian states of Pará and Mato Grosso. As a cumulative
effort, this thesis provides a scalable method based on Markov random fields to increase
classification performance. This method is especially useful for enhancing the outcome of SAR
classifications, as it directly addresses inherent SAR properties such as multi-temporality and speckle.
Furthermore, ALOS-2, RADARSAT-2, and TerraSAR-X, current SAR sensors with
different properties with regard to ground resolution and wavelength, are
investigated concerning their synergistic potential for mapping vegetated LULC classes
of the Brazilian Amazon. Here, the added value of combining multiple frequencies is
evaluated using reliable validation techniques based on area adjustment. Additionally, the
individual performance of the three sensors is evaluated and their potential for the task of
tropical mapping is estimated. Lastly, the potential of TanDEM-X for the purpose of
tropical mapping is investigated. TanDEM-X is the first continuous spaceborne mission
to offer bi-static acquisition of data, enabling the generation of height models and the
collection of coherence layers in a single pass.
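The thesis's MRF-based post-processing is not detailed in this abstract. As a rough, hypothetical illustration of the general idea (not the author's actual method), the following sketch smooths a noisy forest/non-forest label map with iterated conditional modes (ICM), trading per-pixel class scores against spatial agreement with the four neighbors; all names and the Potts prior are assumptions for the example.

```python
import numpy as np

def icm_smooth(scores, beta=1.0, n_iter=5):
    """Smooth a label map with iterated conditional modes (ICM).

    scores : (H, W, K) array of per-pixel class scores (higher = better)
    beta   : weight of the Potts smoothness prior
    Returns an (H, W) integer label map.
    """
    labels = scores.argmax(axis=-1)
    H, W, K = scores.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                # Gather the labels of the 4-connected neighbors.
                neigh = []
                if i > 0: neigh.append(labels[i - 1, j])
                if i < H - 1: neigh.append(labels[i + 1, j])
                if j > 0: neigh.append(labels[i, j - 1])
                if j < W - 1: neigh.append(labels[i, j + 1])
                # Energy of class k: -score + beta * (number of disagreeing neighbors).
                energies = [-scores[i, j, k]
                            + beta * sum(k != n for n in neigh)
                            for k in range(K)]
                labels[i, j] = int(np.argmin(energies))
    return labels
```

With a sufficiently large beta, isolated speckle-induced misclassifications are flipped to agree with their surroundings, which is exactly the kind of effect a spatial MRF prior provides on noisy SAR classifications.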
Speckle Reduction of SAR Images Based on a Combined Markov Random Field and Statistical Optics Approach (Version 1)
One of the major factors plaguing the performance of synthetic aperture radar (SAR) imagery is the presence of signal-dependent speckle noise. Grainy in appearance, speckle noise is primarily due to the phase fluctuations of the returned electromagnetic signals. Since the inherent spatial-correlation characteristics of speckle in SAR images are not exploited in existing multiplicative models for speckle noise, a new approach is proposed here that provides a new mathematical framework for modeling and mitigation of speckle noise. The contribution of this paper is twofold. First, a novel model for speckled SAR imaging is introduced based on Markov random fields (MRFs) in conjunction with statistical optics. Second, utilizing the model, a global energy-minimization algorithm based on simulated annealing (SA) is introduced for speckle reduction. In particular, the joint conditional probability density function (cpdf) of the intensity of any two points in the speckled image and the associated correlation function are used to derive the cpdf of any center pixel intensity given its four neighbors. The Hammersley-Clifford theorem is then used to derive the energy function associated with the MRF. The SA, built on the Metropolis sampler, is employed for speckle reduction. Four metrics are used to assess the quality of the speckle reduction: the mean-square error, SNR, an edge-preservation parameter, and the equivalent number of looks. A comparative study using both simulations and real SAR images indicates that the proposed approach performs better in comparison to filtering techniques such as the Gamma MAP, the modified Lee, and the enhanced Frost algorithms.
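The multiplicative speckle model and the equivalent-number-of-looks (ENL) metric mentioned above can be illustrated in a few lines. This is a generic sketch, not the paper's MRF/SA method: L-look intensity speckle multiplies the true reflectivity by unit-mean Gamma-distributed noise, and ENL = mean²/variance over a homogeneous region recovers L.

```python
import numpy as np

def add_speckle(reflectivity, looks, rng):
    """Multiply by unit-mean Gamma noise: the standard L-look intensity speckle model."""
    noise = rng.gamma(shape=looks, scale=1.0 / looks, size=reflectivity.shape)
    return reflectivity * noise

def enl(region):
    """Equivalent number of looks over a homogeneous region: mean^2 / variance."""
    return region.mean() ** 2 / region.var()

rng = np.random.default_rng(0)
flat = np.full((512, 512), 100.0)          # homogeneous scene
speckled = add_speckle(flat, looks=4, rng=rng)
print(enl(speckled))                        # close to 4 for 4-look speckle
```

A speckle filter is then judged, among other criteria, by how much it raises the ENL on homogeneous areas without blurring edges, which is why ENL appears alongside the edge-preservation parameter in the paper's evaluation.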
Joint filtering of SAR amplitude and interferometric phase with graph-cuts
Like other coherent imaging modalities, synthetic aperture radar (SAR) images suffer from speckle noise. The presence
of this noise makes the automatic interpretation of images a challenging task and noise reduction is often a prerequisite
for successful use of classical image processing algorithms.
[Figure: regularization results for weights less than (sub-figure 1, under-regularized), equal to (sub-figure 2), or greater than (sub-figure 3, over-regularized) βopt.]
Section IV-B presents some results of the joint regularization of high-resolution interferometric SAR images on two
datasets: a 1200 × 1200 pixels region of interest from Toulouse city, France (figure 5), and a 1024 × 682 pixels
region of interest from Saint-Paul sur Mer, France (figure 7).
From the regularized images shown, it can be seen that the noise has been efficiently reduced in both the amplitude and
phase images. The sharp transitions in the phase image that correspond to man-made structures are well preserved.
Joint regularization gives more precise contours than independent regularization, as they are co-located in the phase
and amplitude images. Small objects also tend to be better preserved by joint regularization, as illustrated in figure 6,
which shows an excerpt of a street with several aligned streetlights visible as brighter dots (higher reflectivity
as well as higher altitude).
Many approaches have been proposed for filtering images corrupted by speckle noise. Markov
random field (MRF) modeling provides a suitable framework for expressing both the data-fidelity
constraints and the desired properties of the filtered image. In this context, total-variation
minimization has been used extensively to limit oscillations in the regularized image while
preserving edges.
Speckle noise follows a heavy-tailed probability distribution, and the MRF formulation leads
to a minimization problem involving non-convex data terms. Such a minimization can be carried
out by a combinatorial optimization approach based on computing minimum cuts in graphs.
Although this optimization is feasible in theory, such approaches cannot be applied in practice
to the large images encountered in remote sensing applications because of their high memory
consumption. The computation time of approximate minimization algorithms (in particular
α-expansion) is generally too high when the joint regularization of several images is considered.
We show that a satisfactory solution can be obtained, in a few iterations, by exploring the
search space with large moves, performed using minimum-cut techniques. This algorithm is
applied to jointly regularize both the amplitude and the interferometric phase of SAR images
of urban areas.
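The energy minimized by such a graph-cut approach combines a non-convex data term derived from the speckle likelihood with a total-variation prior. The following is a minimal, illustrative sketch of one such energy for single-look intensity images (an exponential likelihood is assumed here; the paper's exact formulation may differ):

```python
import numpy as np

def energy(x, y, beta):
    """MRF energy for single-look intensity speckle under an exponential
    likelihood: data term sum(log x + y/x) plus a total-variation prior.

    x : candidate reflectivity image (positive values)
    y : observed noisy intensity image
    """
    data = np.sum(np.log(x) + y / x)            # non-convex in x
    tv = (np.abs(np.diff(x, axis=0)).sum()      # vertical differences
          + np.abs(np.diff(x, axis=1)).sum())   # horizontal differences
    return data + beta * tv
```

A move-making minimizer (such as the α-expansion scheme cited above, or the large-move exploration proposed here) searches over discretized reflectivity levels for a labeling with lower energy than the current one.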
Improved Goldstein Interferogram Filter Based on Local Fringe Frequency Estimation
The quality of an interferogram, which is degraded by various sources of phase noise, greatly affects further InSAR processing steps such as phase unwrapping. For interferometric SAR (InSAR) geophysical measurements, such as height or displacement, phase filtering is therefore an essential step. In this work, an improved Goldstein interferogram filter is proposed to suppress the phase noise while preserving the fringe edges. First, the proposed adaptive filtering step, performed before frequency estimation, is employed to improve the estimation accuracy. Subsequently, to preserve the fringe characteristics, the estimated fringe frequency in each fixed filtering patch is removed from the original noisy phase. Then, the residual phase is smoothed with the modified Goldstein filter, whose parameter alpha depends on both the coherence map and the residual phase frequency. Finally, the filtered residual phase and the removed fringe frequency are combined to generate the filtered interferogram, with the loss of signal minimized while reducing the noise level. The effectiveness of the proposed method is verified by experimental results on both simulated and real data.
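For reference, the core of the original Goldstein filter that this paper modifies operates patch-wise in the spectral domain: the spectrum of each complex interferogram patch is weighted by its own normalized magnitude raised to a power alpha, so dominant fringe frequencies are reinforced while broadband noise is attenuated. A minimal sketch, without the spectral smoothing and overlapping patches of a full implementation:

```python
import numpy as np

def goldstein_patch(patch, alpha):
    """Filter one complex interferogram patch: weight its spectrum by
    (|spectrum| / max)^alpha, so alpha = 0 leaves the patch unchanged."""
    spec = np.fft.fft2(patch)
    mag = np.abs(spec)
    weight = (mag / mag.max()) ** alpha   # emphasize dominant fringe frequencies
    return np.fft.ifft2(weight * spec)
```

A pure fringe, whose energy sits in a single spectral bin, passes through unchanged for any alpha, which is why removing the estimated fringe frequency before filtering (as this paper does) helps preserve fringe detail.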
Global approaches and local strategies for phase unwrapping
Phase unwrapping, i.e. the retrieval of absolute phases from wrapped, noisy measurements, is a tough problem because of the presence of rotational inconsistencies (residues), randomly generated by noise and undersampling in the principal phase gradient field. These inconsistencies prevent the recovery of the absolute phase field by direct integration of the wrapped gradients. In this paper we examine the relative merits of known global approaches, and we then present evidence that our approach based on "stochastic annealing" can recover the true phase field even in noisy areas with severe undersampling, where other methods fail. Then, some experiments with local approaches are presented. A fast neural filter has been trained to eliminate close residue couples by joining them in a way which takes into account the local phase information; it eliminates about 60–70% of the residues. Finally,
other experiments have been aimed at designing an automated method for the determination of weight matrices to be used in conjunction with local phase unwrapping algorithms. The method, tested with the minimum cost flow algorithm, gives good performance on both simulated and real data.
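The residues discussed above can be detected directly: sum the wrapped phase differences around each elementary 2×2 pixel loop; a sum of ±2π marks a positive or negative residue, while a consistent (integrable) field gives zero everywhere. A small sketch of this standard test (function names are ours):

```python
import numpy as np

def wrap(p):
    """Wrap phase values into (-pi, pi]."""
    return np.angle(np.exp(1j * p))

def residues(phase):
    """Sum of wrapped gradients around every 2x2 loop, in units of 2*pi:
    +1 / -1 entries mark positive / negative residues."""
    d1 = wrap(phase[:-1, 1:] - phase[:-1, :-1])   # top edge, left to right
    d2 = wrap(phase[1:, 1:] - phase[:-1, 1:])     # right edge, downward
    d3 = wrap(phase[1:, :-1] - phase[1:, 1:])     # bottom edge, right to left
    d4 = wrap(phase[:-1, :-1] - phase[1:, :-1])   # left edge, upward
    return np.rint((d1 + d2 + d3 + d4) / (2 * np.pi)).astype(int)
```

A smooth ramp produces no residues, while a phase vortex yields exactly one; branch-cut and minimum-cost-flow unwrappers both start from this residue map.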
- …