
    Deep learning for inverse problems in remote sensing: super-resolution and SAR despeckling

    The abstract is provided in the attached document.

    Synthetic Aperture Radar (SAR) Meets Deep Learning

    This reprint focuses on the combination of synthetic aperture radar and deep learning technology. It aims to further promote the development of intelligent SAR image interpretation. Synthetic aperture radar (SAR) is an important active microwave imaging sensor whose day-and-night, all-weather imaging capability gives it an important place in the remote sensing community. Since the United States launched the first SAR satellite, SAR has received much attention in applications such as geological exploration, topographic mapping, disaster forecasting, and traffic monitoring. It is therefore valuable to study SAR-based remote sensing applications. In recent years, deep learning, represented by convolutional neural networks, has driven significant progress in computer vision, e.g., in face recognition, autonomous driving, and the Internet of Things (IoT). Deep learning enables computational models with multiple processing layers to learn data representations at multiple levels of abstraction, which can greatly improve the performance of various applications. This reprint provides a platform for researchers to address these challenges and present innovative, cutting-edge research on applying deep learning to SAR in various manuscript types, e.g., articles, letters, reviews, and technical reports.

    Speckle reduction in SAR imagery

    Synthetic Aperture Radar (SAR) is a popular tool for airborne and space-borne remote sensing. Inherent to SAR imagery is a type of multiplicative noise known as speckle. A number of different approaches may be taken to reduce the amount of speckle noise in SAR imagery. One of these, termed post-image-formation processing, is the main concern of this thesis. Background theory relevant to the speckle reduction problem is presented. The physical processes which lead to the formation of speckle are investigated in order to understand the nature of speckle noise. Various statistical properties of speckle noise in different types of SAR images are presented, including probability distribution functions as well as means and standard deviations. Speckle is treated as multiplicative noise and a general model is discussed. The last section of this chapter deals with the various approaches to speckle reduction. Chapter three contains a review of the literature pertaining to speckle reduction. Multiple-look methods are covered briefly and then the various classes of post-image-formation processing are reviewed. A number of non-adaptive, adaptive and segmentation-based techniques are reviewed. Other classes of technique reviewed include morphological filtering, homomorphic processing and transform-domain methods. From this review, insights can be gained into the advantages and disadvantages of the various methods. A number of filtering algorithms which are either promising, or are representative of a class of techniques, are chosen for implementation and analysis.
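
    The multiplicative model and the adaptive filters surveyed above can be illustrated concretely. The sketch below is a minimal, generic example (not taken from the thesis) of a classical Lee-type adaptive filter applied to the multiplicative speckle model I = R * n, where n is unit-mean speckle; the window size and the number of looks are assumed values.

    # Illustrative sketch (not from the thesis): the multiplicative speckle model
    # I = R * n, with n unit-mean noise, and a classical adaptive (Lee-type) filter
    # that estimates R from local statistics. Window size and number of looks are
    # assumed values.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def lee_filter(image, window=7, looks=4):
        """Adaptive Lee-type despeckling of an intensity SAR image."""
        img = image.astype(np.float64)
        local_mean = uniform_filter(img, size=window)
        local_sq_mean = uniform_filter(img ** 2, size=window)
        local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)

        # Speckle coefficient of variation squared for an L-look intensity image: 1/L
        cu2 = 1.0 / looks
        ci2 = local_var / (local_mean ** 2 + 1e-12)

        # Weight between the local mean (flat areas) and the observed pixel (edges)
        w = np.clip(1.0 - cu2 / (ci2 + 1e-12), 0.0, 1.0)
        return local_mean + w * (img - local_mean)

    # Usage: simulate a 4-look speckled image and filter it
    rng = np.random.default_rng(0)
    reflectance = np.ones((128, 128)); reflectance[:, 64:] = 4.0
    speckle = rng.gamma(shape=4, scale=1.0 / 4, size=reflectance.shape)  # unit-mean speckle
    filtered = lee_filter(reflectance * speckle, window=7, looks=4)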

    Independent component analysis (ICA) applied to ultrasound image processing and tissue characterization

    As a complicated, ubiquitous phenomenon encountered in ultrasound imaging, speckle can be treated either as annoying noise that needs to be reduced or as a source from which diagnostic information can be extracted to reveal the underlying properties of tissue. In this study, the application of Independent Component Analysis (ICA), a relatively new statistical signal processing tool that has emerged in recent years, to both the speckle texture analysis and despeckling problems of B-mode ultrasound images was investigated. It is believed that higher order statistics may provide extra information about the speckle texture beyond that provided by first and second order statistics alone. However, the higher order statistics of speckle texture are still not clearly understood and are very difficult to model analytically; dealing with higher order statistics directly is computationally prohibitive. On the one hand, many conventional ultrasound speckle texture analysis algorithms use only first or second order statistics. On the other hand, many multichannel filtering approaches use pre-defined analytical filters which are not adaptive to the data. In this study, an ICA-based multichannel filtering texture analysis algorithm, which considers both higher order statistics and data adaptation, was proposed and tested on numerically simulated homogeneous speckle textures. The ICA filters were learned directly from the training images. Histogram regularization was conducted to make the speckle images quasi-stationary in the wide sense, so as to be amenable to an ICA algorithm. Both Principal Component Analysis (PCA) and a greedy algorithm were used to reduce the dimension of the feature space. Finally, Support Vector Machines (SVM) with a Radial Basis Function (RBF) kernel were chosen as the classifier to achieve the best classification accuracy. Several representative conventional methods, including both low and high order statistics based methods, and both filtering and non-filtering methods, were chosen for a comparison study. The numerical experiments showed that the proposed ICA-based algorithm outperforms the comparison algorithms in many cases. Two-component texture segmentation experiments were conducted, and the proposed algorithm showed a strong capability of segmenting two visually very similar yet different texture regions with rather fuzzy boundaries and almost the same mean and variance. By simulating speckle whose first order statistics approach the Rayleigh model gradually from different non-Rayleigh models, the experiments reveal, to some extent, how the behavior of higher order statistics changes with the underlying properties of tissue. It is demonstrated that when the speckle approaches the Rayleigh model, both the second and higher order statistics lose their texture differentiation capability. However, when the speckle tends towards non-Rayleigh models, methods based on higher order statistics show a strong advantage over those based solely on first or second order statistics. The proposed algorithm may potentially find clinical application in the early detection of soft tissue disease, and may also be helpful for better understanding the ultrasound speckle phenomenon from the perspective of higher order statistics. For the despeckling problem, an algorithm was proposed which adapts the ICA Sparse Code Shrinkage (ICA-SCS) method to the ultrasound B-mode image despeckling problem by applying an appropriate preprocessing step proposed by other researchers. The preprocessing step makes the speckle noise much closer to real white Gaussian noise (WGN) and hence more amenable to a denoising algorithm such as ICA-SCS, which is strictly designed for additive WGN. A discussion is given of the various ways to obtain noise-free training image samples. The experimental results show that the proposed method outperforms several classical methods chosen for comparison, including first or second order statistics based methods (such as the Wiener filter) and multichannel filtering methods (such as wavelet shrinkage), in terms of both speckle reduction and edge preservation.
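
    The abstract does not specify the preprocessing step borrowed from other researchers; a logarithmic (homomorphic) transform is one common way to turn multiplicative speckle into approximately additive, near-Gaussian noise, and that assumption is used in the minimal sketch of ICA sparse code shrinkage below. The patch size, shrinkage threshold and the use of scikit-learn's FastICA are illustrative choices, not the configuration of the cited work.

    # Minimal sketch of ICA sparse code shrinkage (ICA-SCS) for despeckling,
    # assuming a log transform as the preprocessing that makes multiplicative
    # speckle approximately additive and Gaussian. Patch size, threshold and
    # FastICA settings are illustrative assumptions.
    import numpy as np
    from sklearn.decomposition import FastICA
    from sklearn.feature_extraction.image import extract_patches_2d

    def learn_ica_basis(clean_image, patch_size=8, n_components=32, seed=0):
        """Learn ICA filters from patches of noise-free training images."""
        patches = extract_patches_2d(clean_image, (patch_size, patch_size),
                                     max_patches=5000, random_state=seed)
        X = patches.reshape(len(patches), -1).astype(np.float64)
        X -= X.mean(axis=1, keepdims=True)          # remove patch DC component
        ica = FastICA(n_components=n_components, random_state=seed, max_iter=500)
        ica.fit(X)
        return ica

    def despeckle_patch(ica, noisy_patch, threshold=0.1):
        """Soft-shrink the sparse codes of one (log-transformed) noisy patch."""
        x = noisy_patch.reshape(1, -1).astype(np.float64)
        dc = x.mean()
        s = ica.transform(x - dc)                   # sparse codes
        s = np.sign(s) * np.maximum(np.abs(s) - threshold, 0.0)   # soft shrinkage
        return (ica.inverse_transform(s) + dc).reshape(noisy_patch.shape)

    # Usage: log-transform the speckled image, denoise patch-wise, then exponentiate:
    # log_img = np.log(speckled_image + 1e-6)
    # denoised_patch = despeckle_patch(ica, log_img[:8, :8])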

    Improved Non-Local Means Algorithm Based on Dimensionality Reduction

    Non-Local Means is an image denoising algorithm based on patch similarity. It compares a reference patch with neighboring patches to find similar ones, and these similar patches participate in a weighted averaging process. Most of the computational time of Non-Local Means is spent measuring patch similarity. In this thesis, we propose an improvement in which the image patches are projected into a global feature space and a statistical t-test is performed to reduce the dimensionality of this feature space. Denoising is performed in this reduced feature space, and the proposed modification yields improvements in both denoising performance and computational time.
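
    A minimal sketch of the idea described above: image patches are projected into a low-dimensional feature space and the Non-Local Means weights are computed from the reduced features. PCA stands in here for the t-test based feature selection used in the thesis, and the patch size, search radius and smoothing parameter h are assumed values.

    # Non-Local Means with patch dimensionality reduction (illustrative sketch).
    # PCA replaces the thesis's t-test based selection; parameters are assumptions.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.feature_extraction.image import extract_patches_2d

    def nlm_reduced(image, patch=5, search=7, h=0.1, n_components=6, seed=0):
        pad = patch // 2
        padded = np.pad(image, pad, mode='reflect')
        rows, cols = image.shape

        # Collect all patches and project them into a low-dimensional feature space.
        patches = extract_patches_2d(padded, (patch, patch)).reshape(rows * cols, -1)
        feats = PCA(n_components=n_components, random_state=seed).fit_transform(patches)
        feats = feats.reshape(rows, cols, n_components)

        out = np.zeros_like(image, dtype=np.float64)
        r = search // 2
        for i in range(rows):
            for j in range(cols):
                i0, i1 = max(0, i - r), min(rows, i + r + 1)
                j0, j1 = max(0, j - r), min(cols, j + r + 1)
                # Patch similarity is measured in the reduced feature space only.
                d2 = np.sum((feats[i0:i1, j0:j1] - feats[i, j]) ** 2, axis=-1)
                w = np.exp(-d2 / (h ** 2))
                out[i, j] = np.sum(w * image[i0:i1, j0:j1]) / np.sum(w)
        return out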

    Ultrasound image processing in the evaluation of labor induction failure risk

    Labor induction is defined as the artificial stimulation of uterine contractions for the purpose of vaginal birth. Induction is prescribed for medical and elective reasons. Success of a labor induction procedure is defined as achieving vaginal delivery. Cesarean section is one of the potential risks of labor induction, occurring in about 20% of inductions. A ripe cervix (soft and distensible) is needed for a successful labor. During ripening, cervical tissues experience microstructural changes: collagen becomes disorganized and water content increases. These changes affect the interaction between cervical tissues and sound waves during transvaginal ultrasound scanning and are perceived as gray-level intensity variations in the echographic image. Texture analysis can be used to analyze these variations and provides a means of evaluating cervical ripening non-invasively.
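
    The abstract does not name the texture descriptors used; gray-level co-occurrence matrix (GLCM) features are one common choice for quantifying echotexture in B-mode images, and the sketch below shows how such features could be computed with scikit-image. The feature set, distances, angles and quantization are assumptions for illustration only.

    # Illustrative sketch: GLCM texture features of the kind commonly used to
    # quantify echotexture in B-mode ultrasound. The specific descriptors used in
    # the cited study are not stated in the abstract, so this feature set is an
    # assumption.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_features(roi, levels=32, distances=(1, 2), angles=(0, np.pi / 2)):
        """Compute GLCM texture descriptors for an 8-bit grayscale region of interest."""
        q = np.uint8(np.floor(roi.astype(np.float64) / 256.0 * levels))  # quantize to `levels` gray levels
        glcm = graycomatrix(q, distances=distances, angles=angles,
                            levels=levels, symmetric=True, normed=True)
        return {prop: graycoprops(glcm, prop).mean()
                for prop in ('contrast', 'homogeneity', 'energy', 'correlation')}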

    Information Extraction and Modeling from Remote Sensing Images: Application to the Enhancement of Digital Elevation Models

    To deal with highly complex data such as remote sensing images with metric resolution over large areas, an innovative, fast and robust image processing system is presented. The modeling of increasing levels of information is used to extract, represent and link image features to semantic content. The potential of the proposed techniques is demonstrated with an application that enhances and regularizes digital elevation models based on information collected from RS images.

    Quantitative Estimation of Surface Soil Moisture in Agricultural Landscapes using Spaceborne Synthetic Aperture Radar Imaging at Different Frequencies and Polarizations

    Soil moisture and its distribution in space and time play an important role in the surface energy balance at the soil-atmosphere interface. Soil moisture is a key variable influencing the partitioning of solar energy into latent and sensible heat flux as well as the partitioning of precipitation into runoff and percolation. Due to its large spatial variability, estimating spatial patterns of soil moisture from field measurements is difficult and not feasible for large-scale analyses. In the past decades, Synthetic Aperture Radar (SAR) remote sensing has proven its potential to quantitatively estimate near-surface soil moisture at high spatial resolutions. Since knowledge of the basic SAR concepts is important to understand the impact of different natural terrain features on the quantitative estimation of soil moisture and other surface parameters, the fundamental principles of synthetic aperture radar imaging are discussed. The two spaceborne SAR missions whose data were used in this study, ENVISAT of the European Space Agency (ESA) and ALOS of the Japan Aerospace Exploration Agency (JAXA), are also introduced. Subsequently, the two essential surface properties in the field of radar remote sensing, surface soil moisture and surface roughness, are defined, and the established methods of their measurement are described. The in situ data used in this study are specified, as well as the research area, the River Rur catchment, and the individual test sites where the data were collected between 2007 and 2010. On this basis, the important scattering theories in radar polarimetry are discussed and their application is demonstrated using novel polarimetric ALOS/PALSAR data. A critical review of different classical approaches to invert soil moisture from SAR imaging is provided. Five prevalent models were chosen with the aim of providing an overview of the evolution of ideas and techniques in the field of soil moisture estimation from active microwave data. As the core of this work, a new semi-empirical model for the inversion of surface soil moisture from dual-polarimetric L-band SAR data is introduced. This novel approach utilizes advanced polarimetric decomposition techniques to correct for the disturbing effects of surface roughness and vegetation on the soil moisture retrieval without the use of a priori knowledge. The land-use-specific algorithms for bare soil, grassland, sugar beet, and winter wheat allow quantitative estimates with accuracies on the order of 4 Vol.-%. The application of remotely sensed soil moisture patterns is demonstrated on the basis of mesoscale SAR data by investigating the variability of soil moisture patterns at different spatial scales, ranging from field scale to catchment scale. The results show that the variability of surface soil moisture decreases with increasing wetness at all scales. Finally, the conclusions from this dissertation research are summarized and future perspectives on how to extend the proposed model by means of improved ground-based measurements and upcoming advances in sensor technology are discussed. The results obtained in this thesis lead to the conclusion that state-of-the-art spaceborne dual-polarimetric L-band SAR systems are not only suitable for accurately retrieving surface soil moisture contents of bare as well as vegetated agricultural fields and grassland, but for the first time also allow investigating within-field spatial heterogeneities from space.
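
    The abstract does not give the functional form of the semi-empirical dual-polarimetric model, so the sketch below only illustrates the general inversion workflow with a generic linear relation between backscatter in dB and volumetric soil moisture, calibrated per land-use class against in situ samples. The linear form and the clipping range are assumptions and are not the model developed in the thesis.

    # Generic illustration of an empirical soil-moisture inversion: calibrate a
    # linear relation between SAR backscatter (dB) and volumetric soil moisture
    # using in situ samples, then invert it. This is NOT the dual-polarimetric,
    # decomposition-based model of the thesis; it only shows the workflow.
    import numpy as np

    def calibrate(sigma0_db, mv_insitu):
        """Fit mv = a * sigma0_dB + b for one land-use class (e.g. grassland)."""
        a, b = np.polyfit(sigma0_db, mv_insitu, deg=1)
        return a, b

    def invert(sigma0_db, a, b):
        """Estimate volumetric soil moisture (Vol.-%) from calibrated coefficients."""
        return np.clip(a * np.asarray(sigma0_db) + b, 0.0, 60.0)

    # Usage with hypothetical field measurements:
    # a, b = calibrate(np.array([-14.2, -11.8, -9.5]), np.array([12.0, 22.0, 33.0]))
    # mv_map = invert(sigma0_db_image, a, b)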

    An Iterative Wavelet Threshold for Signal Denoising

    This paper introduces an adaptive filtering process based on shrinking the wavelet coefficients of the corresponding signal wavelet representation. The filtering procedure uses a threshold determined by an iterative algorithm inspired by control charts, a tool of statistical process control (SPC). The proposed method, called SpcShrink, is able to discriminate wavelet coefficients that significantly represent the signal of interest. SpcShrink is presented algorithmically and evaluated numerically in Monte Carlo simulations. Two empirical applications to real biomedical data filtering are also included and discussed. SpcShrink shows superior performance when compared with competing algorithms.
    Comment: 19 pages, 10 figures, 2 tables
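
    The exact control-chart rule behind the SpcShrink threshold is not reproduced in the abstract; one plausible reading, sketched below with PyWavelets, treats the detail coefficients as an SPC sample, iteratively flags coefficients outside mean + k*sigma as signal, re-estimates the limit on the remaining "in control" coefficients, and finally applies soft thresholding with the converged limit. The wavelet, decomposition level and k are assumptions.

    # Iterative, control-chart style wavelet threshold in the spirit of SpcShrink.
    # The exact SpcShrink rule is not given in the abstract; this is one plausible
    # interpretation with assumed wavelet, level and k.
    import numpy as np
    import pywt

    def spc_threshold(coeffs, k=3.0, max_iter=50):
        """Iteratively estimate a noise threshold from detail coefficients."""
        c = np.abs(coeffs).ravel()
        mask = np.ones_like(c, dtype=bool)
        for _ in range(max_iter):
            mu, sigma = c[mask].mean(), c[mask].std()
            new_mask = c <= mu + k * sigma          # keep "in control" coefficients
            if np.array_equal(new_mask, mask):
                break
            mask = new_mask
        return mu + k * sigma

    def denoise(signal, wavelet='db4', level=4, k=3.0):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        out = [coeffs[0]]                            # keep approximation untouched
        for d in coeffs[1:]:
            t = spc_threshold(d, k=k)
            out.append(pywt.threshold(d, t, mode='soft'))
        return pywt.waverec(out, wavelet)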