72 research outputs found

    Insights into Titan’s geology and hydrology based on enhanced image processing of Cassini RADAR data

    The Cassini Synthetic Aperture Radar has been acquiring images of Titan's surface since October 2004. To date, 59% of Titan's surface has been imaged by radar, with significant regions imaged more than once. Radar data suffer from speckle noise that hinders the interpretation of small-scale features and the comparison of reimaged regions for change detection. We present here a new image analysis technique that combines a denoising algorithm with mapping and quantitative measurements; it greatly enhances the utility of the data and offers previously unattainable insights. After validating the technique, we demonstrate the potential improvement in understanding surface processes on Titan and in defining global mapping units, focusing on specific landforms including lakes, dunes, mountains, and fluvial features. Lake shorelines are delineated with greater accuracy. Previously unrecognized dissection by fluvial channels emerges beneath shallow methane cover. Dune wavelengths and interdune extents are more precisely measured. A significant refinement in producing digital elevation models is shown. Interactions of fluvial and aeolian processes with topographic relief are observed and understood more precisely than before. Benches in bathymetry are observed in the northern sea Ligeia Mare. Submerged valleys show similar depths, suggesting that they are equilibrated with the marine benches. These new observations suggest a liquid-level increase in the northern sea, which may be due to changes on seasonal or longer timescales.
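
    The abstract does not name the denoising algorithm used, so as a minimal, self-contained illustration of adaptive speckle reduction on SAR imagery, the sketch below implements a classic Lee filter in Python; the window size and equivalent number of looks (enl) are assumed values rather than parameters taken from the paper.

        # Minimal Lee-filter sketch for SAR despeckling (illustrative only;
        # the paper's actual denoising algorithm is not specified here).
        import numpy as np
        from scipy.ndimage import uniform_filter

        def lee_filter(img, win=7, enl=4.0):
            """img: 2-D intensity image; win: window size; enl: assumed looks."""
            img = img.astype(float)
            mean = uniform_filter(img, win)                 # local mean
            var = np.maximum(uniform_filter(img * img, win) - mean**2, 0.0)
            cu2 = 1.0 / enl                                 # speckle variation^2
            ci2 = var / np.maximum(mean**2, 1e-12)          # local variation^2
            w = np.clip(1.0 - cu2 / np.maximum(ci2, 1e-12), 0.0, 1.0)
            return mean + w * (img - mean)                  # adaptive smoothing

    A production pipeline would substitute the paper's denoising algorithm; the Lee filter merely shows the local-statistics idea of smoothing homogeneous areas while preserving detail.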

    Low Complexity Regularization of Linear Inverse Problems

    Inverse problems and regularization theory form a central theme in contemporary signal processing, where the goal is to reconstruct an unknown signal from partial, indirect, and possibly noisy measurements of it. A now-standard method for recovering the unknown signal is to solve a convex optimization problem that enforces some prior knowledge about its structure. This has proved efficient in many problems routinely encountered in imaging sciences, statistics, and machine learning. This chapter delivers a review of recent advances in the field where the regularization prior promotes solutions conforming to some notion of simplicity/low complexity. These priors encompass, as popular examples, sparsity and group sparsity (to capture the compressibility of natural signals and images), total variation and analysis sparsity (to promote piecewise regularity), and low rank (as a natural extension of sparsity to matrix-valued data). Our aim is to provide a unified treatment of all these regularizations under a single umbrella, namely the theory of partial smoothness. This framework is very general and accommodates all the low-complexity regularizers just mentioned, as well as many others. Partial smoothness turns out to be the canonical way to encode low-dimensional models that can be linear spaces or more general smooth manifolds. This review is intended to serve as a one-stop shop toward understanding the theoretical properties of the so-regularized solutions. It covers a large spectrum including: (i) recovery guarantees and stability to noise, both in terms of ℓ²-stability and model (manifold) identification; (ii) sensitivity analysis to perturbations of the parameters involved (in particular the observations), with applications to unbiased risk estimation; (iii) convergence properties of the forward-backward proximal splitting scheme, which is particularly well suited to solving the corresponding large-scale regularized optimization problem.
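
    As a concrete instance of the forward-backward proximal splitting scheme mentioned in (iii), the sketch below runs ISTA on the synthesis sparsity prior, i.e. min_x 0.5*||y - Ax||^2 + lam*||x||_1; the fixed step size and all names are illustrative.

        # Forward-backward (ISTA) sketch for l1-regularized least squares.
        import numpy as np

        def soft_threshold(z, t):
            # Proximal operator of t * ||.||_1
            return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

        def ista(A, y, lam, n_iter=500):
            x = np.zeros(A.shape[1])
            step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant
            for _ in range(n_iter):
                grad = A.T @ (A @ x - y)             # gradient of the smooth term
                x = soft_threshold(x - step * grad, step * lam)  # proximal step
            return x

    The same template covers the other priors by swapping the soft-threshold for the proximal operator of the chosen regularizer (e.g., group soft-thresholding for group sparsity, singular-value thresholding for low rank).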

    Automatic Nonlinear Filtering and Segmentation for Breast Ultrasound Images

    Breast cancer is one of the leading causes of cancer death among women worldwide. The proposed approach to segmenting breast lesions in ultrasound images comprises three steps. First, the image is preprocessed to remove speckle noise while preserving important image features; three methods are investigated: the Frost filter, Detail Preserving Anisotropic Diffusion, and the Probabilistic Patch-Based filter. Second, Normalized Cut or Quick Shift is used to provide an initial segmentation map for breast lesions. Third, a postprocessing step selects the correct region from a set of candidate regions. The approach is evaluated on a dataset of 20 B-mode ultrasound images acquired from the UDIAT Diagnostic Centre of Sabadell, Spain, with overall system performance measured against ground-truth images. The best performance is achieved by the following combinations: Frost filter with Quick Shift, Detail Preserving Anisotropic Diffusion with Normalized Cut, and Probabilistic Patch-Based filter with Normalized Cut.
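
    A rough Python sketch of this three-step pipeline follows, with clearly flagged stand-ins: scikit-image does not ship Frost, DPAD, or PPB filters (a median filter substitutes in step 1), and the candidate-selection rule in step 3 (darkest sufficiently large region) is an assumption, not the paper's postprocessing.

        # Pipeline sketch with stand-ins (median filter for Frost/DPAD/PPB,
        # an assumed darkest-large-region rule for the postprocessing step).
        import numpy as np
        from scipy.ndimage import median_filter
        from skimage.color import gray2rgb
        from skimage.measure import regionprops
        from skimage.segmentation import quickshift

        def segment_lesion(us_img):
            # Step 1: speckle reduction (stand-in for Frost / DPAD / PPB)
            smooth = median_filter(us_img.astype(float), size=5)
            # Step 2: initial segmentation map via Quick Shift (3 channels)
            rgb = gray2rgb(smooth / smooth.max())
            labels = quickshift(rgb, kernel_size=5, max_dist=20,
                                convert2lab=False)
            # Step 3: pick a candidate region -- assumed rule: the darkest
            # sufficiently large segment (lesions are typically hypoechoic)
            best, best_score = None, np.inf
            for r in regionprops(labels + 1, intensity_image=smooth):
                if r.area < 200:                   # size threshold (assumed)
                    continue
                if r.mean_intensity < best_score:
                    best, best_score = r, r.mean_intensity
            return best                            # selected candidate region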

    Tail state limited photocurrent collection of thick photoactive layers in organic solar cells

    We analyse organic solar cells with four different photoactive blends exhibiting differing dependencies of short-circuit current upon photoactive layer thickness. These blends and devices are analysed by transient optoelectronic measurements of carrier kinetics and densities, air photoemission spectroscopy of material energetics, Kelvin probe measurements of work function, Mott-Schottky analyses of apparent doping density, and device modelling. We conclude that, for the device series studied, the photocurrent loss with thick active layers is primarily associated with the accumulation of photogenerated charge carriers in intra-bandgap tail states. This charge accumulation screens the device's internal electric field, preventing efficient charge collection. Purification of one of the studied donor polymers is observed to reduce the tail state density and narrow their distribution, increasing the maximal photoactive thickness for efficient operation. Our work suggests that selecting organic photoactive layers with a narrow distribution of tail states is a key requirement for the fabrication of efficient, high-photocurrent, thick organic solar cells.
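
    For context on the Mott-Schottky analysis mentioned above: in the depletion approximation, 1/C^2 = 2 (V_bi - V) / (q * eps0 * eps_r * N * A^2), so the apparent doping density N follows from the slope of 1/C^2 versus bias. A small sketch, with assumed relative permittivity and device area (not values from the paper), is:

        # Mott-Schottky sketch: apparent doping density from C-V data via
        # 1/C^2 = 2 (V_bi - V) / (q * eps0 * eps_r * N * A^2).
        import numpy as np

        Q = 1.602e-19        # elementary charge [C]
        EPS0 = 8.854e-12     # vacuum permittivity [F/m]

        def apparent_doping(v, c, eps_r=3.5, area=4.5e-6):
            """v: bias [V]; c: capacitance [F]; eps_r and area [m^2] assumed."""
            slope, intercept = np.polyfit(v, 1.0 / c**2, 1)      # fit 1/C^2 vs V
            n = 2.0 / (Q * EPS0 * eps_r * area**2 * abs(slope))  # N in m^-3
            v_bi = -intercept / slope        # x-intercept of the fitted line
            return n, v_bi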

    Proximal Splitting Derivatives for Risk Estimation

    This paper develops a novel framework to compute a projected Generalized Stein Unbiased Risk Estimator (GSURE) for a wide class of sparsely regularized solutions of inverse problems. This class includes arbitrary convex data fidelities with both analysis and synthesis mixed L1-L2 norms. The GSURE requires the (weak) derivative of a solution with respect to the observations. However, as the solution is not available in analytical form but rather through iterative schemes such as proximal splitting, we propose to compute the GSURE iteratively by differentiating the sequence of iterates. This provides a sequence of differential mappings which, hopefully, converge to the desired derivative and allow the GSURE to be computed. We illustrate this approach on total variation regularization with Gaussian noise and on sparse regularization with Poisson noise, to automatically select the regularization parameter.
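
    A minimal sketch of the iterate-differentiation idea, restricted to plain SURE for ISTA on an l1-regularized least-squares problem with Gaussian noise (the paper's projected GSURE and mixed L1-L2 setting are more general): the Jacobian dx/dy is propagated through each iteration via the weak derivative of the soft-threshold, and its divergence enters the SURE formula. Dense Jacobians are used purely for readability on small problems.

        # Sketch: run ISTA and propagate the Jacobian dx/dy alongside the
        # iterates, then form the SURE divergence term. This illustrates the
        # differentiate-the-iterates idea, not the paper's exact estimator.
        import numpy as np

        def soft_threshold(z, t):
            return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

        def ista_with_sure(A, y, lam, sigma, n_iter=300):
            m, n = A.shape
            x = np.zeros(n)
            J = np.zeros((n, m))                   # dx/dy, chain rule per iterate
            tau = 1.0 / np.linalg.norm(A, 2) ** 2
            for _ in range(n_iter):
                z = x - tau * (A.T @ (A @ x - y))  # forward (gradient) step
                Jz = J - tau * (A.T @ (A @ J)) + tau * A.T
                x = soft_threshold(z, tau * lam)
                active = (np.abs(z) > tau * lam).astype(float)
                J = active[:, None] * Jz           # weak derivative of the prox
            resid = y - A @ x
            div = np.trace(A @ J)                  # divergence of y -> A x(y)
            sure = resid @ resid - m * sigma**2 + 2 * sigma**2 * div
            return x, sure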

    The exploitation of the non-local paradigm for SAR 3D reconstruction

    Over the last decades, several approaches to solving the Phase Unwrapping (PhU) problem using multi-channel Interferometric Synthetic Aperture Radar (InSAR) data have been developed. Many of the proposed approaches are based on statistical estimation theory, both classical and Bayesian. In particular, statistical approaches that exploit the whole complex multi-channel dataset have proven effective; these are based on the covariance matrix, which contains the parameters of interest. In this paper, the added value of the Non-Local (NL) paradigm within the multi-channel InSAR PhU framework is investigated. The impact of the NL technique is analysed using realistic simulated multi-channel data and X-band data.
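
    A sketch of the non-local ingredient, under stated assumptions: rather than boxcar multilooking, the per-pixel covariance matrix of the co-registered channel stack is estimated from pixels whose surrounding amplitude patches resemble the reference patch. The patch size, search window, and similarity bandwidth h are illustrative values, and the pixel (i, j) is assumed to lie away from the image borders.

        # Non-local covariance sketch for a stack of C co-registered complex
        # SAR channels: weight rank-1 outer products by amplitude-patch
        # similarity instead of averaging over a fixed boxcar window.
        import numpy as np

        def nl_covariance(stack, i, j, patch=3, search=10, h=0.5):
            """stack: (C, H, W) complex array; (i, j) an interior pixel."""
            C, H, W = stack.shape
            amp = np.abs(stack).mean(axis=0)       # amplitude used for matching
            p = patch // 2
            ref = amp[i - p:i + p + 1, j - p:j + p + 1]
            cov = np.zeros((C, C), dtype=complex)
            wsum = 0.0
            for u in range(max(p, i - search), min(H - p, i + search + 1)):
                for v in range(max(p, j - search), min(W - p, j + search + 1)):
                    cand = amp[u - p:u + p + 1, v - p:v + p + 1]
                    w = np.exp(-np.sum((cand - ref) ** 2) / h**2)  # similarity
                    z = stack[:, u, v]
                    cov += w * np.outer(z, z.conj())               # rank-1 term
                    wsum += w
            return cov / wsum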