
    Poisson Noise Removal in Spherical Multichannel Images: Application to Fermi data

    The aim of this chapter is to present a multi-scale representation for spherical data with Poisson noise called the Multi-Scale Variance Stabilizing Transform on the Sphere (MS-VSTS) [14], combining the MS-VST [25] with various multi-scale transforms on the sphere (wavelets and curvelets) [22, 2, 3]. Section 1.2 presents some multi-scale transforms on the sphere. Section 1.3 introduces a new multi-scale representation for data with Poisson noise called MS-VSTS. Section 1.4 applies this representation to Poisson noise removal on Fermi data. Section 1.5 presents applications to missing-data interpolation and source extraction. Section 1.6 extends the method to multichannel data.
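    As a rough illustration of the variance-stabilization idea behind MS-VST (not the spherical wavelet/curvelet machinery itself), the sketch below applies the classical Anscombe transform, which maps Poisson counts to approximately unit-variance Gaussian values. The intensity value and array size are arbitrary placeholders.

        # Minimal sketch: Anscombe-style variance stabilization of Poisson counts.
        # A simple stand-in for the scale-coupled MS-VST, not the method itself.
        import numpy as np

        def anscombe(x):
            """Map Poisson counts to roughly unit-variance Gaussian values."""
            return 2.0 * np.sqrt(x + 3.0 / 8.0)

        def inverse_anscombe(z):
            """Naive algebraic inverse (practical schemes use a bias-corrected inverse)."""
            return (z / 2.0) ** 2 - 3.0 / 8.0

        rng = np.random.default_rng(0)
        lam = 20.0                                  # hypothetical Poisson intensity
        counts = rng.poisson(lam, size=100_000)
        stabilized = anscombe(counts)
        print(np.var(stabilized))                   # close to 1 for moderate intensities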

    Poisson noise removal in multivariate count data

    The Multi-scale Variance Stabilization Transform (MSVST) has recently been proposed for 2D Poisson data denoising [1]. In this work, we present an extension of the MSVST with the wavelet transform to multivariate data (each pixel is vector-valued), where the vector-field dimension may be wavelength, energy, or time. Such data can be viewed naively as 3D data whose third dimension is time, wavelength or energy (e.g., hyperspectral imaging). But this naive analysis using a 3D MSVST would be awkward, as the data dimensions have different physical meanings. A more appropriate approach is to use a wavelet transform in which the time or energy scale is not coupled to the spatial scale. We show that our multivalued extension of MSVST can be used advantageously to approximately Gaussianize and stabilize the variance of a sequence of independent Poisson random vectors. This approach is shown to be fast and very well adapted to extremely low-count situations. We use a hypothesis-testing framework in the wavelet domain to denoise the Gaussianized and stabilized coefficients, and then apply an iterative reconstruction algorithm to recover the estimated vector field of intensities underlying the Poisson data. Our approach is illustrated for the detection and characterization of astrophysical sources of high-energy gamma rays, using realistic simulated observations. We show that the multivariate MSVST permits efficient estimation across the time/energy dimension and immediate recovery of spectral properties.
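    The hypothesis-testing step can be caricatured as follows: once the coefficients have been Gaussianized and stabilized, those compatible with pure noise at a chosen false-positive rate are set to zero. The coefficient array, noise level and injected "source" below are hypothetical placeholders; in the actual MSVST the stabilization is coupled to each wavelet scale.

        # Minimal sketch of per-coefficient significance testing on stabilized data.
        import numpy as np
        from scipy.stats import norm

        def significant_mask(coeffs, sigma, alpha=1e-3):
            """Two-sided test; H0 says a coefficient is pure (Gaussianized) noise."""
            threshold = norm.ppf(1.0 - alpha / 2.0) * sigma
            return np.abs(coeffs) > threshold

        rng = np.random.default_rng(1)
        sigma = 1.0                                  # unit variance after stabilization
        coeffs = rng.normal(0.0, sigma, size=(64, 64))
        coeffs[10:14, 10:14] += 8.0                  # hypothetical "source"
        mask = significant_mask(coeffs, sigma)
        print(mask.sum(), "coefficients kept out of", coeffs.size)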

    Bayesian Methods for Metabolomics

    Metabolomics, the large-scale study of small molecules, enables the underlying biochemical activity and state of cells or tissues to be directly captured. Nuclear Magnetic Resonance (NMR) spectroscopy is one of the major data-capturing techniques for metabolomics, as it provides highly reproducible, quantitative information on a wide variety of metabolites. This work presents possible solutions to three problems, with the aim of supporting the development of better algorithms for NMR data analysis. After reviewing relevant concepts and literature, we first utilise observed NMR chemical shift titration data for a range of urinary metabolites and develop a theoretical model of chemical shift, using a Bayesian statistical framework and model selection procedures to estimate the number of protonation sites, a key parameter for modelling the relationship between chemical shift variation and pH that is usually unknown for uncatalogued metabolites. Secondly, with the aim of obtaining explicit concentration estimates for metabolites from NMR spectra, we discuss a Monte Carlo Co-ordinate Ascent Variational Inference (MC-CAVI) algorithm that combines Markov chain Monte Carlo (MCMC) methods with Co-ordinate Ascent VI (CAVI), demonstrate MC-CAVI's suitability for models with hard constraints, and compare MC-CAVI's performance with that of MCMC in an important complex model used in NMR spectroscopy data analysis. The third contribution seeks to improve metabolite identification, one of the biggest bottlenecks in metabolomics, which is severely hindered by resonance overlap in one-dimensional NMR spectroscopy. In particular, we present a novel Bayesian method for widely used two-dimensional (2D) 1H J-resolved (JRES) NMR spectroscopy, which has considerable potential to accurately identify and quantify metabolites within complex biological samples, by combining B-spline tight wavelet frames with theoretical templates. We then demonstrate the effectiveness of our approach via analyses of JRES datasets from serum and urine.
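    A crude, non-Bayesian analogue of the first contribution is sketched below: chemical shift is modelled as a baseline plus one Henderson-Hasselbalch-type sigmoid per protonation site, and candidate site counts are compared with BIC. The titration data, starting values and noise level are synthetic placeholders, not the Bayesian model-selection procedure of the thesis.

        # Minimal sketch: fit shift-vs-pH curves with 1 or 2 protonation sites and
        # compare the fits with BIC (a rough stand-in for Bayesian model selection).
        import numpy as np
        from scipy.optimize import curve_fit

        def shift_model(ph, delta0, *site_params):
            """Baseline shift plus one (amplitude, pKa) sigmoid per protonation site."""
            delta = np.full_like(ph, delta0, dtype=float)
            for amp, pka in zip(site_params[0::2], site_params[1::2]):
                delta += amp / (1.0 + 10.0 ** (pka - ph))
            return delta

        def bic(ph, obs, n_sites):
            starts = np.linspace(5.0, 9.0, n_sites)                  # spread the pKa guesses
            p0 = [float(obs.min())] + [v for pka in starts for v in (0.1, pka)]
            popt, _ = curve_fit(shift_model, ph, obs, p0=p0, maxfev=20_000)
            rss = np.sum((obs - shift_model(ph, *popt)) ** 2)
            k = len(p0) + 1                                          # parameters + noise variance
            return len(obs) * np.log(rss / len(obs)) + k * np.log(len(obs))

        rng = np.random.default_rng(2)
        ph = np.linspace(2.0, 12.0, 60)
        obs = shift_model(ph, 3.2, 0.15, 6.8) + rng.normal(0.0, 0.005, ph.size)
        print({n: round(bic(ph, obs, n), 1) for n in (1, 2)})        # lower BIC is preferred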

    Bayesian Deconvolution and Quantification of Metabolites from J-Resolved NMR Spectroscopy

    Two-dimensional (2D) nuclear magnetic resonance (NMR) methods have become increasingly popular in metabolomics, since they have considerable potential to accurately identify and quantify metabolites within complex biological samples. 2D 1H J-resolved (JRES) NMR spectroscopy is a widely used method that expands overlapping resonances into a second dimension. However, existing analytical processing methods do not fully exploit the information in the JRES spectrum and, more importantly, do not provide measures of uncertainty associated with the estimates of quantities of interest, such as metabolite concentration. Combining the data-generating mechanisms and the extensive prior knowledge available in online databases, we develop a Bayesian method to analyse 2D JRES data, which allows for automatic deconvolution, identification and quantification of metabolites. The model extends and improves previous work on one-dimensional NMR spectral data. Our approach is based on a combination of B-spline tight wavelet frames and theoretical templates, and thus enables the automatic incorporation of expert knowledge within the inferential framework. Posterior inference is performed through specially devised Markov chain Monte Carlo methods. We demonstrate the performance of our approach via analyses of datasets from serum and urine, showing the advantages of our proposed approach in terms of identification and quantification of metabolites.
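    A heavily simplified, non-Bayesian analogue of the template idea is sketched below: a 1D spectrum is modelled as a nonnegative combination of metabolite templates, and the concentrations are recovered with nonnegative least squares. The Lorentzian templates, peak positions and concentrations are synthetic placeholders; the actual method is a full Bayesian model over 2D JRES data with B-spline tight wavelet frames.

        # Minimal sketch: template-based quantification via nonnegative least squares.
        import numpy as np
        from scipy.optimize import nnls

        def lorentzian(x, centre, width):
            return width ** 2 / ((x - centre) ** 2 + width ** 2)

        x = np.linspace(0.0, 10.0, 2000)                      # hypothetical ppm axis
        templates = np.column_stack([
            lorentzian(x, 3.20, 0.02) + lorentzian(x, 3.90, 0.02),   # "metabolite A"
            lorentzian(x, 3.25, 0.02),                               # "metabolite B"
        ])
        true_conc = np.array([1.5, 0.7])
        rng = np.random.default_rng(3)
        spectrum = templates @ true_conc + rng.normal(0.0, 0.01, x.size)
        est_conc, _ = nnls(templates, spectrum)
        print(est_conc)                                       # close to [1.5, 0.7]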

    SONAR Images Denoising


    The SURE-LET approach to image denoising

    Denoising is an essential step prior to any higher-level image-processing tasks such as segmentation or object tracking, because the undesirable corruption by noise is inherent to any physical acquisition device. When the measurements are performed by photosensors, one usually distinguishes between two main regimes: in the first scenario, the measured intensities are sufficiently high and the noise is assumed to be signal-independent. In the second scenario, only a few photons are detected, which leads to a strong signal-dependent degradation. When the noise is considered signal-independent, it is often modeled as an additive independent (typically Gaussian) random variable, whereas, otherwise, the measurements are commonly assumed to follow independent Poisson laws, whose underlying intensities are the unknown noise-free measurements. We first consider the reduction of additive white Gaussian noise (AWGN). Contrary to most existing denoising algorithms, our approach does not require an explicit prior statistical modeling of the unknown data. Our driving principle is the minimization of a purely data-adaptive unbiased estimate of the mean-squared error (MSE) between the processed and the noise-free data. In the AWGN case, such an MSE estimate was first proposed by Stein and is known as "Stein's unbiased risk estimate" (SURE). We further develop the original SURE theory and propose a general methodology for fast and efficient multidimensional image denoising, which we call the SURE-LET approach. While SURE allows the quantitative monitoring of the denoising quality, the flexibility and the low computational complexity of our approach are ensured by a linear parameterization of the denoising process, expressed as a linear expansion of thresholds (LET). We propose several pointwise, multivariate, and multichannel thresholding functions applied to arbitrary (in particular, redundant) linear transformations of the input data, with a special focus on multiscale signal representations. We then transpose the SURE-LET approach to the estimation of Poisson intensities degraded by AWGN. The signal-dependent specificity of the Poisson statistics leads to the derivation of a new unbiased MSE estimate, which we call "Poisson's unbiased risk estimate" (PURE) and which requires more adaptive transform-domain thresholding rules. In a general PURE-LET framework, we first devise a fast interscale thresholding method restricted to the use of the (unnormalized) Haar wavelet transform. We then lift this restriction and show how the PURE-LET strategy can be used to design and optimize a wide class of nonlinear processing applied in an arbitrary (in particular, redundant) transform domain. We finally apply some of the proposed denoising algorithms to real multidimensional fluorescence microscopy images. Such an in vivo imaging modality often operates under low-illumination conditions and short exposure times; consequently, the random fluctuations of the measured fluorophore radiations are well described by a Poisson process degraded (or not) by AWGN. We experimentally validate this statistical measurement model and assess the performance of the PURE-LET algorithms in comparison with some state-of-the-art denoising methods. Our solution turns out to be very competitive both qualitatively and computationally, allowing for fast and efficient denoising of the huge volumes of data that are nowadays routinely produced in biomedical imaging.
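    The LET principle can be illustrated with a minimal pixel-domain sketch for AWGN: the denoiser is written as a linear expansion f(y) = a1*f1(y) + a2*f2(y), and the weights minimizing SURE are obtained from a small linear system. The two pointwise basis functions, the sparse test signal and the threshold parameter below are simplified placeholders; the actual SURE-LET and PURE-LET algorithms operate on multiscale transforms with interscale thresholding.

        # Minimal sketch of the SURE-LET principle for additive white Gaussian noise.
        import numpy as np

        def let_basis(y, T):
            """Two pointwise basis functions and their divergences (sum of derivatives)."""
            g = np.exp(-y ** 2 / (2.0 * T ** 2))
            f1, div1 = y, float(y.size)
            f2, div2 = y * g, float(np.sum(g * (1.0 - y ** 2 / T ** 2)))
            return [f1, f2], [div1, div2]

        def sure_let(y, sigma, T):
            fs, divs = let_basis(y, T)
            M = np.array([[np.dot(fi, fj) for fj in fs] for fi in fs])
            c = np.array([np.dot(fi, y) - sigma ** 2 * d for fi, d in zip(fs, divs)])
            a = np.linalg.solve(M, c)                   # SURE-optimal LET weights
            return sum(ak * fk for ak, fk in zip(a, fs))

        rng = np.random.default_rng(4)
        x = np.zeros(4096)
        x[::64] = 5.0                                   # sparse "clean" signal
        sigma = 1.0
        y = x + rng.normal(0.0, sigma, x.size)
        xhat = sure_let(y, sigma, T=2.0 * sigma)
        print(np.mean((y - x) ** 2), np.mean((xhat - x) ** 2))   # MSE before / after

    Because the estimator is linear in the weights, minimizing SURE reduces to solving a small linear system, which is what keeps the approach computationally light.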

    Multiresolution image models and estimation techniques


    Sparsity constraints for hyperspectral data analysis: linear mixture model and beyond

    The recent development of multi-channel sensors has motivated interest in devising new methods for the coherent processing of multivariate data. Extensive work has already been dedicated to multivariate data processing, ranging from blind source separation (BSS) to multi/hyperspectral data restoration. Previous work has emphasized the fundamental role played by sparsity and morphological diversity in enhancing multichannel signal processing. GMCA is a recent algorithm for multichannel data analysis that has been used successfully in a variety of applications, including multichannel sparse decomposition, blind source separation, color image restoration and inpainting. Inspired by GMCA, a recently introduced algorithm coined HypGMCA is described for BSS applications in hyperspectral data processing. It assumes that the collected data are a linear instantaneous mixture of components exhibiting sparse spectral signatures as well as sparse spatial morphologies, each in specified dictionaries of spectral and spatial waveforms. We report on numerical experiments with synthetic data and an application to real observations that demonstrate the validity of the proposed method.
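    The alternating scheme underlying GMCA-style sparse BSS can be caricatured as follows: model the data as X = A S + N and alternate a sparsity-promoting (thresholded least-squares) update of the sources S with a least-squares update of the mixing matrix A under a decreasing threshold. The sketch below is illustrative only (identity dictionary, synthetic sizes and thresholds), not the actual GMCA/HypGMCA implementation.

        # Minimal sketch of a GMCA-style alternating scheme for sparse BSS.
        import numpy as np

        def soft(u, t):
            return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

        def sparse_bss(X, n_sources, n_iter=50):
            rng = np.random.default_rng(5)
            A = rng.normal(size=(X.shape[0], n_sources))
            for it in range(n_iter):
                S = np.linalg.pinv(A) @ X                        # least-squares sources
                mad = np.median(np.abs(S - np.median(S, axis=1, keepdims=True)),
                                axis=1, keepdims=True)
                k = 5.0 - 2.0 * it / max(n_iter - 1, 1)          # decreasing k-MAD threshold
                S = soft(S, k * 1.4826 * mad)                    # sparsity-promoting step
                A = X @ np.linalg.pinv(S)                        # least-squares mixing matrix
                A /= np.maximum(np.linalg.norm(A, axis=0, keepdims=True), 1e-12)
            return A, S

        # Synthetic example: 2 sparse sources observed through 8 noisy channels.
        rng = np.random.default_rng(6)
        S_true = np.where(rng.random((2, 1000)) < 0.05, rng.normal(0.0, 5.0, (2, 1000)), 0.0)
        A_true = rng.normal(size=(8, 2))
        X = A_true @ S_true + rng.normal(0.0, 0.1, (8, 1000))
        A_est, S_est = sparse_bss(X, n_sources=2)
        print(A_est.shape, S_est.shape)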