
    Astronomical Data Analysis and Sparsity: from Wavelets to Compressed Sensing

    Wavelets have been used extensively for several years now in astronomy for many purposes, ranging from data filtering and deconvolution to star and galaxy detection or cosmic ray removal. More recent sparse representations such as ridgelets or curvelets have also been proposed for the detection of anisotropic features such as cosmic strings in the cosmic microwave background. In this paper we review a range of methods based on sparsity that have been proposed for astronomical data analysis. We also discuss the impact of Compressed Sensing, the new sampling theory, on astronomy: for collecting the data, transferring them to Earth, and reconstructing an image from incomplete measurements. Comment: Submitted. Full paper with figures available at http://jstarck.free.fr/IEEE09_SparseAstro.pd
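    To make the compressed-sensing idea concrete, here is a minimal, self-contained sketch (not the pipeline of the paper): a sparse signal is recovered from far fewer random measurements than its length via iterative soft-thresholding (ISTA). The sensing matrix, sparsity level and regularization weight are illustrative assumptions.

```python
# A minimal sketch (not the authors' pipeline): recover a sparse signal from
# incomplete random measurements with iterative soft-thresholding (ISTA).
# All sizes and parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 80, 8                            # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)    # random sensing matrix
y = A @ x_true                                  # incomplete measurements

def ista(A, y, lam=0.02, n_iter=500):
    """Solve min_x 0.5*||Ax - y||^2 + lam*||x||_1 by proximal gradient descent."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)                   # gradient of the data-fidelity term
        x = x - g / L
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    return x

x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```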

    Particle detection and tracking in fluorescence time-lapse imaging: a contrario approach

    This paper proposes a probabilistic approach to the detection and tracking of particles in fluorescence time-lapse imaging. In the presence of very noisy, poor-quality data, particles and trajectories can be characterized by an a contrario model, which estimates the probability of observing the structures of interest in random data. This approach, first introduced in the modeling of human visual perception and then successfully applied to many image processing tasks, leads to algorithms that require neither a prior learning stage nor tedious parameter tuning, and that are very robust to noise. Comparative evaluations against a well-established baseline show that the proposed approach outperforms the state of the art. Comment: Published in Journal of Machine Vision and Application
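    As a hedged illustration of the a contrario principle described above (not the detector of the paper), the sketch below declares a pixel-level detection meaningful when its expected number of occurrences in pure noise (the number of false alarms, NFA) falls below one; the noise model, test count and threshold are assumptions.

```python
# A minimal sketch of the a contrario idea (not the paper's detector):
# an event is epsilon-meaningful if its expected number of occurrences in
# random data (NFA) is below epsilon = 1. Image and noise model are assumed.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, (64, 64))       # background: white Gaussian noise
img[20, 30] += 6.0                          # one injected "particle"

n_tests = img.size                          # one test per pixel
# Probability, under the noise-only model, of seeing a value >= the observed one.
p_values = norm.sf(img, loc=0.0, scale=1.0)
nfa = n_tests * p_values                    # expected false alarms in random data

detections = np.argwhere(nfa < 1.0)         # 1-meaningful events
# Includes [20, 30]; by construction, at most ~1 false alarm is expected.
print(detections)
```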

    Inferring Latent States and Refining Force Estimates via Hierarchical Dirichlet Process Modeling in Single Particle Tracking Experiments

    Optical microscopy provides rich spatio-temporal information characterizing in vivo molecular motion. However, effective forces and other parameters used to summarize molecular motion change over time in live cells due to latent state changes, e.g., changes induced by dynamic micro-environments, photobleaching, and other heterogeneity inherent in biological processes. This study focuses on techniques for analyzing Single Particle Tracking (SPT) data experiencing abrupt state changes. We demonstrate the approach on GFP-tagged chromatids undergoing metaphase in yeast cells and probe the effective forces resulting from dynamic interactions that reflect the sum of a number of physical phenomena. State changes are induced by factors such as microtubule dynamics exerting force through the centromere, thermal polymer fluctuations, etc. Simulations are used to demonstrate the relevance of the approach in more general SPT data analyses. Refined force estimates are obtained by adopting and modifying a nonparametric Bayesian modeling technique, the Hierarchical Dirichlet Process Switching Linear Dynamical System (HDP-SLDS), for SPT applications. The HDP-SLDS method shows promise in systematically identifying dynamical regime changes induced by unobserved state changes when the number of underlying states is unknown in advance (a common problem in SPT applications). We expand on the relevance of the HDP-SLDS approach, review the relevant background of Hierarchical Dirichlet Processes, show how to map discrete-time HDP-SLDS models to classic SPT models, and discuss limitations of the approach. In addition, we demonstrate new computational techniques for tuning hyperparameters and for checking the statistical consistency of model assumptions directly against individual experimental trajectories; the techniques circumvent the need for "ground-truth" and subjective information. Comment: 25 pages, 6 figures. Differs only typographically from the PLoS One publication, available freely as an open-access article at http://journals.plos.org/plosone/article?id=10.1371/journal.pone.013763
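    The mapping between discrete-time linear dynamics and classic SPT force models that the abstract alludes to can be illustrated with a toy example. The sketch below is an assumption-laden stand-in, not the HDP-SLDS machinery: it simulates a 1-D trajectory whose confinement strength switches once (at a known time, for simplicity) and recovers the per-segment effective spring constant from a fitted AR(1) coefficient.

```python
# A minimal sketch (illustration only, not the HDP-SLDS method): simulate a
# 1-D tracked particle whose effective confinement switches once, then recover
# the per-segment AR(1) coefficient, which maps to an effective spring
# constant kappa via a = exp(-kappa * dt) in a classic SPT confinement model.
import numpy as np

rng = np.random.default_rng(2)
dt, T, switch = 0.01, 2000, 1000
kappas = [2.0, 20.0]                        # weak then strong confinement (assumed)
sigma = 0.05                                # per-step noise amplitude (assumed)

x = np.zeros(T)
for t in range(1, T):
    kappa = kappas[0] if t < switch else kappas[1]
    a = np.exp(-kappa * dt)
    x[t] = a * x[t - 1] + sigma * rng.standard_normal()

def fit_ar1(segment):
    """Least-squares estimate of a in x_{t+1} = a * x_t + noise."""
    x0, x1 = segment[:-1], segment[1:]
    return float(x0 @ x1 / (x0 @ x0))

for name, seg in [("state 1", x[:switch]), ("state 2", x[switch:])]:
    a_hat = fit_ar1(seg)
    print(name, "estimated kappa:", -np.log(a_hat) / dt)
```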

    High-ISO long-exposure image denoising based on quantitative blob characterization

    Blob detection and image denoising are fundamental, sometimes related tasks in computer vision. In this paper, we present a computational method to quantitatively measure blob characteristics using normalized unilateral second-order Gaussian kernels. This method suppresses non-blob structures while yielding a quantitative measurement of the position, prominence and scale of blobs, which can facilitate the tasks of blob reconstruction and blob reduction. Subsequently, we propose a denoising scheme to address high-ISO long-exposure noise, which sometimes exhibits a blob-like spatial appearance, employing the blob reduction procedure as an inexpensive preprocessing step for conventional denoising methods. We apply the proposed denoising methods to real-world noisy images as well as standard images corrupted by real noise. The experimental results demonstrate the superiority of the proposed methods over state-of-the-art denoising methods
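    As a rough illustration of scale-aware blob characterization, the sketch below uses the standard scale-normalized Laplacian of Gaussian as a stand-in for the paper's unilateral second-order Gaussian kernels; it recovers the position, scale and response strength of a synthetic blob. Image size, candidate scales and noise level are assumptions.

```python
# A minimal sketch using the scale-normalized Laplacian of Gaussian (a common
# stand-in, not the paper's kernels) to measure blob position, scale and a
# prominence-like response strength on a synthetic image.
import numpy as np
from scipy.ndimage import gaussian_laplace

rng = np.random.default_rng(3)
img = rng.normal(0.0, 0.05, (128, 128))                  # weak background noise
yy, xx = np.mgrid[:128, :128]
img += np.exp(-((yy - 40) ** 2 + (xx - 90) ** 2) / (2 * 4.0 ** 2))  # one bright blob

best = None
for sigma in (2.0, 3.0, 4.0, 6.0):
    # Scale-normalized response; bright blobs give strong negative LoG values.
    resp = -(sigma ** 2) * gaussian_laplace(img, sigma=sigma)
    idx = np.unravel_index(np.argmax(resp), resp.shape)
    if best is None or resp[idx] > best[0]:
        best = (resp[idx], idx, sigma)

strength, (y, x), scale = best
print(f"blob at ({y}, {x}), scale sigma={scale}, response={strength:.2f}")
```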

    Multiscale Discriminant Saliency for Visual Attention

    Bottom-up saliency, an early stage of human visual attention, can be considered a binary classification problem between center and surround classes. The discriminant power of features for this classification is measured as the mutual information between the features and the two class distributions. The estimated discrepancy between the two feature classes depends strongly on the scale levels considered; multi-scale structure and discriminant power are therefore integrated by employing discrete wavelet features and a Hidden Markov Tree (HMT). From the wavelet coefficients and HMT parameters, quad-tree-like label structures are constructed and used to compute the maximum a posteriori probability (MAP) of the hidden class variables at the corresponding dyadic sub-squares. A saliency value for each dyadic square at each scale level is then computed from the discriminant power principle and the MAP. Finally, the saliency maps across multiple scales are integrated into a final saliency map by an information maximization rule. Both standard quantitative tools such as NSS, LCC and AUC, and qualitative assessments are used to evaluate the proposed multiscale discriminant saliency method (MDIS) against the well-known information-based saliency method AIM on the Bruce database with eye-tracking data. Simulation results are presented and analyzed to verify the validity of MDIS as well as to point out its disadvantages for further research directions. Comment: 16 pages, ICCSA 2013 - BIOCA sessio
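    The center/surround discriminant idea can be sketched without the wavelet/HMT machinery. Below, the saliency of a region is taken as the mutual information between a scalar feature and the center-vs-surround label, estimated from histograms; for equally likely classes this equals the Jensen-Shannon divergence between the two feature distributions. The distributions, sample sizes and bin count are illustrative assumptions, not the MDIS implementation.

```python
# A minimal sketch of center/surround discriminant saliency (not the MDIS
# pipeline): saliency = mutual information I(feature; class) between a scalar
# feature and the center-vs-surround label, estimated from histograms.
import numpy as np

def discriminant_saliency(center, surround, bins=16):
    """I(feature; class) in bits, for two equally likely classes."""
    lo = min(center.min(), surround.min())
    hi = max(center.max(), surround.max())
    p_c, _ = np.histogram(center, bins=bins, range=(lo, hi))
    p_s, _ = np.histogram(surround, bins=bins, range=(lo, hi))
    p_c = (p_c + 1e-9) / (p_c.sum() + bins * 1e-9)   # smoothed, normalized
    p_s = (p_s + 1e-9) / (p_s.sum() + bins * 1e-9)
    p_mix = 0.5 * (p_c + p_s)
    kl_c = np.sum(p_c * np.log2(p_c / p_mix))
    kl_s = np.sum(p_s * np.log2(p_s / p_mix))
    return 0.5 * kl_c + 0.5 * kl_s                   # Jensen-Shannon form

rng = np.random.default_rng(4)
surround = rng.normal(0.0, 1.0, 4000)
salient_center = rng.normal(3.0, 1.0, 500)   # feature distribution differs
bland_center = rng.normal(0.0, 1.0, 500)     # feature distribution matches
print(discriminant_saliency(salient_center, surround))  # high saliency
print(discriminant_saliency(bland_center, surround))    # near zero
```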

    Weak Lensing Mass Reconstruction using Wavelets

    This paper presents a new method for the reconstruction of weak lensing mass maps. It uses the multiscale entropy concept, which is based on wavelets, and the False Discovery Rate, which allows us to derive robust detection levels in wavelet space. We show that this new restoration approach outperforms several standard techniques currently used for weak shear mass reconstruction. This method can also be used to separate E and B modes in the shear field, and thus test for the presence of residual systematic effects. We concentrate on large blind cosmic shear surveys, and illustrate our results using simulated shear maps derived from N-Body Lambda-CDM simulations with added noise corresponding to both ground-based and space-based observations. Comment: Accepted manuscript with all figures can be downloaded at: http://jstarck.free.fr/aa_wlens05.pdf and software can be downloaded at http://jstarck.free.fr/mrlens.htm
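    One ingredient named in the abstract, False Discovery Rate control over noisy transform coefficients, can be sketched in a few lines (Benjamini-Hochberg applied to per-coefficient p-values under a unit-variance Gaussian noise model). The multiscale-entropy restoration itself is not reproduced here, and the coefficient vector, noise level and target FDR are assumptions.

```python
# A minimal sketch of FDR (Benjamini-Hochberg) thresholding of noisy
# coefficients, one ingredient mentioned in the abstract; not the paper's
# full multiscale-entropy reconstruction.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
coeffs = rng.normal(0.0, 1.0, 1000)          # pure-noise coefficients
coeffs[:20] += 5.0                           # a few true "signal" coefficients

alpha = 0.05                                 # target false discovery rate (assumed)
p = 2.0 * norm.sf(np.abs(coeffs))            # two-sided p-values under N(0, 1)
order = np.argsort(p)
m = p.size
passed = p[order] <= alpha * np.arange(1, m + 1) / m
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
keep = np.zeros(m, dtype=bool)
keep[order[:k]] = True                       # coefficients declared significant
print("detections:", keep.sum(), "of which true:", keep[:20].sum())
```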