    Blind Curvelet based Denoising of Seismic Surveys in Coherent and Incoherent Noise Environments

    The localized nature of curvelet functions, together with their frequency and dip characteristics, makes the curvelet transform an excellent choice for processing seismic data. In this work, a denoising method is proposed based on a combination of the curvelet transform and a whitening filter, along with a procedure for noise-variance estimation. The whitening filter is added to obtain the best performance of the curvelet transform under both coherent and incoherent correlated noise; furthermore, it simplifies noise estimation and makes it easy to apply the standard thresholding methodology without working directly in the curvelet domain. The proposed method is tested on pseudo-synthetic data, created by adding noise to a noise-free real data set from the Netherlands offshore F3 block, and on a field data set from east Texas, USA, containing ground-roll noise. Our experimental results show that the proposed algorithm achieves the best results under all types of noise, both incoherent (uncorrelated, random) and coherent.
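    The core step of such transform-domain denoising can be sketched as follows. Since no curvelet implementation ships with the standard scientific Python stack, this illustrative sketch substitutes a 2-D FFT for the curvelet transform; the threshold factor `k`, the median-absolute-deviation noise estimate, and the use of the high-frequency half of the spectrum for that estimate are all assumptions for illustration, not the paper's actual pipeline:

    ```python
    import numpy as np

    def mad_sigma(coeffs):
        # Rough robust noise estimate from transform coefficients
        # (median absolute deviation with Gaussian-consistent scaling).
        return np.median(np.abs(coeffs)) / 0.6745

    def transform_denoise(img, k=3.0):
        # Stand-in for the curvelet transform: plain 2-D FFT coefficients.
        C = np.fft.fft2(img)
        # Estimate the noise level from the high-frequency half of the
        # spectrum, then hard-threshold at k times that estimate.
        sigma = mad_sigma(C[C.shape[0] // 2:, :].ravel())
        C[np.abs(C) < k * sigma] = 0.0
        return np.real(np.fft.ifft2(C))

    rng = np.random.default_rng(0)
    x = np.arange(64)
    clean = np.outer(np.sin(2 * np.pi * 2 * x / 64),
                     np.cos(2 * np.pi * 3 * x / 64))
    noisy = clean + 0.5 * rng.standard_normal((64, 64))
    denoised = transform_denoise(noisy)
    ```

    The sparse signal coefficients survive the threshold while the broadband noise is suppressed; a real curvelet implementation would additionally exploit dip-dependent subbands, which is where the whitening filter's simplification of per-band noise statistics pays off.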

    A multiresolution framework for local similarity based image denoising

    In this paper, we present a generic framework for denoising images corrupted by additive white Gaussian noise, based on the idea of regional similarity. The proposed framework employs a similarity function using the distance between pixels in a multidimensional feature space, whereby multiple feature maps describing various local regional characteristics can be utilized, giving higher weight to pixels with similar regional characteristics. An extension of the proposed framework into a multiresolution setting using wavelets and scale space is presented. It is shown that the resulting multiresolution multilateral (MRM) filtering algorithm not only eliminates coarse-grained noise but can also faithfully reconstruct anisotropic features, particularly in the presence of high levels of noise.
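    A minimal sketch of similarity-weighted averaging in this spirit, reduced to a plain bilateral filter where the only "feature" is pixel intensity (the multilateral case would add further feature maps to the exponent). The window radius and kernel widths are illustrative assumptions, not the MRM algorithm itself:

    ```python
    import numpy as np

    def bilateral_denoise(img, radius=2, sigma_s=1.5, sigma_r=0.3):
        # Weighted local average: weights combine spatial distance and a
        # feature distance (here intensity difference), so pixels with
        # similar local characteristics contribute more.
        H, W = img.shape
        pad = np.pad(img, radius, mode="reflect")
        out = np.zeros_like(img)
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))
        for i in range(H):
            for j in range(W):
                patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                feat = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
                w = spatial * feat
                out[i, j] = np.sum(w * patch) / np.sum(w)
        return out

    rng = np.random.default_rng(1)
    clean = np.zeros((32, 32))
    clean[:, 16:] = 1.0                      # step edge to be preserved
    noisy = clean + 0.1 * rng.standard_normal(clean.shape)
    denoised = bilateral_denoise(noisy)
    ```

    Because the intensity gap across the edge is large relative to `sigma_r`, cross-edge weights are nearly zero, so noise is averaged away within each flat region while the edge stays sharp.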

    First order algorithms in variational image processing

    Variational methods in imaging have developed into a quite universal and flexible tool, allowing highly successful approaches to tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation. The overall structure of such approaches is of the form $\mathcal{D}(Ku) + \alpha \mathcal{R}(u) \rightarrow \min_u$, where the functional $\mathcal{D}$ is a data fidelity term, depending on some input data $f$ and measuring the deviation of $Ku$ from it, and $\mathcal{R}$ is a regularization functional. Moreover, $K$ is a (often linear) forward operator modeling the dependence of data on an underlying image, and $\alpha$ is a positive regularization parameter. While $\mathcal{D}$ is often smooth and (strictly) convex, current practice almost exclusively uses nonsmooth regularization functionals. The majority of successful techniques uses nonsmooth convex functionals such as the total variation and its generalizations, or $\ell_1$-norms of coefficients arising from scalar products with some frame system. The efficient solution of such variational problems in imaging demands appropriate algorithms. Taking into account the specific structure as a sum of two very different terms to be minimized, splitting algorithms are a quite canonical choice. Consequently, this field has revived the interest in techniques like operator splittings and augmented Lagrangians. Here we shall provide an overview of currently developed methods and recent results, as well as computational studies comparing different methods and illustrating their success in applications.
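    As a concrete instance of such a splitting scheme, the sketch below applies forward-backward splitting (ISTA) to the $\ell_1$-regularized case $\mathcal{D}(Ku) = \frac{1}{2}\|Ku - f\|^2$, $\mathcal{R}(u) = \|u\|_1$: a gradient step on the smooth data term followed by the proximal map of the nonsmooth regularizer, which for the $\ell_1$-norm is soft thresholding. The step size rule, iteration count, and toy problem are illustrative choices:

    ```python
    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(K, f, alpha, iters=500):
        # Forward-backward splitting for min 0.5||Ku - f||^2 + alpha||u||_1:
        # explicit gradient step on the smooth fidelity term, implicit
        # (proximal) step on the nonsmooth regularizer.
        step = 1.0 / np.linalg.norm(K, 2) ** 2   # 1 / Lipschitz constant
        u = np.zeros(K.shape[1])
        for _ in range(iters):
            grad = K.T @ (K @ u - f)
            u = soft_threshold(u - step * grad, step * alpha)
        return u

    rng = np.random.default_rng(2)
    K = rng.standard_normal((40, 100))           # underdetermined forward operator
    u_true = np.zeros(100)
    u_true[[5, 37, 80]] = [2.0, -1.5, 3.0]       # sparse ground truth
    f = K @ u_true + 0.01 * rng.standard_normal(40)
    u_hat = ista(K, f, alpha=0.1)
    ```

    The split matters because the sum $\mathcal{D} + \alpha\mathcal{R}$ has no cheap joint minimizer, while each term alone does: the fidelity term yields a plain gradient, and the regularizer a closed-form proximal map.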

    PEAR: PEriodic And fixed Rank separation for fast fMRI

    In functional MRI (fMRI), faster acquisition via undersampling of data can improve the spatial-temporal resolution trade-off and increase statistical robustness through increased degrees-of-freedom. High quality reconstruction of fMRI data from undersampled measurements requires proper modeling of the data. We present an fMRI reconstruction approach based on modeling the fMRI signal as a sum of periodic and fixed rank components, for improved reconstruction from undersampled measurements. We decompose the fMRI signal into a component which has a fixed rank and a component consisting of a sum of periodic signals which is sparse in the temporal Fourier domain. Data reconstruction is performed by solving a constrained problem that enforces a fixed, moderate rank on one of the components, and a limited number of temporal frequencies on the other. Our approach is coined PEAR - PEriodic And fixed Rank separation for fast fMRI. Experimental results include purely synthetic simulation, a simulation with real timecourses and retrospective undersampling of a real fMRI dataset. Evaluation was performed both quantitatively and visually versus ground truth, comparing PEAR to two additional recent methods for fMRI reconstruction from undersampled measurements. Results demonstrate PEAR's improvement in estimating the timecourses and activation maps versus the methods compared against, at acceleration ratios of R=8,16 (for simulated data) and R=6.66,10 (for real data). PEAR results in reconstruction with higher fidelity than when using a fixed-rank based model or a conventional Low-rank+Sparse algorithm. We have shown that splitting the functional information between the components leads to better modeling of fMRI, over state-of-the-art methods.
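    The two constraints can be sketched with alternating projections: truncated SVD for the fixed-rank component and per-voxel temporal-frequency truncation for the periodic one. This is a simplified illustration of the signal model on fully sampled toy data, not the PEAR solver (which reconstructs from undersampled k-space); the rank `r`, frequency budget `k`, and toy matrix are assumptions:

    ```python
    import numpy as np

    def rank_project(M, r):
        # Best rank-r approximation via truncated SVD.
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r]

    def freq_project(M, k):
        # Keep only the k largest temporal-frequency components per voxel.
        F = np.fft.fft(M, axis=1)
        thresh = np.sort(np.abs(F), axis=1)[:, -k][:, None]
        F[np.abs(F) < thresh] = 0.0
        return np.real(np.fft.ifft(F, axis=1))

    def separate(X, r=1, k=4, iters=50):
        # Alternate projections: L constrained to rank r, P to at most k
        # temporal frequencies, so that L + P approximates the data X.
        L = np.zeros_like(X)
        P = np.zeros_like(X)
        for _ in range(iters):
            L = rank_project(X - P, r)
            P = freq_project(X - L, k)
        return L, P

    # Toy voxels-by-time matrix: rank-1 background plus a periodic part.
    t = np.arange(128)
    background = np.outer(np.linspace(1, 2, 20), np.ones(128))
    periodic = np.outer(np.ones(20), 0.3 * np.sin(2 * np.pi * 8 * t / 128))
    X = background + periodic
    L, P = separate(X)
    ```

    Unlike the conventional Low-rank+Sparse split, the second component here is constrained in the temporal Fourier domain, which is what lets periodic physiological signals be absorbed cleanly.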

    Novel hybrid extraction systems for fetal heart rate variability monitoring based on non-invasive fetal electrocardiogram

    This study focuses on the design, implementation and subsequent verification of a new type of hybrid extraction system for noninvasive fetal electrocardiogram (NI-fECG) processing. The system designed combines the advantages of individual adaptive and non-adaptive algorithms. The pilot study reviews two innovative hybrid systems called ICA-ANFIS-WT and ICA-RLS-WT. These combine independent component analysis (ICA), the adaptive neuro-fuzzy inference system (ANFIS) algorithm or the recursive least squares (RLS) algorithm, and the wavelet transform (WT) algorithm. The study was conducted on clinical practice data (extended ADFECGDB database and Physionet Challenge 2013 database) from the perspective of non-invasive fetal heart rate variability monitoring, based on determination of the overall probability of correct detection (ACC), sensitivity (SE), positive predictive value (PPV) and the harmonic mean of SE and PPV (F1). System functionality was verified against a reference obtained invasively using a scalp electrode (ADFECGDB database), or against a reference obtained from annotations (Physionet Challenge 2013 database). The study showed that the ICA-RLS-WT hybrid system achieves better results than ICA-ANFIS-WT. In the experiment on the ADFECGDB database, the ICA-RLS-WT hybrid system reached ACC > 80 % on 9 recordings out of 12, while the ICA-ANFIS-WT hybrid system reached ACC > 80 % on only 6 recordings out of 12. In the experiment on the Physionet Challenge 2013 database, the ICA-RLS-WT hybrid system reached ACC > 80 % on 13 recordings out of 25, while the ICA-ANFIS-WT hybrid system reached ACC > 80 % on only 7 recordings out of 25. Both hybrid systems achieve demonstrably better results than the individual algorithms tested in previous studies.
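    The RLS stage of such a hybrid can be sketched as an adaptive noise canceller: the filter weights adapt so that a reference signal (the analogue of the maternal ECG) is subtracted from the composite recording, leaving the uncorrelated component (the analogue of the fetal ECG) as the residual. The filter order, forgetting factor, and toy sinusoidal signals below are illustrative assumptions, not the study's configuration:

    ```python
    import numpy as np

    def rls_cancel(d, x, order=4, lam=0.99, delta=100.0):
        # Recursive least squares: adapt weights w so that the filtered
        # reference x tracks the interference in d; the a priori error e
        # is the extracted (interference-free) residual.
        w = np.zeros(order)
        P = np.eye(order) * delta          # inverse correlation estimate
        e = np.zeros(len(d))
        for n in range(order, len(d)):
            u = x[n - order:n][::-1]       # most recent reference samples
            k = P @ u / (lam + u @ P @ u)  # gain vector
            e[n] = d[n] - w @ u            # a priori error = residual
            w = w + k * e[n]
            P = (P - np.outer(k, u @ P)) / lam
        return e

    # Toy analogue of fECG extraction: the recording d mixes a weak
    # "fetal" signal with strong "maternal" interference; x is the
    # maternal reference channel.
    t = np.arange(2000)
    maternal = np.sin(2 * np.pi * t / 50)
    fetal = 0.2 * np.sin(2 * np.pi * t / 23)
    d = fetal + 1.5 * maternal
    e = rls_cancel(d, maternal)
    ```

    After the initial convergence transient, the residual `e` follows the weak component: the filter can only cancel what is correlated with the reference, which is exactly why a maternal reference isolates the fetal contribution.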