
    The locally stationary dual-tree complex wavelet model

    We here harmonise two significant contributions to the field of wavelet analysis in the past two decades, namely the locally stationary wavelet process and the family of dual-tree complex wavelets. By combining these two components, we furnish a statistical model that simultaneously draws on the benefits of both constructions. On the one hand, our model borrows the debiased spectrum and auto-covariance estimator from the locally stationary wavelet model. On the other hand, the enhanced directional selectivity is obtained from the dual-tree complex wavelets over the regular lattice. The resulting model allows for the description and identification of wavelet fields with significantly more directional fidelity than was previously possible. The corresponding estimation theory is established for the new model, and some stationarity detection experiments illustrate its practicality.
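    The spectral estimator referred to above starts from a raw wavelet periodogram. The following is a minimal numpy sketch of that object, using plain real Haar filters for brevity; the paper's model uses dual-tree complex wavelets instead, so this illustrates only the shared starting point.

```python
import numpy as np

# Minimal sketch of the raw wavelet periodogram that locally stationary
# wavelet models are built on: squared coefficients of a non-decimated
# wavelet transform, indexed by scale j and location k.  Plain real Haar
# filters stand in for the paper's dual-tree complex wavelets.
def haar_detail(x, j):
    """Non-decimated Haar detail coefficients at dyadic scale 2**j."""
    m = 2 ** (j - 1)
    h = np.concatenate([np.ones(m), -np.ones(m)]) / np.sqrt(2 * m)
    return np.convolve(x, h, mode='same')

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)                     # a stationary test signal
I_raw = np.stack([haar_detail(x, j)**2 for j in range(1, 5)])
print(I_raw.shape)                                # (4, 1024)
```

    For a stationary input such as this white noise, the periodogram should show no systematic structure in location; departures from that are what the stationarity detection experiments look for.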

    Introducing the locally stationary dual-tree complex wavelet model

    This paper reconciles Kingsbury's dual-tree complex wavelets with Nason and Eckley's locally stationary model. We here establish that the dual-tree wavelets admit an invertible de-biasing matrix and that this matrix can be used to invert the covariance relation. We also show that the added directional selectivity of the proposed model adds utility to the standard two-dimensional locally stationary model. Non-stationarity detection on random fields is used as a motivating example. Experiments confirm that the dual-tree model can distinguish anisotropic non-stationarities significantly better than the current model.
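    The de-biasing step can be sketched generically: in locally stationary wavelet models the expected raw periodogram is a known linear mixture of the true spectrum across scales, so if the mixing matrix is invertible (the paper's result for dual-tree wavelets), the spectrum is recovered by a linear solve. The matrix below is an arbitrary invertible stand-in, not the dual-tree inner-product matrix of the paper.

```python
import numpy as np

# Generic sketch of periodogram de-biasing.  The raw periodogram mixes
# the true spectrum S across scales through a matrix A, E[I] = A @ S;
# invertibility of A is what makes an unbiased estimate possible.
J = 5
A = np.eye(J) + 0.3 * np.tri(J, k=-1)    # hypothetical correction matrix
S_true = np.array([4.0, 2.0, 1.0, 0.5, 0.25])
I_expected = A @ S_true                  # biased raw periodogram (in expectation)
S_hat = np.linalg.solve(A, I_expected)   # de-biased spectrum estimate
print(np.allclose(S_hat, S_true))        # True
```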

    Bayesian demosaicing using Gaussian scale mixture priors with local adaptivity in the dual tree complex wavelet packet transform domain

    In digital cameras and mobile phones, there is an ongoing trend to increase image resolution, decrease sensor size and use shorter exposure times. Because smaller sensors inherently lead to more noise and a worse spatial resolution, digital post-processing techniques are required to resolve many of the artifacts. Color filter arrays (CFAs), which use alternating patterns of color filters, are very popular for reasons of price and power consumption. However, color filter arrays require the use of a post-processing technique such as demosaicing to recover full resolution RGB images. Recently, there has been some interest in techniques that jointly perform the demosaicing and denoising. This has the advantage that the demosaicing and denoising can be performed optimally (e.g. in the MSE sense) for the considered noise model, while avoiding artifacts introduced when using demosaicing and denoising sequentially. In this paper, we will continue the line of research on wavelet-based demosaicing techniques. These approaches are computationally simple and very suited for combination with denoising. Therefore, we will derive Bayesian minimum mean squared error (MMSE) joint demosaicing and denoising rules in the complex wavelet packet domain, taking local adaptivity into account. As an image model, we will use Gaussian Scale Mixtures, thereby taking advantage of the directionality of the complex wavelets. Our results show that this technique is well capable of reconstructing fine details in the image, while removing all of the noise, at a relatively low computational cost. In particular, the complete reconstruction (including color correction, white balancing etc) of a 12 megapixel RAW image takes 3.5 sec on a recent mid-range GPU.
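    The locally adaptive MMSE shrinkage idea can be sketched in simplified form. This is not the paper's Gaussian scale mixture rule in the complex wavelet packet domain, just a plain local Wiener rule on a 1-D coefficient array, showing the core mechanism: estimate the local signal variance in a sliding window, then shrink each noisy coefficient accordingly.

```python
import numpy as np

# Hedged sketch of locally adaptive MMSE shrinkage: local Wiener gain
# signal_var / (signal_var + noise_var), with signal_var estimated from
# the local energy in a sliding window.
def local_wiener(y, noise_var, win=9):
    kernel = np.ones(win) / win
    local_energy = np.convolve(y * y, kernel, mode='same')
    signal_var = np.maximum(local_energy - noise_var, 0.0)
    return y * signal_var / (signal_var + noise_var)

rng = np.random.default_rng(3)
clean = np.zeros(256)
clean[100:120] = 5.0                              # sparse "wavelet" signal
noisy = clean + rng.standard_normal(256)          # unit noise variance
den = local_wiener(noisy, noise_var=1.0)
print(np.mean((den - clean)**2) < np.mean((noisy - clean)**2))  # True
```

    Smooth regions are shrunk almost to zero while strong coefficients pass nearly unchanged, which is why this family of rules preserves fine detail while suppressing noise.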

    Semi-local scaling exponent estimation with box-penalty constraints and total-variation regularisation

    We here establish and exploit the result that 2-D isotropic self-similar fields beget quasi-decorrelated wavelet coefficients and that the resulting localised log sample second moment statistic is asymptotically normal. This leads to the development of a semi-local scaling exponent estimation framework with optimally modified weights. Furthermore, recent interest in penalty methods for least squares problems and generalised Lasso for scaling exponent estimation inspires the simultaneous incorporation of both bounding-box constraints and total variation smoothing into an iteratively reweighted least-squares estimator framework. Numerical results on fractional Brownian fields with global and piecewise constant, semi-local Hurst parameters illustrate the benefits of the new estimators.
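    The classical global version of this estimator is a regression of log sample second moments on scale. A minimal 1-D Haar sketch, assuming ordinary Brownian motion (Hurst exponent H = 0.5) as the test signal rather than the paper's 2-D fields and without its penalties or constraints:

```python
import numpy as np

# Log-scale regression for the Hurst exponent: for fractional Brownian
# motion, log2 E[d_j^2] grows like (2H + 1) * j across dyadic scales j,
# so H is recovered from the slope of a least-squares line fit.
rng = np.random.default_rng(42)
x = np.cumsum(rng.standard_normal(2**16))    # Brownian motion: H = 0.5

def haar_detail(x, j):
    """Non-decimated Haar detail coefficients at dyadic scale 2**j."""
    m = 2 ** (j - 1)
    h = np.concatenate([np.ones(m), -np.ones(m)]) / np.sqrt(2 * m)
    return np.convolve(x, h, mode='valid')

scales = np.arange(3, 9)                     # mid scales: avoid finest/coarsest
log_mom = [np.log2(np.mean(haar_detail(x, j)**2)) for j in scales]
slope, _ = np.polyfit(scales, log_mom, 1)
H_hat = (slope - 1) / 2                      # close to 0.5 here
```

    The paper's contribution sits on top of this: localising the statistic, reweighting it optimally, and regularising the resulting per-location estimates with box constraints and total variation.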

    Multiresolution image models and estimation techniques


    SONAR Images Denoising


    Array CGH data modeling and smoothing in Stationary Wavelet Packet Transform domain

    Background: Array-based comparative genomic hybridization (array CGH) is a highly efficient technique, allowing the simultaneous measurement of genomic DNA copy number at hundreds or thousands of loci and the reliable detection of local one-copy-level variations. Characterization of these DNA copy number changes is important for both the basic understanding of cancer and its diagnosis. In order to develop effective methods to identify aberration regions from array CGH data, much recent research has focused on both smoothing-based and segmentation-based data processing. In this paper, we propose a stationary wavelet packet transform based approach to smooth array CGH data. Our purpose is to remove CGH noise across the whole frequency range while preserving the true signal, using a bivariate model. Results: In both synthetic and real CGH data, the Stationary Wavelet Packet Transform (SWPT) is the best wavelet transform for analysing the CGH signal across the whole frequency range. We also introduce a new bivariate shrinkage model which captures the relationship between noisy CGH coefficients at two adjacent scales of the SWPT. Before smoothing, symmetric extension is applied as a preprocessing step to preserve information at the borders. Conclusions: We have designed two smoothers, SWPT (using hard thresholding) and SWPT-Bi (using the new bivariate shrinkage estimator), to smooth the array CGH data. We demonstrate the effectiveness of our approach through theoretical and experimental exploration of a set of array CGH data, including both synthetic data and real data. The comparison results show that our method outperforms the previous approaches.
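    A parent-child bivariate shrinkage rule of the kind described can be sketched as follows. This is the widely used Sendur-Selesnick form, shown as an illustration of the parent-child mechanism; the paper's exact estimator may differ.

```python
import numpy as np

# Bivariate shrinkage relating a child coefficient y1 to its parent y2
# at the next coarser scale: shrink the pair jointly by thresholding the
# magnitude sqrt(y1^2 + y2^2).  sigma_n is the noise std, sigma the
# (local) signal std.
def bivariate_shrink(y1, y2, sigma_n, sigma):
    r = np.sqrt(y1**2 + y2**2)
    gain = np.maximum(r - np.sqrt(3.0) * sigma_n**2 / sigma, 0.0) \
           / np.maximum(r, 1e-12)
    return y1 * gain

y1 = np.array([0.1, 2.0, -3.0])   # child coefficients (noisy)
y2 = np.array([0.0, 1.5, -2.5])   # parent coefficients
out = bivariate_shrink(y1, y2, sigma_n=0.5, sigma=1.0)
```

    A small coefficient with a small parent is set to zero, while a large coefficient backed by a large parent is only mildly shrunk, which is exactly the cross-scale dependence the bivariate model exploits.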

    Lifting dual tree complex wavelets transform

    We describe the lifting dual-tree complex wavelet transform (LDTCWT), a lifting-based construction that produces complex coefficients by employing a dual tree of lifting wavelet filters to generate the real and imaginary parts. The lifting construction allows the transform to achieve approximate shift invariance and directionally selective filters while reducing computation time (properties lacking in the classical wavelet transform). We describe how to estimate the accuracy of this approximation and how to design appropriate filters to achieve it. These benefits can be exploited in applications such as denoising, segmentation, image fusion and compression. The results of a shrinkage-denoising example application show objective and subjective improvements over the dual-tree complex wavelet transform (DTCWT). Compared with the DTCWT, the new transform offers a trade-off between denoising performance, computational efficiency and memory requirements. We use the peak signal-to-noise ratio (PSNR) together with the structural similarity index measure (SSIM) and the SSIM map to assess denoised image quality.
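    The lifting mechanism itself is simple to illustrate. Below is the Haar transform written as split/predict/update lifting steps, the kind of factorisation the LDTCWT applies in each of its two trees (with different, more sophisticated filters); inverting the steps in reverse order gives exact reconstruction.

```python
import numpy as np

# Haar transform via lifting: split into even/odd samples, predict the
# odd samples from the even ones, then update the even samples so the
# coarse signal s preserves the running average.
def haar_lift(x):
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    d = odd - even            # predict step: detail coefficients
    s = even + d / 2          # update step: coarse coefficients
    return s, d

def haar_unlift(s, d):
    even = s - d / 2          # undo update
    odd = d + even            # undo predict
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])
s, d = haar_lift(x)
print(np.allclose(haar_unlift(s, d), x))   # True
```

    Because every lifting step is trivially invertible, perfect reconstruction holds by construction regardless of the predict and update filters chosen, which is what makes lifting attractive for redesigning transforms like the DTCWT.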

    Functional Regression

    Functional data analysis (FDA) involves the analysis of data whose ideal units of observation are functions defined on some continuous domain, and the observed data consist of a sample of functions taken from some population, sampled on a discrete grid. Ramsay and Silverman's 1997 textbook sparked the development of this field, which has accelerated in the past 10 years to become one of the fastest-growing areas of statistics, fueled by the growing number of applications yielding this type of data. One unique characteristic of FDA is the need to combine information both across and within functions, which Ramsay and Silverman called replication and regularization, respectively. This article will focus on functional regression, the area of FDA that has received the most attention in applications and methodological development. First will be an introduction to basis functions, key building blocks for regularization in functional regression methods, followed by an overview of functional regression methods, split into three types: [1] functional predictor regression (scalar-on-function), [2] functional response regression (function-on-scalar) and [3] function-on-function regression. For each, the role of replication and regularization will be discussed and the methodological development described in a roughly chronological manner, at times deviating from the historical timeline to group together similar methods. The primary focus is on modeling and methodology, highlighting the modeling structures that have been developed and the various regularization approaches employed. At the end is a brief discussion describing potential areas of future development in this field.
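    The first of the three types, scalar-on-function regression, fits the model y_i = ∫ x_i(t) β(t) dt + ε_i by expanding β(t) in a small basis, turning the functional problem into ordinary least squares. A minimal numpy sketch (a noiseless toy example with a monomial basis, chosen purely for simplicity; practical methods use splines or other regularised bases):

```python
import numpy as np

# Scalar-on-function regression: represent beta(t) = sum_k c_k * B_k(t),
# so that integral(x_i * beta) dt becomes a linear model in c, fit by
# least squares on the n x K design matrix Z.
rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 101)
dt = t[1] - t[0]

n, K = 60, 6
# Smooth functional predictors: random combinations of a few cosines.
X = rng.standard_normal((n, 4)) @ np.cos(np.outer(np.arange(4), np.pi * t))
beta = np.sin(np.pi * t)                     # true coefficient function
y = X @ beta * dt                            # noiseless functional linear model

B = np.vander(t, K, increasing=True)         # basis: 1, t, ..., t^5
Z = X @ B * dt                               # discretised integral design
c, *_ = np.linalg.lstsq(Z, y, rcond=None)    # basis coefficients of beta
beta_hat = B @ c                             # estimated coefficient function
```

    The basis expansion is doing the "regularization" role described above: β is forced to be smooth by living in a low-dimensional function space, while the rows of Z combine the "replication" across the n observed curves.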