
    A multiresolution approach to time warping achieved by a Bayesian prior-posterior transfer fitting strategy.

    The procedure known as warping aims to reduce phase variability in a sample of functional curve observations by applying a smooth bijection to the argument of each function. We propose a natural representation of warping functions in terms of a new type of elementary function, named `warping component functions', which are combined into the warping function by composition. A sequential Bayesian estimation strategy is introduced, which fits a series of models and transfers the posterior of each fit into the prior of the next. Model selection is based on a warping analogue of wavelet thresholding, combined with Bayesian inference.
    Keywords: Bayesian inference; Functional data analysis; Markov chain Monte Carlo sampling; Time warping; Warping components; Warping function.
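The composition idea in this abstract can be sketched in a few lines. The component shapes below are hypothetical stand-ins (the abstract does not specify the paper's actual warping component family); the sketch only illustrates how monotone bijections of [0,1] compose into a single warping function.

```python
import numpy as np

def warp_component(t, center, strength):
    """A smooth monotone bijection of [0,1] that fixes the endpoints and
    bends time around `center`. Hypothetical shape; small `strength`
    keeps the derivative positive, hence the map bijective."""
    return t + strength * t * (1 - t) * (center - t)

def compose(components):
    """Combine warping component functions by composition, as the
    abstract describes."""
    def warp(t):
        for f in components:
            t = f(t)
        return t
    return warp

t = np.linspace(0.0, 1.0, 101)
w = compose([lambda s: warp_component(s, 0.3, 0.5),
             lambda s: warp_component(s, 0.7, -0.4)])
y = w(t)   # a composed warping function: still a monotone bijection of [0,1]
```

With these parameter values each component's derivative stays positive on [0,1], so the composition remains a valid warping function.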

    Wavelet Estimators in Nonparametric Regression: A Comparative Simulation Study

    Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature on wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study over a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints on how these estimators should be used to analyse real data sets, a detailed practical step-by-step illustration of a wavelet denoising analysis of electrical consumption data is given. Matlab codes are provided so that all figures and tables in this paper can be reproduced.
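A minimal sketch of the wavelet thresholding idea the abstract surveys, using a hand-rolled Haar transform and the classical universal threshold (this is a generic illustration, not the paper's Matlab code):

```python
import numpy as np

def haar_dwt(x):
    """Full Haar decomposition of a length-2^J signal: returns the
    coarsest scaling coefficients plus the detail coefficients per level."""
    details = []
    while len(x) > 1:
        s = (x[0::2] + x[1::2]) / np.sqrt(2)   # scaling (average) part
        d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (difference) part
        details.append(d)
        x = s
    return x, details

def haar_idwt(approx, details):
    """Invert haar_dwt exactly."""
    x = approx
    for d in reversed(details):
        out = np.empty(2 * len(d))
        out[0::2] = (x + d) / np.sqrt(2)
        out[1::2] = (x - d) / np.sqrt(2)
        x = out
    return x

def denoise(y, sigma):
    """Soft-threshold every detail coefficient at the universal
    threshold sigma * sqrt(2 log n), then reconstruct."""
    lam = sigma * np.sqrt(2 * np.log(len(y)))
    approx, details = haar_dwt(y)
    details = [np.sign(d) * np.maximum(np.abs(d) - lam, 0.0) for d in details]
    return haar_idwt(approx, details)

rng = np.random.default_rng(0)
n = 256
t = np.arange(n) / n
f = np.where(t < 0.5, 1.0, -1.0)            # piecewise-constant test function
y = f + 0.3 * rng.standard_normal(n)        # signal plus Gaussian noise
fhat = denoise(y, sigma=0.3)
```

Piecewise-constant signals are the favourable case for Haar shrinkage; the simulation studies the abstract describes compare many such estimator/test-function pairings.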

    Empirical Bayes selection of wavelet thresholds

    This paper explores a class of empirical Bayes methods for level-dependent threshold selection in wavelet shrinkage. The prior considered for each wavelet coefficient is a mixture of an atom of probability at zero and a heavy-tailed density. The mixing weight, or sparsity parameter, for each level of the transform is chosen by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold. Details of the calculations needed for implementing the procedure are included. In practice, the estimates are quick to compute and there is software available. Simulations on the standard model functions show excellent performance, and applications to data drawn from various fields are used to explore the practical performance of the approach. By using a general result on the risk of the corresponding marginal maximum likelihood approach for a single sequence, overall bounds on the risk of the method are found subject to membership of the unknown function in one of a wide range of Besov classes, also covering the case of f of bounded variation. The rates obtained are optimal for any value of the parameter p in (0,\infty], simultaneously for a wide range of loss functions, each dominating the L_q norm of the \sigma-th derivative, with \sigma \ge 0 and 0 < q \le 2.
    Published in the Annals of Statistics, doi:10.1214/009053605000000345.
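A toy version of the marginal-maximum-likelihood step described above. For tractability this sketch swaps the paper's heavy-tailed slab for a Gaussian one and uses a posterior-odds rule rather than the posterior median, so it is an illustration of the mechanism, not the paper's procedure:

```python
import numpy as np

def gauss_pdf(x, var):
    return np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def fit_sparsity(x, tau2=4.0):
    """Choose the mixing weight w by marginal maximum likelihood on a
    grid. With unit noise variance and a N(0, tau2) slab, the marginal
    of each observation is w * N(0, 1 + tau2) + (1 - w) * N(0, 1)."""
    grid = np.linspace(0.01, 0.99, 99)
    ll = [np.sum(np.log(w * gauss_pdf(x, 1 + tau2)
                        + (1 - w) * gauss_pdf(x, 1.0))) for w in grid]
    return grid[int(np.argmax(ll))]

def shrink(x, w, tau2=4.0):
    """Zero a coefficient when the posterior favours the atom at zero;
    otherwise return the slab's posterior mean (a linear shrinker)."""
    p_slab = w * gauss_pdf(x, 1 + tau2)
    p_atom = (1 - w) * gauss_pdf(x, 1.0)
    post = p_slab / (p_slab + p_atom)
    rho = tau2 / (1 + tau2)
    return np.where(post > 0.5, rho * x, 0.0)

rng = np.random.default_rng(1)
theta = np.zeros(200)
theta[:20] = 5.0                        # sparse coefficient vector
x = theta + rng.standard_normal(200)    # noisy wavelet-level "coefficients"
w_hat = fit_sparsity(x)                 # data-driven sparsity weight
est = shrink(x, w_hat)
```

Because w is estimated per level, sparser levels get a smaller weight and hence a more aggressive (higher) effective threshold, which is the adaptivity the abstract emphasises.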

    Bayesian Estimation of Intensity Surfaces on the Sphere via Needlet Shrinkage and Selection

    This paper describes an approach to Bayesian modeling of spherical data sets. Our method is based upon a recent construction called the needlet, which is a particular form of spherical wavelet with many favorable statistical and computational properties. We perform shrinkage and selection of needlet coefficients, focusing on two main alternatives: empirical-Bayes thresholding, and Bayesian local shrinkage rules. We study the performance of the proposed methodology both on simulated data and on two real data sets: one involving the cosmic microwave background radiation, and one involving the reconstruction of a global news intensity surface inferred from published Reuters articles in August 1996. The fully Bayesian approach based on robust, sparse shrinkage priors seems to outperform the other alternatives.

    Bayesian Nonparametric Shrinkage Applied to Cepheid Star Oscillations

    Bayesian nonparametric regression with dependent wavelets has dual shrinkage properties: there is shrinkage through a dependent prior put on functional differences, and shrinkage through setting most of the wavelet coefficients to zero via Bayesian variable selection methods. The methodology can deal with unequally spaced data and is efficient because of the existence of fast moves in model space for the MCMC computation. The methodology is illustrated on the problem of modeling the oscillations of Cepheid variable stars; these are a class of pulsating variable stars with the useful property that their periods of variability are strongly correlated with their absolute luminosity. Once this relationship has been calibrated, knowledge of the period gives knowledge of the luminosity. This makes these stars useful as "standard candles" for estimating distances in the universe.
    Published in Statistical Science, doi:10.1214/11-STS384.

    Regression in random design and Bayesian warped wavelets estimators

    In this paper we deal with the regression problem in a random design setting. We investigate the asymptotic optimality, from a minimax point of view, of various Bayesian rules based on warped wavelets, and show that they nearly attain the optimal minimax rates of convergence over the Besov smoothness classes considered. Warped wavelets have been introduced recently; they are easy to compute and implement while being well adapted to the statistical problem at hand. We put particular emphasis on Bayesian rules based on small- and large-variance Gaussian priors, and discuss their simulation performance in comparison with a hard thresholding procedure.
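The core trick behind warped wavelets is to transform the random design points through (an estimate of) their distribution function so that they become equispaced, after which standard equispaced wavelet machinery applies. A rough sketch, using rank-based warping and dyadic-bin (Haar-style) averaging on the warped axis; the design density and test function are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 512
x = rng.beta(2, 5, n)                       # non-uniform random design
f = lambda t: np.sin(2 * np.pi * t)         # regression function (illustrative)
y = f(x) + 0.2 * rng.standard_normal(n)

# Warp the design through its empirical cdf: the ranks divided by n are
# exactly equispaced, so the warped points behave like a fixed uniform design.
u = (np.argsort(np.argsort(x)) + 0.5) / n

# On the warped axis a Haar-level-J projection is just dyadic-bin averaging.
J = 5                                        # 2^J = 32 dyadic bins
bins = np.minimum((u * 2**J).astype(int), 2**J - 1)
means = np.array([y[bins == b].mean() for b in range(2**J)])
fhat = means[bins]                           # estimate of f at each design point
```

With rank warping every bin holds exactly n / 2^J points, which is what makes the subsequent (here trivial, unthresholded) wavelet step well conditioned even though the x's themselves are far from uniform.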

    Minimax estimation with thresholding and its application to wavelet analysis

    Many statistical practices involve choosing between a full model and reduced models in which some coefficients are set to zero. The data are used both to select a model and to estimate its coefficients. Is it possible to do so and still come up with an estimator that is always better than the traditional estimator based on the full model? The James-Stein estimator is such an estimator, having a property called minimaxity. However, it considers only one reduced model, namely the origin; hence it reduces either no coefficient estimator to zero or every coefficient estimator to zero. In many applications, including wavelet analysis, it is more desirable to reduce to zero only the estimators smaller than a threshold, an operation called thresholding in this paper. Is it possible to construct such estimators that are also minimax? In this paper, we construct minimax estimators that perform thresholding. We apply our recommended estimator to wavelet analysis and show that it performs best among the well-known estimators aiming simultaneously at estimation and model selection. Some of our estimators are also shown to be asymptotically optimal.
    Published in the Annals of Statistics, doi:10.1214/009053604000000977.
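The contrast the abstract draws can be made concrete: James-Stein shrinks every coordinate by one common data-driven factor, whereas a thresholding rule zeroes small coordinates individually. A sketch with the classical positive-part James-Stein estimator and soft thresholding at the universal threshold (generic textbook estimators, not the paper's recommended minimax construction):

```python
import numpy as np

def james_stein(x):
    """Positive-part James-Stein: one common shrinkage factor for all
    coordinates; it never zeroes a single coordinate on its own."""
    factor = max(0.0, 1.0 - (len(x) - 2) / np.sum(x**2))
    return factor * x

def soft_threshold(x, lam):
    """Per-coordinate rule: coordinates below lam in magnitude become
    exactly zero, the rest are shrunk toward zero by lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

rng = np.random.default_rng(3)
theta = np.zeros(100)
theta[:5] = 6.0                            # sparse mean, unit-variance noise
x = theta + rng.standard_normal(100)

js = james_stein(x)
st = soft_threshold(x, np.sqrt(2 * np.log(100)))
```

On sparse means like this one, thresholding produces an exactly sparse estimate while James-Stein leaves every coordinate nonzero; the paper's question is how to get the sparsity of thresholding without giving up minimaxity.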