45 research outputs found

    Wavelet Shrinkage: Unification of Basic Thresholding Functions and Thresholds

    This work addresses the unification of some basic functions and thresholds used in non-parametric estimation of signals by shrinkage in the wavelet domain. The Soft and Hard thresholding functions are presented as degenerate smooth sigmoid-based shrinkage functions. The shrinkage achieved by this new family of sigmoid-based functions is then shown to be equivalent to a regularisation of wavelet coefficients associated with a class of penalty functions. Some sigmoid-based penalty functions are calculated and their properties are discussed. The unification also covers the universal and minimax thresholds used to calibrate the standard Soft and Hard thresholding functions: these belong to a wider class of thresholds, called detection thresholds, which depend on two parameters describing the degree of sparsity of the wavelet representation of a signal. It is also shown that non-degenerate sigmoid shrinkage adjusted with the new detection thresholds performs as well as the best up-to-date parametric and computationally expensive method. This justifies the relevance of sigmoid shrinkage for noise reduction in large databases or large images.
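    As an illustration of the degenerate-limit idea described above, the following Python sketch contrasts soft and hard thresholding with a generic sigmoid-weighted shrinkage rule. The logistic weight and the slope parameter tau are illustrative assumptions, not the exact smooth sigmoid-based shrinkage family of the paper; they only show how a smooth rule approaches hard thresholding as its slope grows.

        import numpy as np

        def soft_threshold(x, lam):
            # Classical soft thresholding: shrink each coefficient toward zero by lam.
            return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

        def hard_threshold(x, lam):
            # Classical hard thresholding: keep coefficients above lam, zero the rest.
            return x * (np.abs(x) > lam)

        def sigmoid_shrink(x, lam, tau):
            # Illustrative sigmoid-weighted shrinkage: a smooth logistic weight
            # centred on lam attenuates small coefficients; as tau grows, the
            # weight tends to a step function and the rule tends to hard thresholding.
            weight = 1.0 / (1.0 + np.exp(-tau * (np.abs(x) - lam)))
            return x * weight

        x = np.linspace(-5.0, 5.0, 11)
        print(soft_threshold(x, 1.0))
        print(hard_threshold(x, 1.0))
        print(sigmoid_shrink(x, 1.0, tau=50.0))  # close to the hard-thresholding output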

    Algorithms based on sparsity hypotheses for robust estimation of the noise standard deviation in presence of signals with unknown distributions and concurrences

    In many applications, d-dimensional observations result from the random presence or absence of random signals in independent and additive white Gaussian noise. An estimate of the noise standard deviation can then be very useful to detect or to estimate these signals, especially when standard likelihood theory cannot apply because of too little prior knowledge about the signal probability distributions. Recent results and algorithms have emphasised the value of sparsity hypotheses for designing robust estimators of the noise standard deviation when signals have unknown distributions. As a continuation, the present paper introduces a new robust estimator for signals with probabilities of presence less than or equal to one half. In contrast to the standard MAD estimator, it applies whatever the value of d. This algorithm is applied to image denoising by wavelet shrinkage as well as to non-cooperative detection of radiocommunications. In both cases, the estimator proposed in the present paper outperforms the standard solutions used in such applications, which are based on the MAD estimator. The Matlab code corresponding to the proposed estimator is available at http://perso.telecom-bretagne.eu/pasto
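    The paper's new estimator itself is not reproduced here; as a point of reference, the sketch below shows the standard MAD-based noise estimate it is compared against, where the noise standard deviation is read off the finest-scale wavelet detail coefficients. Python with the PyWavelets package is assumed instead of the Matlab code cited above, and the wavelet and test signal are arbitrary illustrative choices.

        import numpy as np
        import pywt  # PyWavelets, assumed available

        def mad_sigma(signal, wavelet="db4"):
            # Standard MAD estimator of the noise level: for a sparse signal the
            # finest-scale detail coefficients are dominated by noise, so their
            # median absolute value, rescaled by 0.6745 (the third quartile of
            # the standard normal), estimates the noise standard deviation.
            _, detail = pywt.dwt(signal, wavelet)
            return np.median(np.abs(detail)) / 0.6745

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 2048)
        clean = np.where((t > 0.3) & (t < 0.4), 2.0, 0.0)  # sparse toy signal
        noisy = clean + 0.5 * rng.standard_normal(t.size)
        print(mad_sigma(noisy))  # should be close to the true value 0.5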

    Wavelet Estimators in Nonparametric Regression: A Comparative Simulation Study

    Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature on wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study on a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite-sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints on how these estimators should be used to analyse real data sets, a detailed, practical, step-by-step illustration of a wavelet denoising analysis of electricity consumption data is provided. Matlab codes are provided so that all figures and tables in this paper can be reproduced.
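    Since the review compares shrinkage and thresholding rules rather than fixing a single one, a minimal end-to-end example may help: the Python sketch below implements VisuShrink-style denoising (soft thresholding at the universal threshold sigma * sqrt(2 log n)), one of the classical estimators covered by such studies. PyWavelets is assumed, and the wavelet, decomposition level and test signal are illustrative choices rather than the paper's Matlab setup.

        import numpy as np
        import pywt  # PyWavelets, assumed available

        def visushrink_denoise(noisy, wavelet="sym8", level=5):
            # Decompose, soft-threshold the detail coefficients at the universal
            # threshold, and reconstruct; sigma is estimated by MAD at the finest scale.
            coeffs = pywt.wavedec(noisy, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            lam = sigma * np.sqrt(2.0 * np.log(noisy.size))
            denoised = [coeffs[0]] + [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(denoised, wavelet)[: noisy.size]

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 1024)
        clean = np.sin(4 * np.pi * t) + np.sign(t - 0.5)   # smooth part plus a jump
        noisy = clean + 0.3 * rng.standard_normal(t.size)
        estimate = visushrink_denoise(noisy)
        print(np.sqrt(np.mean((estimate - clean) ** 2)))   # RMSE of the estimate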

    Essays on the nonlinear and nonstochastic nature of stock market data

    The nature and structure of stock-market price dynamics is an area of ongoing and rigorous scientific debate. For almost three decades, most emphasis has been given to upholding the concepts of Market Efficiency and rational investment behaviour. Such an approach has favoured the development of numerous linear and nonlinear models, mainly of stochastic foundations. Advances in mathematics have shown that nonlinear deterministic processes, i.e. "chaos", can produce sequences that appear random to linear statistical techniques. Until recently, investment finance has been a science based on linearity and stochasticity, hence it is important that studies of Market Efficiency include investigations of chaotic determinism and power laws. As far as chaos is concerned, research results are rather mixed or inconclusive and fraught with controversy. This inconclusiveness is attributed to two things: the nature of stock-market time series, which are highly volatile and contaminated with a substantial amount of noise of largely unknown structure, and the lack of appropriate robust statistical testing procedures. In order to overcome such difficulties, this thesis shows empirically, and for the first time, how one can combine novel techniques from the recent chaotic and signal analysis literature within a univariate time series analysis framework. Three basic methodologies are investigated: recurrence analysis, surrogate data and wavelet transforms. Recurrence analysis is used to reveal qualitative and quantitative evidence of nonlinearity and nonstochasticity for a number of stock markets. It is then demonstrated how surrogate data can be simulated, under a statistical hypothesis testing framework, to provide similar evidence. Finally, it is shown how wavelet transforms can be applied to reveal various salient features of the market data and to provide a platform for nonparametric regression and denoising. The results indicate that, without invoking any parametric model-based assumptions, one can deduce that there is more to the data than linearity and stochastic randomness. Moreover, substantial evidence of recurrent patterns and aperiodicities is discovered, which can be attributed to chaotic dynamics. These results are therefore consistent with existing research indicating some types of nonlinear dependence in financial data. In conclusion, the value of this thesis lies in its contribution to the overall evidence on Market Efficiency and chaotic determinism in financial markets. The main implication is that the theory of equilibrium pricing in financial markets may need reconsideration in order to accommodate the structures revealed.
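    To make the surrogate-data step concrete, the sketch below generates a phase-randomised (Theiler-style) surrogate in Python: it preserves the amplitude spectrum, and hence the linear autocorrelation, of a series while destroying any nonlinear structure, so a nonlinear statistic computed on the data can be compared against its distribution over many such surrogates. This is a simplified illustration of the general technique, not the thesis's exact testing procedure.

        import numpy as np

        def phase_randomized_surrogate(x, rng):
            # Keep the Fourier amplitudes of the (demeaned) series and draw the
            # phases uniformly at random, then transform back; the surrogate has
            # the same linear correlation structure as x under the null hypothesis
            # of a linear stochastic process.
            spectrum = np.fft.rfft(x - np.mean(x))
            phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.size)
            phases[0] = 0.0  # keep the zero-frequency component real
            surrogate = np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=x.size)
            return surrogate + np.mean(x)

        rng = np.random.default_rng(2)
        returns = rng.standard_normal(512)  # stand-in for a series of log-returns
        print(phase_randomized_surrogate(returns, rng)[:5])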