19 research outputs found

    Bayesian Lower Bounds for Dense or Sparse (Outlier) Noise in the RMT Framework

    Full text link
    Robust estimation is an important and timely research subject. In this paper, we investigate performance lower bounds on the mean-square error (MSE) of any estimator for the Bayesian linear model corrupted by noise distributed according to an i.i.d. Student's t-distribution. This class of priors, parametrized by its degrees of freedom, is well suited to modeling either dense or sparse (outlier-prone) noise. Using the hierarchical Normal-Gamma representation of the Student's t-distribution, the Van Trees Bayesian Cramér-Rao bound (BCRB) on the amplitude parameters is derived. Furthermore, the random matrix theory (RMT) framework is assumed, i.e., the number of measurements and the number of unknown parameters grow jointly to infinity with an asymptotically finite ratio. Using powerful results from RMT, closed-form expressions of the BCRB are derived and studied. Finally, we propose a framework to fairly compare two models corrupted by noises with different degrees of freedom at a fixed common target signal-to-noise ratio (SNR). In particular, we focus on the comparison of the BCRBs associated with two models corrupted by a sparse, outlier-promoting noise and a dense (Gaussian) noise, respectively.
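
    As a minimal sketch of the Normal-Gamma hierarchy invoked above, the snippet below draws Student's t noise as Gaussian noise whose precision is Gamma-distributed; the function name and parameter values are illustrative choices of ours, not part of the paper.

```python
import numpy as np

def student_t_via_normal_gamma(nu, size, rng=None):
    """Draw i.i.d. Student's t samples through the Normal-Gamma hierarchy.

    If tau ~ Gamma(shape=nu/2, rate=nu/2) and x | tau ~ N(0, 1/tau),
    then x is marginally Student's t with nu degrees of freedom.
    Small nu gives heavy-tailed, outlier-prone (sparse) noise; large nu
    approaches the dense Gaussian case.
    """
    rng = np.random.default_rng() if rng is None else rng
    # NumPy's Gamma sampler uses a scale parameter, so rate nu/2 becomes scale 2/nu.
    tau = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=size)
    return rng.normal(loc=0.0, scale=1.0 / np.sqrt(tau), size=size)

noise_sparse = student_t_via_normal_gamma(nu=2.1, size=10_000)    # outlier-prone
noise_dense = student_t_via_normal_gamma(nu=100.0, size=10_000)   # near-Gaussian
```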

    Bayesian Estimation for Continuous-Time Sparse Stochastic Processes

    Full text link
    We consider continuous-time sparse stochastic processes from which we have only a finite number of noisy/noiseless samples. Our goal is to estimate the noiseless samples (denoising) and the signal in between (interpolation problem). By relying on tools from the theory of splines, we derive the joint a priori distribution of the samples and show how this probability density function can be factorized. The factorization enables us to tractably implement the maximum a posteriori (MAP) and minimum mean-square error (MMSE) criteria as two statistical approaches for estimating the unknowns. We compare the derived statistical methods with well-known techniques for the recovery of sparse signals, such as the ℓ1-norm and Log (ℓ1-ℓ0 relaxation) regularization methods. The simulation results show that, under certain conditions, the performance of the regularization techniques can be very close to that of the MMSE estimator. Comment: To appear in IEEE TS
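
    For context on the regularization baselines mentioned in the abstract, here is a minimal sketch of ℓ1-regularized denoising by soft thresholding (the MAP estimator under an i.i.d. Laplace prior with Gaussian noise); it is not the paper's spline-based MAP/MMSE implementation, and the names and values are illustrative.

```python
import numpy as np

def l1_denoise(y, lam):
    """Solve argmin_x 0.5 * ||y - x||^2 + lam * ||x||_1 coordinate-wise.

    The closed-form solution is soft thresholding, i.e. the MAP estimate
    under an i.i.d. Laplace prior and additive white Gaussian noise.
    """
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

y = np.array([0.05, -2.3, 0.4, 3.1, -0.02])   # noisy samples of a sparse signal
print(l1_denoise(y, lam=0.5))                  # small entries are set exactly to zero
```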

    On the Continuity of Characteristic Functionals and Sparse Stochastic Modeling

    Full text link
    The characteristic functional is the infinite-dimensional generalization of the Fourier transform for measures on function spaces. It characterizes the statistical law of the associated stochastic process in the same way as a characteristic function specifies the probability distribution of its corresponding random variable. Our goal in this work is to lay the foundations of the innovation model, a (possibly) non-Gaussian probabilistic model for sparse signals. This is achieved by using the characteristic functional to specify sparse stochastic processes that are defined as linear transformations of general continuous-domain white noises (also called innovation processes). We prove the existence of a broad class of sparse processes by using the Minlos-Bochner theorem. This requires a careful study of the regularity properties, especially the boundedness in Lp-spaces, of the characteristic functional of the innovations. We are especially interested in the functionals that are only defined for p<1, since they appear to be associated with the sparser kind of processes. Finally, we apply our main existence theorem to two specific subclasses of processes with specific invariance properties. Comment: 24 pages
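
    As a schematic reminder of the objects involved, written in the standard notation of the sparse-stochastic-process literature rather than as a statement of this paper's specific results, the innovation model defines a process s through Ls = w and transfers the characteristic functional of the white noise w to s:

$$\widehat{\mathscr{P}}_w(\varphi) = \exp\left(\int_{\mathbb{R}^d} f\big(\varphi(\boldsymbol{r})\big)\,\mathrm{d}\boldsymbol{r}\right), \qquad \widehat{\mathscr{P}}_s(\varphi) = \mathbb{E}\big[e^{\mathrm{i}\langle s,\varphi\rangle}\big] = \widehat{\mathscr{P}}_w\big(L^{-1*}\varphi\big),$$

    where f is the Lévy exponent of the innovation and L^{-1*} is the adjoint of a suitable left inverse of L.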

    Generating Sparse Stochastic Processes Using Matched Splines

    Full text link
    We provide an algorithm to generate trajectories of sparse stochastic processes that are solutions of linear ordinary differential equations driven by Lévy white noises. A recent paper showed that these processes are limits in law of generalized compound-Poisson processes. Based on this result, we derive an off-the-grid algorithm that generates arbitrarily close approximations of the target process. Our method relies on a B-spline representation of generalized compound-Poisson processes. We illustrate the validity of our approach numerically.
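
    To make the compound-Poisson picture concrete, the sketch below samples one trajectory of a plain compound-Poisson process on a grid (the simplest case, solving Dx = w for a compound-Poisson innovation w); it is illustrative only, not the paper's off-the-grid, B-spline-based algorithm, and all names are ours.

```python
import numpy as np

def compound_poisson_path(rate, T, amp_sampler, t_grid, rng=None):
    """Sample one compound-Poisson trajectory on [0, T], evaluated on t_grid.

    Jump locations follow a Poisson process of intensity `rate`; jump
    amplitudes are drawn by `amp_sampler`. The path is the running sum of
    jumps, i.e. the solution of D x = w for a compound-Poisson innovation w.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_jumps = rng.poisson(rate * T)
    jump_times = np.sort(rng.uniform(0.0, T, size=n_jumps))
    amplitudes = amp_sampler(n_jumps, rng)
    # x(t) is the sum of the amplitudes of all jumps occurring before t.
    return np.array([amplitudes[jump_times <= t].sum() for t in t_grid])

t_grid = np.linspace(0.0, 10.0, 1001)
path = compound_poisson_path(rate=2.0, T=10.0,
                             amp_sampler=lambda n, rng: rng.standard_normal(n),
                             t_grid=t_grid)
```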

    Wavelet Shrinkage With Consistent Cycle Spinning Generalizes Total Variation Denoising

    Full text link