
    A non-adapted sparse approximation of PDEs with stochastic inputs

    We propose a method for the approximation of solutions of PDEs with stochastic coefficients based on direct, i.e., non-adapted, sampling of solutions. This sampling can be done using any legacy code for the deterministic problem as a black box. The method converges in probability (with probabilistic error bounds) as a consequence of sparsity and a concentration-of-measure phenomenon on the empirical correlation between samples. We show that the method is well suited for truly high-dimensional problems (with slow decay in the spectrum).
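    As a rough illustration of the non-adapted idea (the Legendre basis, problem sizes, and sparse coefficient vector below are illustrative assumptions for a toy demo, not the paper's setup), one can evaluate a quantity of interest at random inputs, as a black-box deterministic solver would, and recover its sparse expansion coefficients by ℓ_1 minimization:

```python
import numpy as np
from scipy.optimize import linprog

# Toy setup (assumed, not the paper's): u(xi) is sparse in a Legendre
# polynomial basis; "non-adapted" sampling just evaluates u at random
# inputs, treating the deterministic solver as a black box.
rng = np.random.default_rng(1)
n_basis, n_samples = 24, 16
xi = rng.uniform(-1.0, 1.0, n_samples)                 # random stochastic inputs
Phi = np.polynomial.legendre.legvander(xi, n_basis - 1)

c_true = np.zeros(n_basis)
c_true[[0, 2, 5]] = [1.0, -0.5, 0.25]                  # sparse "true" coefficients
u = Phi @ c_true                                       # samples of the solution

# l1 minimization as a linear program over stacked variables [c; t]:
#   min sum_i t_i   s.t.   -t <= c <= t,   Phi c = u.
I = np.eye(n_basis)
res = linprog(
    c=np.concatenate([np.zeros(n_basis), np.ones(n_basis)]),
    A_ub=np.block([[I, -I], [-I, -I]]),
    b_ub=np.zeros(2 * n_basis),
    A_eq=np.hstack([Phi, np.zeros((n_samples, n_basis))]),
    b_eq=u,
    bounds=[(None, None)] * n_basis + [(0, None)] * n_basis,
)
c_hat = res.x[:n_basis]
print(f"coefficient error: {np.max(np.abs(c_hat - c_true)):.2e}")
```

    With 16 samples and 24 basis functions the linear system is underdetermined, and it is the sparsity of c_true that makes recovery possible; a production code would use a dedicated sparse-recovery solver rather than a dense LP.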

    Analyzing Weighted ℓ_1 Minimization for Sparse Recovery With Nonuniform Sparse Models

    In this paper, we introduce a nonuniform sparsity model and analyze the performance of an optimized weighted ℓ_1 minimization over that sparsity model. In particular, we focus on a model where the entries of the unknown vector fall into two sets, with the entries of each set having a specific probability of being nonzero. We propose a weighted ℓ_1 minimization recovery algorithm and analyze its performance using a Grassmann angle approach. We compute explicitly the relationship between the system parameters (the weights, the number of measurements, the sizes of the two sets, and the probabilities of being nonzero) so that, when i.i.d. random Gaussian measurement matrices are used, weighted ℓ_1 minimization recovers a randomly selected signal drawn from the considered sparsity model with overwhelming probability as the problem dimension increases. This allows us to compute the optimal weights. We demonstrate through rigorous analysis and simulations that, when the support of the signal can be divided into two subclasses with unequal sparsity fractions, weighted ℓ_1 minimization substantially outperforms regular ℓ_1 minimization. We also generalize our results to signal vectors with an arbitrary number of sparsity subclasses.
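    A minimal sketch of the weighted ℓ_1 mechanism (the sizes, two-block support pattern, and fixed weights below are illustrative assumptions, not the paper's optimized values): entries in the block believed to be denser receive a smaller weight, and the weighted ℓ_1 problem is solved as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_recover(A, y, w):
    """Solve min_x sum_i w_i |x_i| s.t. A x = y as an LP over stacked
    variables [x; t], with |x_i| <= t_i encoded by x - t <= 0, -x - t <= 0."""
    m, n = A.shape
    I = np.eye(n)
    res = linprog(
        c=np.concatenate([np.zeros(n), w]),            # objective: w . t
        A_ub=np.block([[I, -I], [-I, -I]]),
        b_ub=np.zeros(2 * n),
        A_eq=np.hstack([A, np.zeros((m, n))]),
        b_eq=y,
        bounds=[(None, None)] * n + [(0, None)] * n,
    )
    return res.x[:n]

# Two-block nonuniform sparsity model (illustrative): the first block of 20
# entries is likelier to hold nonzeros, so it gets the smaller weight.
rng = np.random.default_rng(0)
n, m = 40, 30
A = rng.standard_normal((m, n)) / np.sqrt(m)           # i.i.d. Gaussian measurements
x_true = np.zeros(n)
x_true[[1, 3, 5, 7, 30]] = rng.standard_normal(5)      # 4 nonzeros in block 1, 1 in block 2
y = A @ x_true
w = np.concatenate([np.ones(20), 2.0 * np.ones(20)])   # heuristic weights, not optimized
x_hat = weighted_l1_recover(A, y, w)
print(f"max recovery error: {np.max(np.abs(x_hat - x_true)):.2e}")
```

    The paper's contribution is choosing the weights optimally via the Grassmann-angle analysis; the fixed weights here only illustrate how nonuniform penalties enter the optimization.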