
    Off-the-Grid Line Spectrum Denoising and Estimation with Multiple Measurement Vectors

    Compressed Sensing suggests that the required number of samples for reconstructing a signal can be greatly reduced if it is sparse in a known discrete basis, yet many real-world signals are sparse in a continuous dictionary. One example is the spectrally-sparse signal, which is composed of a small number of spectral atoms with arbitrary frequencies on the unit interval. In this paper, we study the problem of line spectrum denoising and estimation with an ensemble of spectrally-sparse signals composed of the same set of continuous-valued frequencies, given their partial and noisy observations. Two approaches are developed based on atomic norm minimization and structured covariance estimation, both of which can be solved efficiently via semidefinite programming. The first approach estimates and denoises the set of signals from their partial and noisy observations via atomic norm minimization, and recovers the frequencies by examining the dual polynomial of the convex program. We characterize the optimality condition of the proposed algorithm and derive the expected convergence rate for denoising, demonstrating the benefit of including multiple measurement vectors. The second approach recovers the population covariance matrix from the partially observed sample covariance matrix by exploiting its low-rank Toeplitz structure, without recovering the signal ensemble. A performance guarantee is derived for a finite number of measurement vectors. The frequencies can then be recovered from the estimated covariance matrix via conventional spectrum estimation methods such as MUSIC. Finally, numerical examples are provided to validate the favorable performance of the proposed algorithms, with comparisons against several existing approaches.
    Comment: 14 pages, 10 figures
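
    As a rough illustration of the first approach, the sketch below solves a simplified multiple-measurement-vector atomic norm denoising problem as a semidefinite program with cvxpy. It is not the authors' code: the synthetic data, the dimensions `n` and `L`, and the regularization weight `lam` are illustrative assumptions, and full (rather than partial) observations are used to keep the data-fit term simple.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, L, sigma = 16, 4, 0.1                     # signal length, number of vectors, noise level
freqs = np.array([0.12, 0.37, 0.71])         # shared frequencies on the unit interval
A = np.exp(2j * np.pi * np.outer(np.arange(n), freqs))        # spectral atoms
coeffs = rng.standard_normal((3, L)) + 1j * rng.standard_normal((3, L))
X_true = A @ coeffs
Y = X_true + sigma * (rng.standard_normal((n, L)) + 1j * rng.standard_normal((n, L)))

# SDP characterization of the atomic norm for the ensemble:
# ||X||_A = inf { (trace(T)/n + trace(W)) / 2 : [[T, X], [X^H, W]] >= 0, T Hermitian Toeplitz }
X = cp.Variable((n, L), complex=True)
Z = cp.Variable((n + L, n + L), hermitian=True)
T, W = Z[:n, :n], Z[n:, n:]

constraints = [Z >> 0, Z[:n, n:] == X]       # the lower-left block is X^H by Hermitian symmetry
constraints += [T[i, j] == T[i + 1, j + 1]   # force the upper-left block to be Toeplitz
                for i in range(n - 1) for j in range(n - 1)]

atomic_norm = 0.5 * (cp.real(cp.trace(T)) / n + cp.real(cp.trace(W)))
lam = sigma * np.sqrt(n * np.log(n))         # heuristic regularization weight (assumption)
prob = cp.Problem(cp.Minimize(lam * atomic_norm + 0.5 * cp.sum_squares(Y - X)), constraints)
prob.solve(solver=cp.SCS)

print("relative denoising error:", np.linalg.norm(X.value - X_true) / np.linalg.norm(X_true))
```

    The frequencies could then be read off the Toeplitz block `T.value` (e.g., via MUSIC or a Vandermonde decomposition) or from the dual polynomial of the program, as the abstract describes.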

    Group Lasso estimation of high-dimensional covariance matrices

    In this paper, we consider the Group Lasso estimator of the covariance matrix of a stochastic process corrupted by additive noise. We propose to estimate the covariance matrix in a high-dimensional setting under the assumption that the process has a sparse representation in a large dictionary of basis functions. Using a matrix regression model, we propose a new methodology for high-dimensional covariance matrix estimation based on empirical contrast regularization by a group Lasso penalty. With such a penalty, the method selects a sparse set of basis functions in the dictionary used to approximate the process, leading to an approximation of the covariance matrix in a low-dimensional space. Consistency of the estimator is studied in Frobenius and operator norms, and an application to sparse PCA is proposed.
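
    To make the matrix-regression idea concrete, the sketch below fits a symmetric coefficient matrix to an empirical covariance over a cosine dictionary, with a row-wise group Lasso penalty that switches whole basis functions on or off. This is a simplified illustration solved with cvxpy, not the paper's estimator: the dictionary `G`, the grouping by rows, and the penalty level `lam` are assumptions chosen for the example.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n_points, M = 30, 12                          # observation grid size, dictionary size
t = np.linspace(0.0, 1.0, n_points)
G = np.column_stack([np.cos(2 * np.pi * m * t) for m in range(M)])   # basis dictionary

# synthetic empirical covariance generated by only three active basis functions, plus noise
active = [1, 4, 7]
Psi_true = np.zeros((M, M))
Psi_true[np.ix_(active, active)] = np.eye(len(active))
S = G @ Psi_true @ G.T + 0.05 * rng.standard_normal((n_points, n_points))
S = (S + S.T) / 2                             # symmetrize the noisy empirical matrix

# matrix regression with a group Lasso penalty on the rows of the coefficient matrix
Psi = cp.Variable((M, M), symmetric=True)
group_penalty = sum(cp.norm(Psi[m, :], 2) for m in range(M))
lam = 0.5                                     # penalty level (illustrative)
prob = cp.Problem(cp.Minimize(cp.sum_squares(S - G @ Psi @ G.T) + lam * group_penalty))
prob.solve(solver=cp.SCS)

selected = np.where(np.linalg.norm(Psi.value, axis=1) > 1e-3)[0]
print("selected basis functions:", selected)  # ideally close to the active set [1, 4, 7]
Sigma_hat = G @ Psi.value @ G.T               # covariance approximation in the selected subspace
```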

    Info-Greedy sequential adaptive compressed sensing

    We present an information-theoretic framework for sequential adaptive compressed sensing, Info-Greedy Sensing, where measurements are chosen to maximize the extracted information conditioned on the previous measurements. We show that the widely used bisection approach is Info-Greedy for a family of k-sparse signals by connecting compressed sensing to the black-box complexity of sequential query algorithms, and present Info-Greedy algorithms for Gaussian and Gaussian Mixture Model (GMM) signals, as well as ways to design sparse Info-Greedy measurements. Numerical examples demonstrate the good performance of the proposed algorithms on simulated and real data: Info-Greedy Sensing shows significant improvement over random projection for signals with sparse and low-rank covariance matrices, and adaptivity brings robustness when there is a mismatch between the assumed and the true distributions.
    Comment: Preliminary results presented at the Allerton Conference 2014. To appear in the IEEE Journal of Selected Topics in Signal Processing
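
    For the Gaussian case, Info-Greedy Sensing amounts to measuring along the leading eigenvector of the current posterior covariance, since that direction yields the largest conditional information gain per measurement. The sketch below is a rough illustration rather than the paper's algorithm: it applies this rule with unit-norm measurements and a standard Gaussian posterior update, while the prior covariance, dimensions, and noise level are assumptions and the paper's power allocation across measurements is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, sigma = 20, 5, 0.05                     # dimension, number of measurements, noise std

# low-rank-plus-identity prior covariance (illustrative)
U = rng.standard_normal((n, 3))
Sigma0 = U @ U.T + 0.01 * np.eye(n)
x = rng.multivariate_normal(np.zeros(n), Sigma0)     # signal drawn from the prior

mu, Sigma = np.zeros(n), Sigma0.copy()
for _ in range(k):
    # Info-Greedy direction: leading eigenvector of the current posterior covariance
    eigvals, eigvecs = np.linalg.eigh(Sigma)
    a = eigvecs[:, -1]                               # unit-norm measurement vector
    y = a @ x + sigma * rng.standard_normal()        # noisy linear measurement
    # Gaussian conditioning: update posterior mean and covariance given y
    gain = Sigma @ a / (a @ Sigma @ a + sigma ** 2)
    mu = mu + gain * (y - a @ mu)
    Sigma = Sigma - np.outer(gain, a @ Sigma)

print("relative recovery error:", np.linalg.norm(mu - x) / np.linalg.norm(x))
```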