
    Sparse recovery with partial support knowledge

    14th International Workshop, APPROX 2011, and 15th International Workshop, RANDOM 2011, Princeton, NJ, USA, August 17-19, 2011. Proceedings.
    The goal of sparse recovery is to recover the (approximately) best k-sparse approximation \hat{x} of an n-dimensional vector x from linear measurements Ax of x. We consider a variant of the problem which takes into account partial knowledge about the signal. In particular, we focus on the scenario where, after the measurements are taken, we are given a set S of size s that is supposed to contain most of the "large" coefficients of x. The goal is then to find \hat{x} such that

        \|x - \hat{x}\|_p \le C \cdot \min_{k\text{-sparse } x',\ \mathrm{supp}(x') \subseteq S} \|x - x'\|_q.

    We refer to this formulation as the sparse recovery with partial support knowledge problem (SRPSK). We show that SRPSK can be solved, up to an approximation factor of C = 1 + ε, using O((k/ε) log(s/k)) measurements, for p = q = 2. Moreover, this bound is tight as long as s = O(εn / log(n/ε)). This completely resolves the asymptotic measurement complexity of the problem, except for a very small range of the parameter s. To the best of our knowledge, this is the first variant of (1 + ε)-approximate sparse recovery for which the asymptotic measurement complexity has been determined.
    Funding: Space and Naval Warfare Systems Center San Diego (U.S.) (Contract N66001-11-C-4092); David & Lucile Packard Foundation (Fellowship); Center for Massive Data Algorithmics (MADALGO); National Science Foundation (U.S.) (Grant CCF-0728645); National Science Foundation (U.S.) (Grant CCF-1065125).
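
    As a rough illustration of how a decoder might exploit the candidate set S (a toy baseline only, not the measurement-optimal scheme of the paper; the function name, the Gaussian measurement model, and all parameter values below are assumptions), one can fit x by least squares on the columns of A indexed by S and then prune to the k largest coefficients:

        import numpy as np

        def recover_with_support(A, y, S, k):
            """Toy decoder for sparse recovery with partial support knowledge:
            least-squares fit restricted to the candidate set S, then top-k pruning."""
            S = np.asarray(sorted(S))
            coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)  # fit on S only
            x_hat = np.zeros(A.shape[1])
            x_hat[S] = coef
            keep = np.argsort(np.abs(x_hat))[-k:]               # keep the k largest entries
            pruned = np.zeros_like(x_hat)
            pruned[keep] = x_hat[keep]
            return pruned

        # Example with n = 1000, k = 10, s = 50 and m on the order of (k/ε) log(s/k)
        rng = np.random.default_rng(0)
        n, k, s, m = 1000, 10, 50, 150
        true_support = rng.choice(n, size=k, replace=False)
        x = np.zeros(n); x[true_support] = rng.standard_normal(k)
        S = set(true_support) | set(rng.choice(n, size=s - k, replace=False))
        A = rng.standard_normal((m, n)) / np.sqrt(m)
        print(np.linalg.norm(x - recover_with_support(A, A @ x, S, k)))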

    Coherence-based Partial Exact Recovery Condition for OMP/OLS

    We address the exact recovery of the support of a k-sparse vector with Orthogonal Matching Pursuit (OMP) and Orthogonal Least Squares (OLS) in a noiseless setting. We consider the scenario where OMP/OLS have selected good atoms during the first l iterations (l < k) and derive a new sufficient and worst-case necessary condition for their success in k steps. Our result is based on the coherence \mu of the dictionary and relaxes Tropp's well-known condition \mu < 1/(2k-1) to the case where OMP/OLS have partial knowledge of the support.
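
    For concreteness, here is a minimal numpy sketch of OMP warm-started with the l atoms assumed to have been selected correctly, which is the setting the condition above addresses. The function name and the unit-norm-columns convention are illustrative assumptions, not the authors' code:

        import numpy as np

        def omp_partial(D, y, k, initial_support=()):
            """OMP run out to k atoms, warm-started from l known-good atoms.
            D: dictionary with unit-norm columns; y: observed signal."""
            support = list(initial_support)
            residual = y.copy()
            if support:  # orthogonalize the residual against the warm-start atoms
                coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coef
            while len(support) < k:
                corr = np.abs(D.T @ residual)
                corr[support] = -np.inf           # never re-select an atom
                support.append(int(np.argmax(corr)))
                coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coef
            return support, coef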

    On Modified l_1-Minimization Problems in Compressed Sensing

    Sparse signal modeling has received much attention recently because of its applications in medical imaging, group testing and radar technology, among others. Compressed sensing, a recently coined term, has shown us, both in theory and practice, that various signals of interest which are sparse or approximately sparse can be efficiently recovered using far fewer samples than suggested by the Shannon sampling theorem. Sparsity is the only prior information about an unknown signal assumed in traditional compressed sensing techniques. But in many applications, other kinds of prior information are also available, such as partial knowledge of the support, tree structure of the signal, and clustering of large coefficients around a small set of coefficients. In this thesis, we consider compressed sensing problems with prior information on the support of the signal, together with sparsity. We modify the regular l_1-minimization problems considered in compressed sensing using this extra information, and call the results modified l_1-minimization problems. We show that partial knowledge of the support helps us to weaken sufficient conditions for the recovery of sparse signals using modified l_1-minimization problems. In the case of deterministic compressed sensing, we show that a sharp condition for sparse recovery can be improved using modified l_1-minimization problems. We also derive an algebraic necessary and sufficient condition for the modified basis pursuit problem, and use an open-source algorithm known as the l_1-homotopy algorithm to perform some numerical experiments and compare the performance of modified Basis Pursuit Denoising with regular Basis Pursuit Denoising.
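
    A minimal sketch of the kind of modified l_1-minimization the thesis studies (modified basis pursuit): the l_1 penalty is dropped on the known part T of the support, so only coefficients outside T are penalized. The use of cvxpy and the names below are assumptions for illustration, not the thesis code:

        import numpy as np
        import cvxpy as cp

        def modified_basis_pursuit(A, y, T):
            """min ||x_{T^c}||_1  subject to  A x = y.
            T: indices known (or believed) to lie in the support.
            With T empty this reduces to ordinary basis pursuit."""
            n = A.shape[1]
            penalized = np.setdiff1d(np.arange(n), np.asarray(list(T), dtype=int))
            x = cp.Variable(n)
            prob = cp.Problem(cp.Minimize(cp.norm1(x[penalized])), [A @ x == y])
            prob.solve()
            return x.value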

    Reliable recovery of hierarchically sparse signals for Gaussian and Kronecker product measurements

    We propose and analyze a solution to the problem of recovering a block-sparse signal with sparse blocks from linear measurements. Such problems naturally emerge, inter alia, in the context of mobile communication, in order to meet the scalability and low-complexity requirements of massive antenna systems and massive machine-type communication. We introduce a new variant of the Hard Thresholding Pursuit (HTP) algorithm, referred to as HiHTP. We provide both a proof of convergence and a recovery guarantee for noisy Gaussian measurements that exhibit an improved asymptotic scaling of the sampling complexity in comparison with the usual HTP algorithm. Furthermore, hierarchically sparse signals and Kronecker product structured measurements naturally arise together in a variety of applications. We establish the efficient reconstruction of hierarchically sparse signals from Kronecker product measurements using the HiHTP algorithm. Additionally, we provide analytical results that connect our recovery conditions to generalized coherence measures. Again, our recovery results exhibit a substantial improvement in the asymptotic sampling-complexity scaling over the standard setting. Finally, we validate in numerical experiments that, for hierarchically sparse signals, HiHTP performs significantly better than HTP.
    Comment: 11+4 pages, 5 figures. V3: incomplete funding information corrected and minor typos fixed. V4: title changed and author Axel Flinth added; new results on Kronecker product measurements and on the relation of HiRIP to hierarchical coherence measures included; presentation of general hierarchically sparse signals improved and a minor typo corrected.
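
    The key difference from plain HTP is the thresholding operator, which respects the two-level sparsity structure. A minimal numpy sketch under assumed parameters (N blocks of length B, s active blocks, sigma nonzeros per active block; the names and the simple projected-gradient outer loop are illustrative, not the authors' reference implementation):

        import numpy as np

        def hi_threshold(z, N, B, s, sigma):
            """Keep the sigma largest entries inside each of the s most energetic
            blocks of a length-N*B vector; zero out everything else."""
            z = z.reshape(N, B)
            inner = np.zeros_like(z)
            for i in range(N):  # sigma-sparse approximation within each block
                keep = np.argsort(np.abs(z[i]))[-sigma:]
                inner[i, keep] = z[i, keep]
            best = np.argsort(np.linalg.norm(inner, axis=1))[-s:]  # s best blocks
            out = np.zeros_like(inner)
            out[best] = inner[best]
            return out.reshape(-1)

        def hihtp(A, y, N, B, s, sigma, iters=50):
            """HiHTP sketch: gradient step, hierarchical thresholding, then a
            least-squares debias on the selected support (as in HTP)."""
            x = np.zeros(A.shape[1])
            for _ in range(iters):
                proxy = hi_threshold(x + A.T @ (y - A @ x), N, B, s, sigma)
                supp = np.flatnonzero(proxy)
                coef, *_ = np.linalg.lstsq(A[:, supp], y, rcond=None)
                x = np.zeros(A.shape[1])
                x[supp] = coef
            return x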