689 research outputs found

    A* Orthogonal Matching Pursuit: Best-First Search for Compressed Sensing Signal Recovery

    Compressed sensing is a developing field aiming at the reconstruction of sparse signals acquired in reduced dimensions, which makes the recovery process underdetermined. Due to sparsity, the required solution is the one with minimum $\ell_0$ norm; however, solving the $\ell_0$ minimization problem is not practical. Commonly used techniques include $\ell_1$ minimization, such as Basis Pursuit (BP), and greedy pursuit algorithms such as Orthogonal Matching Pursuit (OMP) and Subspace Pursuit (SP). This manuscript proposes a novel semi-greedy recovery approach, namely A* Orthogonal Matching Pursuit (A*OMP). A*OMP performs A* search for the sparsest solution on a tree whose paths grow similarly to the OMP algorithm. Paths on the tree are evaluated according to a cost function, which should compensate for different path lengths. For this purpose, three different auxiliary structures are defined, including novel dynamic ones. A*OMP also incorporates pruning techniques which enable practical application of the algorithm. Moreover, the adjustable search parameters provide means for a complexity-accuracy trade-off. We demonstrate the reconstruction ability of the proposed scheme on both synthetically generated data and images using Gaussian and Bernoulli observation matrices, where A*OMP yields lower reconstruction error and higher exact recovery frequency than BP, OMP and SP. Results also indicate that the novel dynamic cost functions provide improved results compared to a conventional choice. Comment: accepted for publication in Digital Signal Processing.
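The abstract describes A*OMP only at a high level, so the following is a minimal best-first-search sketch of the idea in Python: partial supports are kept on a priority heap, each path is extended OMP-style by its most correlated columns, and a multiplicative cost discounts short paths so they can compete with longer, further-refined ones. The branching factor `B`, path budget `P`, and weight `alpha` are illustrative assumptions rather than the paper's settings, and the pruning here is far cruder than the techniques the paper proposes.

```python
# Simplified A*OMP-style best-first search; a sketch, not the authors' implementation.
import heapq
import numpy as np

def astar_omp(A, y, K, B=2, P=100, alpha=0.8):
    """Best-first search for a K-sparse x with y ~= A @ x."""
    n = A.shape[1]

    def residual(support):
        # Least-squares coefficients on the chosen columns and the resulting residual.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        return y - A[:, support] @ x_s, x_s

    def cost(res_norm, length):
        # Multiplicative cost: unfinished (short) paths are discounted so they
        # can compete with longer paths whose residuals are already small.
        return res_norm * alpha ** (K - length)

    counter = 0  # tie-breaker so the heap never has to compare supports or arrays
    heap = [(cost(np.linalg.norm(y), 0), counter, (), y)]
    while heap:
        _, _, support, r = heapq.heappop(heap)
        if len(support) == K:                     # cheapest complete path: done
            _, x_s = residual(list(support))
            x = np.zeros(n)
            x[list(support)] = x_s
            return x
        # Expand the path: branch on the B columns most correlated with its residual.
        corr = np.abs(A.T @ r)
        corr[list(support)] = -np.inf
        for j in np.argsort(corr)[-B:]:
            new_support = support + (int(j),)
            new_r, _ = residual(list(new_support))
            counter += 1
            heapq.heappush(heap, (cost(np.linalg.norm(new_r), len(new_support)),
                                  counter, new_support, new_r))
        if len(heap) > P:                         # crude pruning: keep the P cheapest open paths
            heap = heapq.nsmallest(P, heap)
            heapq.heapify(heap)
    return np.zeros(n)

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 128)) / np.sqrt(40)
x_true = np.zeros(128)
x_true[rng.choice(128, 5, replace=False)] = rng.standard_normal(5)
x_hat = astar_omp(A, A @ x_true, K=5)
```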

    Improving A*OMP: Theoretical and Empirical Analyses With a Novel Dynamic Cost Model

    Best-first search has recently been utilized for compressed sensing (CS) by the A* orthogonal matching pursuit (A*OMP) algorithm. In this work, we concentrate on theoretical and empirical analyses of A*OMP. We present a restricted isometry property (RIP) based general condition for exact recovery of sparse signals via A*OMP. In addition, we develop online guarantees which promise improved recovery performance with the residue-based termination instead of the sparsity-based one. We demonstrate the recovery capabilities of A*OMP with extensive recovery simulations using the adaptive-multiplicative (AMul) cost model, which effectively compensates for the path length differences in the search tree. The presented results, involving phase transitions for different nonzero element distributions as well as recovery rates and average error, reveal not only the superior recovery accuracy of A*OMP, but also the improvements with the residue-based termination and the AMul cost model. Comparison of the run times indicates the speed-up obtained with the AMul cost model. We also demonstrate a hybrid of OMP and A*OMP to accelerate the search further. Finally, we run A*OMP on a sparse image to illustrate its recovery performance for more realistic coefficient distributions.
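As a small illustration of the two stopping rules compared above, here is a hedged sketch written around a plain OMP loop for brevity rather than A*OMP itself: the sparsity-based rule stops after K atoms have been selected, while the residue-based rule stops once the residual norm drops below a threshold `eps`. Both parameters are generic placeholders, not the paper's settings.

```python
# Sparsity-based vs. residue-based termination in a greedy pursuit loop (illustrative sketch).
import numpy as np

def greedy_pursuit(A, y, K=None, eps=None, max_iter=50):
    """OMP-style loop with either a sparsity-based (K) or residue-based (eps) stop."""
    n = A.shape[1]
    support, r = [], y.copy()
    x_s = np.zeros(0)
    for _ in range(max_iter):
        if K is not None and len(support) >= K:            # sparsity-based termination
            break
        if eps is not None and np.linalg.norm(r) <= eps:   # residue-based termination
            break
        j = int(np.argmax(np.abs(A.T @ r)))                # most correlated column
        support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_s                        # orthogonalized residual
    x = np.zeros(n)
    x[support] = x_s
    return x

# The residue-based rule needs no prior knowledge of the sparsity level:
# x_hat = greedy_pursuit(A, y, eps=1e-6)   # vs. greedy_pursuit(A, y, K=5)
```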

    Recovery of Sparse Signals Using Multiple Orthogonal Least Squares

    We study the problem of recovering sparse signals from compressed linear measurements. This problem, often referred to as sparse recovery or sparse reconstruction, has generated a great deal of interest in recent years. To recover the sparse signals, we propose a new method called multiple orthogonal least squares (MOLS), which extends the well-known orthogonal least squares (OLS) algorithm by allowing multiple ($L$) indices to be chosen per iteration. Owing to the inclusion of multiple support indices in each selection, the MOLS algorithm converges in far fewer iterations and improves the computational efficiency over the conventional OLS algorithm. Theoretical analysis shows that MOLS ($L > 1$) performs exact recovery of all $K$-sparse signals within $K$ iterations if the measurement matrix satisfies the restricted isometry property (RIP) with isometry constant $\delta_{LK} < \frac{\sqrt{L}}{\sqrt{K} + 2\sqrt{L}}$. The recovery performance of MOLS in the noisy scenario is also studied. It is shown that stable recovery of sparse signals can be achieved with the MOLS algorithm when the signal-to-noise ratio (SNR) scales linearly with the sparsity level of the input signals.
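A compact sketch of the MOLS selection rule, using our own variable names and a generic tolerance `tol` that is not taken from the paper: each iteration scores the remaining columns by the OLS criterion, i.e. their correlation with the residual after projection onto the orthogonal complement of the current support's span, keeps the L best, and refits the coefficients by least squares.

```python
# Multiple orthogonal least squares (MOLS) sketch: L selections per iteration.
import numpy as np

def mols(A, y, K, L=2, tol=1e-6):
    """Recover a K-sparse x from y ~= A @ x, adding L indices per iteration."""
    n = A.shape[1]
    support, r = [], y.copy()
    for _ in range(K):                            # at most K iterations
        # OLS criterion: correlation of each remaining column with the residual,
        # after projecting the column onto the complement of span(A[:, support]).
        if support:
            As = A[:, support]
            proj = As @ np.linalg.pinv(As)        # projector onto span(As)
            Bc = A - proj @ A                     # columns projected onto the complement
        else:
            Bc = A
        norms = np.linalg.norm(Bc, axis=0)
        norms[norms < 1e-12] = np.inf             # ignore columns already in the span
        score = np.abs(Bc.T @ r) / norms
        score[support] = -np.inf
        support.extend(int(j) for j in np.argsort(score)[-L:])
        # Least-squares fit on the enlarged support and residual update.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_s
        if np.linalg.norm(r) <= tol:
            break
    x = np.zeros(n)
    x[support] = x_s
    return x
```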

    Reliable recovery of hierarchically sparse signals for Gaussian and Kronecker product measurements

    We propose and analyze a solution to the problem of recovering a block-sparse signal with sparse blocks from linear measurements. Such problems naturally emerge, inter alia, in the context of mobile communication, in order to meet the scalability and low-complexity requirements of massive antenna systems and massive machine-type communication. We introduce a new variant of the Hard Thresholding Pursuit (HTP) algorithm referred to as HiHTP. We provide both a proof of convergence and a recovery guarantee for noisy Gaussian measurements that exhibit an improved asymptotic scaling in terms of the sampling complexity in comparison with the usual HTP algorithm. Furthermore, hierarchically sparse signals and Kronecker product structured measurements naturally arise together in a variety of applications. We establish the efficient reconstruction of hierarchically sparse signals from Kronecker product measurements using the HiHTP algorithm. Additionally, we provide analytical results that connect our recovery conditions to generalized coherence measures. Again, our recovery results exhibit substantial improvement in the asymptotic sampling complexity scaling over the standard setting. Finally, we validate in numerical experiments that for hierarchically sparse signals, HiHTP performs significantly better compared to HTP. Comment: 11+4 pages, 5 figures. V3: Incomplete funding information corrected and minor typos corrected. V4: Change of title and additional author Axel Flinth. Included new results on Kronecker product measurements and relations of HiRIP to hierarchical coherence measures. Improved presentation of general hierarchically sparse signals and correction of minor typos.
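To make the hierarchical thresholding concrete, here is a hedged HiHTP-style sketch for an (s, sigma)-hierarchically sparse vector, i.e. at most s active blocks with at most sigma nonzeros each. The equal-size block layout, unit gradient step, and fixed iteration count are our simplifying assumptions, not the paper's exact algorithm.

```python
# HiHTP-style iteration: gradient step, hierarchical thresholding, least-squares debiasing.
import numpy as np

def hierarchical_threshold(z, num_blocks, s, sigma):
    """Support of the best (s, sigma)-hierarchically sparse approximation of z."""
    blocks = z.reshape(num_blocks, -1)                        # assumes equal-size blocks
    # Within each block, keep the sigma largest-magnitude entries.
    keep = np.argsort(np.abs(blocks), axis=1)[:, -sigma:]
    energy = np.take_along_axis(np.abs(blocks), keep, axis=1) ** 2
    # Keep the s blocks with the largest retained energy.
    active = np.argsort(energy.sum(axis=1))[-s:]
    block_len = blocks.shape[1]
    return np.sort(np.concatenate([b * block_len + keep[b] for b in active]))

def hihtp(A, y, num_blocks, s, sigma, n_iter=30):
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(n_iter):
        z = x + A.T @ (y - A @ x)                             # gradient step (unit step size)
        supp = hierarchical_threshold(z, num_blocks, s, sigma)
        x = np.zeros(n)
        x_s, *_ = np.linalg.lstsq(A[:, supp], y, rcond=None)  # debias on the support
        x[supp] = x_s
    return x
```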
    • …