
    Compressed Sensing and Parallel Acquisition

    Parallel acquisition systems arise in various applications in order to moderate problems caused by insufficient measurements in single-sensor systems. These systems allow simultaneous data acquisition in multiple sensors, thus alleviating such problems by providing more overall measurements. In this work we consider the combination of compressed sensing with parallel acquisition. We establish the theoretical improvements of such systems by providing recovery guarantees for which, subject to appropriate conditions, the number of measurements required per sensor decreases linearly with the total number of sensors. Throughout, we consider two different sampling scenarios -- distinct (corresponding to independent sampling in each sensor) and identical (corresponding to dependent sampling between sensors) -- and a general mathematical framework that allows for a wide range of sensing matrices (e.g., subgaussian random matrices, subsampled isometries, random convolutions and random Toeplitz matrices). We also consider not just the standard sparse signal model, but also the so-called sparse in levels signal model. This model includes both sparse and distributed signals and clustered sparse signals. As our results show, optimal recovery guarantees for both distinct and identical sampling are possible under much broader conditions on the so-called sensor profile matrices (which characterize environmental conditions between a source and the sensors) for the sparse in levels model than for the sparse model. To verify our recovery guarantees we provide numerical results showing phase transitions for a number of different multi-sensor environments. (43 pages, 4 figures)
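    A minimal sketch (not from the paper) of the multi-sensor measurement model described above, assuming Gaussian sensing matrices, diagonal sensor profiles, and illustrative sizes (C sensors, m_per_sensor rows each); it only builds the distinct- and identical-sampling systems, which can then be handed to any standard sparse recovery solver:

```python
# Sketch of parallel acquisition: C sensors observe one sparse signal x through
# per-sensor sensing matrices A_c and diagonal sensor profile matrices H_c,
# giving y_c = A_c H_c x.  All sizes and the profile model are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, s, C, m_per_sensor = 256, 8, 4, 24   # dimension, sparsity, sensors, rows/sensor

# s-sparse ground-truth signal
x = np.zeros(n)
x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)

# Diagonal sensor profiles modelling conditions between the source and each sensor
H = [np.diag(1.0 + 0.2 * rng.standard_normal(n)) for _ in range(C)]

# "Distinct" sampling: an independent Gaussian matrix per sensor.
A_distinct = [rng.standard_normal((m_per_sensor, n)) / np.sqrt(m_per_sensor)
              for _ in range(C)]
# "Identical" sampling: the same matrix reused by every sensor.
A_shared = rng.standard_normal((m_per_sensor, n)) / np.sqrt(m_per_sensor)
A_identical = [A_shared for _ in range(C)]

def stack(A_list):
    """Stack the per-sensor systems y_c = A_c H_c x into one linear system."""
    Phi = np.vstack([A_c @ H_c for A_c, H_c in zip(A_list, H)])
    return Phi, Phi @ x

Phi_d, y_d = stack(A_distinct)      # C * m_per_sensor total measurements
Phi_i, y_i = stack(A_identical)
print(Phi_d.shape, Phi_i.shape)     # (96, 256) in both cases

# Either stacked system can now be passed to a sparse recovery solver,
# e.g. basis pursuit or a greedy method such as OMP (sketched further below).
```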

    Improving A*OMP: Theoretical and Empirical Analyses With a Novel Dynamic Cost Model

    Best-first search has recently been utilized for compressed sensing (CS) by the A* orthogonal matching pursuit (A*OMP) algorithm. In this work, we concentrate on theoretical and empirical analyses of A*OMP. We present a restricted isometry property (RIP) based general condition for exact recovery of sparse signals via A*OMP. In addition, we develop online guarantees which promise improved recovery performance with the residue-based termination instead of the sparsity-based one. We demonstrate the recovery capabilities of A*OMP with extensive recovery simulations using the adaptive-multiplicative (AMul) cost model, which effectively compensates for the path length differences in the search tree. The presented results, involving phase transitions for different nonzero element distributions as well as recovery rates and average error, reveal not only the superior recovery accuracy of A*OMP, but also the improvements with the residue-based termination and the AMul cost model. Comparison of the run times indicates the speed-up provided by the AMul cost model. We also demonstrate a hybrid of OMP and A*OMP to accelerate the search further. Finally, we run A*OMP on a sparse image to illustrate its recovery performance for more realistic coefficient distributions.
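    A heavily simplified sketch of the best-first (A*-style) matching pursuit search described above. The multiplicative length compensation alpha ** (K - len(path)) is an illustrative stand-in for the paper's adaptive-multiplicative (AMul) auxiliary cost, and parameters such as branches, alpha, and max_paths are assumptions, not the authors' settings:

```python
# Best-first search over partial supports (paths), with orthogonal projection
# at every step and a residue-based stopping rule.  Not the authors' code.
import heapq
import numpy as np

def residual(A, y, support):
    """Residual after orthogonally projecting y onto the chosen columns of A."""
    if not support:
        return y
    S = list(support)
    coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
    return y - A[:, S] @ coef

def astar_omp_sketch(A, y, K, branches=3, alpha=0.8, max_paths=200, tol=1e-6):
    """Best-first search over candidate supports of size at most K."""
    heap = [(np.linalg.norm(y) * alpha ** K, ())]     # (path cost, support path)
    best = (np.inf, ())
    expanded = 0
    while heap and expanded < max_paths:
        cost, path = heapq.heappop(heap)
        expanded += 1
        r = residual(A, y, path)
        # Residue-based termination: accept a path once it explains y,
        # instead of waiting until it reaches the target sparsity K.
        if np.linalg.norm(r) < tol or len(path) == K:
            if np.linalg.norm(r) < best[0]:
                best = (np.linalg.norm(r), path)
            if np.linalg.norm(r) < tol:
                break
            continue
        # Expand the best path by its most correlated unused atoms.
        corr = np.abs(A.T @ r)
        corr[list(path)] = 0
        for j in np.argsort(corr)[-branches:]:
            new_path = tuple(sorted(path + (int(j),)))
            r_new = np.linalg.norm(residual(A, y, new_path))
            # Multiplicative compensation so paths of different lengths compare fairly.
            heapq.heappush(heap, (r_new * alpha ** (K - len(new_path)), new_path))
    return list(best[1])
```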

    Exploring Algorithmic Limits of Matrix Rank Minimization under Affine Constraints

    Many applications require recovering a matrix of minimal rank within an affine constraint set, with matrix completion a notable special case. Because the problem is NP-hard in general, it is common to replace the matrix rank with the nuclear norm, which acts as a convenient convex surrogate. While elegant theoretical conditions elucidate when this replacement is likely to be successful, they are highly restrictive and convex algorithms fail when the ambient rank is too high or when the constraint set is poorly structured. Non-convex alternatives fare somewhat better when carefully tuned; however, convergence to locally optimal solutions remains a continuing source of failure. Against this backdrop we derive a deceptively simple and parameter-free probabilistic PCA-like algorithm that is capable, over a wide battery of empirical tests, of successful recovery even at the theoretical limit where the number of measurements equals the degrees of freedom in the unknown low-rank matrix. Somewhat surprisingly, this is possible even when the affine constraint set is highly ill-conditioned. While proving general recovery guarantees remains elusive for non-convex algorithms, Bayesian-inspired or otherwise, we nonetheless show conditions whereby the underlying cost function has a unique stationary point located at the global optimum; no existing cost function we are aware of satisfies this same property. We conclude with a simple computer vision application involving image rectification and a standard collaborative filtering benchmark.
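    For context, a sketch of the standard convex nuclear-norm surrogate that the abstract contrasts with, in the form of soft-impute-style singular value thresholding for matrix completion. This is not the paper's probabilistic-PCA-like algorithm, and lam, n_iter, and the toy problem sizes are illustrative choices:

```python
# Approximately solve  min_Z  0.5*||P_Omega(Z - M)||_F^2 + lam*||Z||_*
# by repeatedly filling in missing entries and soft-thresholding singular values.
import numpy as np

def soft_impute(M_obs, mask, lam=1.0, n_iter=200):
    """Nuclear-norm-regularized matrix completion (soft-impute iteration)."""
    Z = np.zeros_like(M_obs)
    for _ in range(n_iter):
        # Keep observed data, fill unobserved entries with the current estimate.
        filled = np.where(mask, M_obs, Z)
        # Soft-threshold the singular values (proximal step for the nuclear norm).
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt
    return Z

# Toy usage: approximately recover a rank-2 matrix from 50% of its entries.
rng = np.random.default_rng(1)
M = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 60))
mask = rng.random(M.shape) < 0.5
M_hat = soft_impute(np.where(mask, M, 0.0), mask, lam=0.5)
print(np.linalg.norm(M_hat - M) / np.linalg.norm(M))   # relative recovery error
```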

    Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit

    This paper demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results, which require O(m^2) measurements. The new results for OMP are comparable with recent results for another approach called Basis Pursuit (BP). In some settings, the OMP algorithm is faster and easier to implement, so it is an attractive alternative to BP for signal recovery problems.
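    A minimal numpy sketch of the OMP procedure described above: greedily pick the column most correlated with the residual, then re-fit by least squares on the selected columns. The demo sizes and the factor of 3 in the measurement count (a few multiples of m ln d) are illustrative assumptions, not values from the paper:

```python
import numpy as np

def omp(A, y, m):
    """Recover an m-sparse signal from y = A x via Orthogonal Matching Pursuit."""
    support, residual = [], y.copy()
    for _ in range(m):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated atom
        if j not in support:
            support.append(j)
        # Orthogonal projection: refit all selected coefficients by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

# Demo in the regime the abstract describes: on the order of m*ln(d) Gaussian measurements.
rng = np.random.default_rng(2)
d, m = 1024, 10
n_meas = int(3 * m * np.log(d))
A = rng.standard_normal((n_meas, d)) / np.sqrt(n_meas)
x = np.zeros(d)
x[rng.choice(d, m, replace=False)] = rng.standard_normal(m)
x_hat = omp(A, A @ x, m)
print(np.allclose(x, x_hat, atol=1e-8))   # exact recovery check
```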