    Distributed soft thresholding for sparse signal recovery

    In this paper, we address the problem of distributed sparse recovery of signals acquired via compressed measurements in a sensor network. We propose a new class of distributed algorithms to solve Lasso regression problems when communication to a fusion center is not possible, e.g., due to communication cost or privacy reasons. More precisely, we introduce a distributed iterative soft thresholding algorithm (DISTA) that consists of three steps: an averaging step, a gradient step, and a soft thresholding operation. We prove the convergence of DISTA in networks represented by regular graphs, and we compare it with existing methods in terms of performance, memory, and complexity. Comment: Revised version. Main improvements: extension of the convergence theorem to regular graphs; new numerical results and comparisons with other algorithms.
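
    As an illustration of the three steps named in the abstract, the following is a minimal sketch of one DISTA-style iteration in Python/NumPy. The variable names (X, A_list, y_list, W, gamma, tau) and the mixing-matrix form of the averaging step are assumptions made for this example; the exact update rules, step size, and network weights are those given in the paper.

        import numpy as np

        def soft_threshold(v, tau):
            # Component-wise soft thresholding: the proximal operator of the l1 norm.
            return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

        def dista_like_step(X, A_list, y_list, W, gamma, tau):
            # X       : (n_nodes, n) current estimates, one row per sensor node
            # A_list  : local measurement matrices A_i, one per node
            # y_list  : local compressed measurements y_i, one per node
            # W       : (n_nodes, n_nodes) mixing matrix, rows sum to 1 (neighbor averaging)
            X_new = np.empty_like(X)
            for i, (A_i, y_i) in enumerate(zip(A_list, y_list)):
                avg = W[i] @ X                                      # averaging step
                grad = A_i.T @ (A_i @ X[i] - y_i)                   # local gradient step
                X_new[i] = soft_threshold(avg - gamma * grad, tau)  # soft thresholding
            return X_new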

    Sparse Solution of Underdetermined Linear Equations via Adaptively Iterative Thresholding

    Finding the sparsest solution of an underdetermined system of linear equations y = Ax has attracted considerable attention in recent years. Among a large number of algorithms, iterative thresholding algorithms are recognized as one of the most efficient and important classes, mainly due to their low computational complexity, especially in large-scale applications. The aim of this paper is to provide guarantees on the global convergence of a wide class of iterative thresholding algorithms. Since the thresholds of the considered algorithms are set adaptively at each iteration, we call them adaptively iterative thresholding (AIT) algorithms. As the main result, we show that as long as A satisfies a certain coherence property, AIT algorithms can find the correct support set within finitely many iterations, and then converge to the original sparse solution exponentially fast once the correct support set has been identified. Meanwhile, we also demonstrate that AIT algorithms are robust to the choice of algorithmic parameters. In addition, most of the existing iterative thresholding algorithms, such as the hard, soft, half, and smoothly clipped absolute deviation (SCAD) algorithms, are included in the class of AIT algorithms studied in this paper. Comment: 33 pages, 1 figure.
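
    For concreteness, here is a hedged sketch of a generic adaptively iterative thresholding loop for y = Ax. The adaptive rule used below, which sets the threshold to the magnitude of the (k+1)-th largest entry of the current iterate so that only k components survive the shrinkage, is one common choice and is not necessarily the rule analyzed in the paper; k, the step size, and the iteration count are illustrative.

        import numpy as np

        def ait_like_recover(A, y, k, n_iter=200):
            # Recover a k-sparse x from y = A x by iterative soft thresholding,
            # with a data-driven threshold recomputed at every iteration.
            m, n = A.shape
            x = np.zeros(n)
            step = 1.0 / np.linalg.norm(A, 2) ** 2            # conservative step size
            for _ in range(n_iter):
                z = x + step * A.T @ (y - A @ x)              # gradient step on the residual
                tau = np.sort(np.abs(z))[-(k + 1)]            # adaptive threshold from z itself
                x = np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)  # soft thresholding
            return x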

    Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)

    The implicit objective of the biennial "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST) is to foster collaboration between international scientific teams by disseminating ideas through both specific oral/poster presentations and free discussions. For its second edition, the iTWIST workshop took place in the medieval and picturesque town of Namur in Belgium, from Wednesday, August 27th, to Friday, August 29th, 2014. The workshop was conveniently located in "The Arsenal" building, within walking distance of both hotels and the town center. iTWIST'14 gathered about 70 international participants and featured 9 invited talks, 10 oral presentations, and 14 posters on the following themes, all related to the theory, application, and generalization of the "sparsity paradigm": Sparsity-driven data sensing and processing; Union of low-dimensional subspaces; Beyond linear and convex inverse problems; Matrix/manifold/graph sensing/processing; Blind inverse problems and dictionary learning; Sparsity and computational neuroscience; Information theory, geometry and randomness; Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?; Sparse machine learning and inference. Comment: 69 pages, 24 extended abstracts, iTWIST'14 website: http://sites.google.com/site/itwist1

    Maximin Analysis of Message Passing Algorithms for Recovering Block Sparse Signals

    We consider the problem of recovering a block (or group) sparse signal from an underdetermined set of random linear measurements, which appear in compressed sensing applications such as radar and imaging. Recent results of Donoho, Johnstone, and Montanari have shown that approximate message passing (AMP) in combination with Stein's shrinkage outperforms group LASSO for large block sizes. In this paper, we prove that, for a fixed block size and in the strong undersampling regime (i.e., having very few measurements compared to the ambient dimension), AMP cannot improve upon group LASSO, thereby complementing the results of Donoho et al.
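
    The block-sparse structure in this setting is typically exploited through a block (group) shrinkage operator. The sketch below shows plain block soft thresholding, i.e., the proximal operator behind group LASSO, which is also a common denoiser inside AMP for block-sparse signals; it is not the Stein-type shrinkage analyzed by Donoho, Johnstone, and Montanari, and the block size and threshold are illustrative.

        import numpy as np

        def block_soft_threshold(x, block_size, tau):
            # Shrink each non-overlapping block of x toward zero by its l2 norm:
            # blocks with norm below tau vanish, the rest are scaled by (1 - tau/norm).
            # Assumes len(x) is a multiple of block_size.
            blocks = x.reshape(-1, block_size)
            norms = np.linalg.norm(blocks, axis=1, keepdims=True)
            scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
            return (scale * blocks).ravel()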