Greedy-Like Algorithms for the Cosparse Analysis Model
The cosparse analysis model has been introduced recently as an interesting
alternative to the standard sparse synthesis approach. A prominent question
brought up by this new construction is the analysis pursuit problem -- the need
to find a signal belonging to this model, given a set of corrupted measurements
of it. Several pursuit methods have already been proposed, based either on
relaxation or on a greedy approach. In this work we pursue this question further,
and propose a new family of pursuit algorithms for the cosparse analysis model,
mimicking the greedy-like methods -- compressive sampling matching pursuit
(CoSaMP), subspace pursuit (SP), iterative hard thresholding (IHT) and hard
thresholding pursuit (HTP). Assuming the availability of a near-optimal
projection scheme that finds the nearest cosparse subspace to any vector, we
provide performance guarantees for these algorithms. Our theoretical study
relies on a restricted isometry property adapted to the context of the cosparse
analysis model. We explore the performance of these algorithms empirically by
adopting a plain thresholding projection, and demonstrate their good performance.
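As an illustration of the greedy-like scheme described above, here is a minimal NumPy sketch of an analysis-IHT iteration that uses a plain thresholding projection. The function names and the pseudoinverse-based projection are illustrative choices, not the paper's exact implementation.

```python
import numpy as np

def cosparse_projection(x, Omega, ell):
    """Plain thresholding projection: pick the ell rows of Omega with the
    smallest |(Omega x)_i| as the cosupport, then project x onto the null
    space of those rows (via the pseudoinverse)."""
    Lam = np.argsort(np.abs(Omega @ x))[:ell]
    O = Omega[Lam]
    return x - np.linalg.pinv(O) @ (O @ x)

def analysis_iht(y, A, Omega, ell, n_iter=100):
    """Analysis-IHT sketch: a gradient step on ||y - A x||^2 followed by the
    cosparse projection above (step size 1/||A||_2^2 for stability)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = cosparse_projection(x + step * A.T @ (y - A @ x), Omega, ell)
    return x
```

In the easy denoising setting (A equal to the identity) with an exactly cosparse signal, the thresholding identifies the true cosupport immediately and the iteration is a fixed point after one step.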
On the Effective Measure of Dimension in the Analysis Cosparse Model
Many applications have benefited remarkably from low-dimensional models in
the last decade. The fact that many signals, though high dimensional, are
intrinsically low dimensional has made it possible to recover them stably
from a relatively small number of their measurements. For example, in
compressed sensing with the standard (synthesis) sparsity prior and in matrix
completion, the number of measurements needed is proportional (up to a
logarithmic factor) to the signal's manifold dimension.
Recently, a new natural low-dimensional signal model has been proposed: the
cosparse analysis prior. In the noiseless case, it is possible to recover
signals from this model, using a combinatorial search, from a number of
measurements proportional to the signal's manifold dimension. However, if we
ask for stability to noise or an efficient (polynomial complexity) solver, all
the existing results demand a number of measurements which is far removed from
the manifold dimension, sometimes far greater. Thus, it is natural to ask
whether this gap is a deficiency of the theory and the solvers, or if there
exists a real barrier in recovering the cosparse signals by relying only on
their manifold dimension. Is there an algorithm which, in the presence of
noise, can accurately recover a cosparse signal from a number of measurements
proportional to the manifold dimension? In this work, we prove that there is no
such algorithm. Further, we show through numerical simulations that even in the
noiseless case convex relaxations fail when the number of measurements is
comparable to the manifold dimension. This gives a practical counter-example to
the growing literature on compressed acquisition of signals based on manifold
dimension.
Comment: 19 pages, 6 figures
Can we allow linear dependencies in the dictionary in the sparse synthesis framework?
Signal recovery from a given set of linear measurements using a sparsity
prior has been a major subject of research in recent years. In this model, the
signal is assumed to have a sparse representation under a given dictionary.
Most of the work dealing with this subject has focused on the reconstruction of
the signal's representation as the means for recovering the signal itself. This
approach forced the dictionary to have low coherence and no linear
dependencies between its columns. Recently, a series of contributions
focusing on signal recovery using the analysis model has found that linear
dependencies in the analysis dictionary are in fact permitted and even
beneficial. In this paper
we show theoretically that the same holds also for signal recovery in the
synthesis case, for the l0-synthesis minimization problem. In addition, we
demonstrate empirically the relevance of our conclusions for recovering the
signal using an l1-relaxation.
Comment: 2 figures, to appear in ICASSP 201
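To see why linear dependencies need not hurt signal recovery, the toy sketch below builds a dictionary with a duplicated column, so representations are non-unique, and solves basis pursuit (the l1-relaxation) as a linear program with SciPy. The sizes and the duplicated-atom construction are illustrative; the point is that the synthesized signal is recovered even when the returned representation differs from the one used to generate it.

```python
import numpy as np
from scipy.optimize import linprog

np.random.seed(1)
m, n = 8, 12
D = np.random.randn(m, n)
D = np.hstack([D, D[:, :1]])   # duplicate column 0: the dictionary now has a linear dependency

c0 = np.zeros(n + 1)
c0[0], c0[3] = 2.0, -1.0       # a 2-sparse representation that uses the duplicated atom
y = D @ c0                     # the observed signal

# basis pursuit, min ||c||_1 s.t. D c = y, as a linear program with c = u - v, u, v >= 0
nn = D.shape[1]
res = linprog(c=np.ones(2 * nn),
              A_eq=np.hstack([D, -D]), b_eq=y,
              bounds=(0, None), method="highs")
c_hat = res.x[:nn] - res.x[nn:]
```

Because column 0 and its duplicate are identical, `c_hat` may place its weight on either copy; the l1 objective cannot distinguish them, yet the synthesized signal `D @ c_hat` matches `y` exactly.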
Sampling in the Analysis Transform Domain
Many signal and image processing applications have benefited remarkably from
the fact that the underlying signals reside in a low dimensional subspace. One
of the main models for such low dimensionality is sparsity. Within this
framework there are two main options for sparse modeling: the synthesis one
and the analysis one. The first is considered the standard paradigm, and far
more research has been dedicated to it; in it, the signals are assumed to have
a sparse representation under a given dictionary. In the analysis approach, on
the other hand, sparsity is measured in the coefficients of the signal after a
certain transformation, the analysis dictionary, is applied to it. Though
several algorithms with accompanying theory have been developed for the
analysis framework, they are outnumbered by those proposed for the synthesis
methodology.
Given that the analysis dictionary is either a frame or the two-dimensional
finite-difference operator, we propose a new sampling scheme for signals from
the analysis model that allows them to be recovered from their samples using
any existing algorithm from the synthesis model. The advantage of this new
sampling strategy is that it makes the existing synthesis methods, together
with their theory, available also for signals from the analysis framework.
Comment: 13 pages, 2 figures
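A 1-D toy helps illustrate why transform-domain samples are convenient: for a piecewise-constant signal the finite-difference coefficients are sparse, so they can be acquired compressively and recovered with any standard synthesis solver (plain IHT here), and the signal rebuilt from them plus one anchor sample for the lost constant. This is only a cartoon of the idea, not the paper's actual sampling scheme (which handles frames and the 2-D finite-difference operator).

```python
import numpy as np

np.random.seed(0)
# piecewise-constant signal: cosparse under the 1-D finite-difference operator
x = np.concatenate([np.full(20, 1.0), np.full(15, -2.0), np.full(15, 3.0)])
z = np.diff(x)                              # analysis coefficients: 2 nonzeros out of 49
assert np.count_nonzero(z) == 2

# sample the transform-domain coefficients with a Gaussian matrix
m = 30
M = np.random.randn(m, z.size) / np.sqrt(m)
y = M @ z

def iht(y, M, k, n_iter=200):
    """Plain synthesis IHT: gradient step, then keep the k largest entries."""
    step = 0.9 / np.linalg.norm(M, 2) ** 2
    z_hat = np.zeros(M.shape[1])
    for _ in range(n_iter):
        z_hat = z_hat + step * M.T @ (y - M @ z_hat)
        z_hat[np.argsort(np.abs(z_hat))[:-k]] = 0.0
    return z_hat

# recover z, then debias by least squares on the detected support
z_hat = iht(y, M, k=2)
S = np.flatnonzero(z_hat)
z_hat = np.zeros_like(z_hat)
z_hat[S] = np.linalg.lstsq(M[:, S], y, rcond=None)[0]

# rebuild the signal from the recovered coefficients plus one anchor sample x[0]
x_hat = x[0] + np.concatenate([[0.0], np.cumsum(z_hat)])
```

The anchor sample compensates for the constant component lost by the finite-difference operator; with a frame as the analysis dictionary no such component is lost.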
Greedy Signal Space Methods for Incoherence and Beyond
Compressive sampling (CoSa) has provided many methods for the recovery of signals that are compressible with respect to an orthonormal basis. However, modern applications have sparked the emergence of approaches for signals that are not sparse in an orthonormal basis but rather in some arbitrary, perhaps highly overcomplete, dictionary. Recently, several signal-space greedy methods have been proposed to address signal recovery in this setting. However, such methods inherently rely on the existence of fast and accurate projections that identify the most relevant atoms in a dictionary for any given signal, up to a very strict accuracy. When the dictionary is highly overcomplete, no such projections are currently known; the requirements on such projections do not hold even for incoherent or otherwise well-behaved dictionaries. In this work, we provide an alternative analysis for signal-space greedy methods that enforces assumptions on these projections which do hold in several settings, including when the dictionary is incoherent or structurally coherent. These results align more closely with traditional results in the standard CoSa literature and improve upon previous work in the signal-space setting.
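The projection such signal-space methods require can be sketched as follows: select the k atoms most correlated with the signal and least-squares project onto their span. This naive choice is exact for an orthonormal dictionary but only approximate for coherent, overcomplete ones, which is precisely the gap discussed above. The function below is a hypothetical illustration, not a construction from the paper.

```python
import numpy as np

def signal_space_projection(x, D, k):
    """Approximate best k-atom projection: choose the k atoms of D most
    correlated with x, then least-squares project x onto their span.
    Exact when D is orthonormal; only approximate for coherent dictionaries."""
    idx = np.argsort(np.abs(D.T @ x))[-k:]
    coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
    return D[:, idx] @ coef, idx
```

For a coherent dictionary the correlation step can select the wrong atoms, which is exactly why the strict near-optimal-projection requirement is problematic in the overcomplete setting.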
Sparse Analysis Recovery via Iterative Cosupport Detection Estimation
Cosparse analysis model (CAM) provides a new signal processing paradigm for recovering cosparse signals with respect to a given analysis operator from the undersampled linear measurements in the context of emerging theory of compressed sensing (CS). The sparse analysis recovery/cosparse recovery is a key one brought up by this new paradigm. In this paper, we propose a new family of analysis pursuit algorithms for the sparse analysis recovery problem when the signals obey the cosparse analysis model, termed as iterative cosupport detection estimation (ICDE). ICDE is an algorithmic framework, which alternates between detecting a cosupport set of the unknown true signal and estimating the underlying signal by solving a truncated analysis pursuit problem on the detected cosupport. Further, we propose effective implementations of ICDE equipped with an efficient thresholding strategy for cosupport detection. Empirical performance comparisons show that ICDE is favorable in comparison with the state-of-the-art sparse analysis recovery algorithms. Source code of ICDE has been made publicly available on Github: https://github.com/songhp/ICDE.Beijing Natural Science Foundation (BNSF) under Grant No. 4194076, the Natural Science Foundation of Jiangsu Province under Grant No. BK20170558 and the China Scholarship Council (CSC, No. 202008320094)