Guarantees of Total Variation Minimization for Signal Recovery
In this paper, we consider using total variation minimization to recover
signals whose gradients have a sparse support, from a small number of
measurements. We establish a performance guarantee for total variation (TV)
minimization in recovering \emph{one-dimensional} signals with sparse gradient
support. This partially answers the open problem of proving the fidelity of
total variation minimization in such a setting \cite{TVMulti}. In particular,
we show that the recoverable gradient sparsity can grow linearly with the
signal dimension when TV minimization is used. Recoverable sparsity thresholds
of TV minimization are explicitly computed for one-dimensional signals using
the Grassmann angle framework. We also extend our results to TV minimization
for multidimensional signals. Stability of recovering the signal itself using
1-D TV minimization has also been established through a property called the
"almost Euclidean property for the 1-dimensional TV norm". We further give a
lower bound on the number of random Gaussian measurements for recovering
1-dimensional signal vectors with $n$ elements and $k$-sparse gradients.
Interestingly, the number of needed measurements is lower bounded by
$\Omega(\sqrt{kn})$, rather than the $O(k\log(n/k))$ bound frequently
appearing in recovering $k$-sparse signal vectors.
Comment: lower bounds added; version with Gaussian width, improved bounds; stability results added
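The recovery problem this abstract studies, minimizing the TV norm ||Dx||_1 subject to Ax = y with D the 1-D finite-difference operator, can be sketched as a linear program. This is a generic reformulation, not the authors' code; the dimensions, seed, and use of SciPy's `linprog` are illustrative choices:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Piecewise-constant test signal: its gradient is 2-sparse (two jumps).
n = 40
x_true = np.concatenate([np.zeros(15), 2.0 * np.ones(15), -np.ones(10)])

# Random Gaussian measurements y = A x.
m = 30
A = rng.standard_normal((m, n))
y = A @ x_true

# Finite-difference operator D of shape (n-1, n): (D x)_i = x_{i+1} - x_i.
D = np.diff(np.eye(n), axis=0)

# LP reformulation of  min ||D x||_1  s.t.  A x = y :
# variables z = [x, t]; minimize sum(t) subject to -t <= D x <= t, t >= 0.
k = n - 1
c = np.concatenate([np.zeros(n), np.ones(k)])
A_ub = np.block([[D, -np.eye(k)], [-D, -np.eye(k)]])
b_ub = np.zeros(2 * k)
A_eq = np.hstack([A, np.zeros((m, k))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * n + [(0, None)] * k)
x_hat = res.x[:n]
print(np.max(np.abs(x_hat - x_true)))  # near zero when recovery succeeds
```

Since x_true is itself feasible, the solver's TV objective can never exceed ||D x_true||_1; whether x_hat actually equals x_true is exactly the question the recovery guarantees above address.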
Stable image reconstruction using total variation minimization
This article presents near-optimal guarantees for accurate and robust image
recovery from under-sampled noisy measurements using total variation
minimization. In particular, we show that from O(slog(N)) nonadaptive linear
measurements, an image can be reconstructed to within the best s-term
approximation of its gradient up to a logarithmic factor, and this factor can
be removed by taking slightly more measurements. Along the way, we prove a
strengthened Sobolev inequality for functions lying in the null space of
suitably incoherent matrices.
Comment: 25 pages
Robust analysis $\ell_1$-recovery from Gaussian measurements and total variation minimization
Analysis $\ell_1$-recovery refers to a technique for recovering a signal that
is sparse in some transform domain from incomplete, corrupted measurements.
This includes total variation minimization as an important special case, namely
when the transform domain is generated by a difference operator. In the present
paper we provide a bound on the number of Gaussian measurements required for
successful recovery, both for total variation and for the case that the
analysis operator is a frame. The bounds are particularly suitable when the
sparsity of the analysis representation of the signal is not very small.
Sampling in the Analysis Transform Domain
Many signal and image processing applications have benefited remarkably from
the fact that the underlying signals reside in a low-dimensional subspace. One
of the main models for such low dimensionality is sparsity. Within this
framework there are two main options for sparse modeling: the synthesis one
and the analysis one, where the first is considered the standard paradigm and
has received much more research attention. In it, the signals are assumed to
have a sparse representation under a given dictionary. In the analysis
approach, on the other hand, sparsity is measured in the coefficients of the
signal after a certain transformation, the analysis dictionary, is applied to
it. Though several algorithms with some theory have been developed for this
framework, they are outnumbered by the ones proposed for the synthesis
methodology.
Given that the analysis dictionary is either a frame or the two-dimensional
finite-difference operator, we propose a new sampling scheme for signals from
the analysis model that allows recovering them from their samples using any
existing algorithm from the synthesis model. The advantage of this new
sampling strategy is that it makes the existing synthesis methods, with their
theory, available for signals from the analysis framework as well.
Comment: 13 pages, 2 figures
$\ell_1$-Analysis Minimization and Generalized (Co-)Sparsity: When Does Recovery Succeed?
This paper investigates the problem of signal estimation from undersampled
noisy sub-Gaussian measurements under the assumption of a cosparse model. Based
on generalized notions of sparsity, we derive novel recovery guarantees for
$\ell_1$-analysis basis pursuit, enabling highly accurate predictions of its
sample complexity. The corresponding bounds on the number of required
measurements explicitly depend on the Gram matrix of the analysis operator
and therefore particularly account for its mutual coherence structure. Our
findings defy conventional wisdom which promotes the sparsity of analysis
coefficients as the crucial quantity to study. In fact, this common paradigm
breaks down completely in many situations of practical interest, for instance,
when applying a redundant (multilevel) frame as analysis prior. By extensive
numerical experiments, we demonstrate that, in contrast, our theoretical
sampling-rate bounds reliably capture the recovery capability of various
examples, such as redundant Haar wavelet systems, total variation, or random
frames. The proofs of our main results build upon recent achievements in the
convex geometry of data mining problems. More precisely, we establish a
sophisticated upper bound on the conic Gaussian mean width that is associated
with the underlying $\ell_1$-analysis polytope. Due to a novel localization
argument, it turns out that the presented framework naturally extends to
stable recovery, allowing us to incorporate compressible coefficient sequences
as well.
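A hedged sketch of the noiseless $\ell_1$-analysis basis pursuit that this abstract analyzes, min ||Omega x||_1 subject to Ax = y. The construction of a cosparse test signal and the LP reformulation are generic illustrations under assumed dimensions, not the paper's method:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.linalg import null_space

rng = np.random.default_rng(2)
n, p, m = 20, 24, 16          # signal dim, analysis rows, measurements

Omega = rng.standard_normal((p, n))   # generic redundant analysis operator

# Build a cosparse ground truth: x_true is (numerically) annihilated by
# p - 5 rows of Omega, so Omega @ x_true has at most 5 large coefficients.
cosupport = np.arange(p - 5)
x_true = null_space(Omega[cosupport])[:, 0]

A = rng.standard_normal((m, n))       # sub-Gaussian (here Gaussian) sensing
y = A @ x_true

# l1-analysis basis pursuit, min ||Omega x||_1 s.t. A x = y, as an LP
# in z = [x, t] with -t <= Omega x <= t and t >= 0.
c = np.concatenate([np.zeros(n), np.ones(p)])
A_ub = np.block([[Omega, -np.eye(p)], [-Omega, -np.eye(p)]])
b_ub = np.zeros(2 * p)
A_eq = np.hstack([A, np.zeros((m, p))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * n + [(0, None)] * p)
x_hat = res.x[:n]
print(np.linalg.norm(x_hat - x_true))  # small when recovery succeeds
```

Whether m = 16 measurements suffice for exact recovery is precisely what the sampling-rate bounds above predict; in this sketch the Gram structure of the random Omega, not merely the sparsity of Omega @ x_true, governs the outcome.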