Guarantees of Total Variation Minimization for Signal Recovery
In this paper, we consider using total variation minimization to recover
signals whose gradients have a sparse support, from a small number of
measurements. We establish a proof of the performance guarantee of total
variation (TV) minimization in recovering \emph{one-dimensional} signals with
sparse gradient support. This partially answers the open problem of proving the
fidelity of total variation minimization in such a setting \cite{TVMulti}. In
particular, we have shown that the recoverable gradient sparsity can grow
linearly with the signal dimension when TV minimization is used. Recoverable
sparsity thresholds of TV minimization are explicitly computed for
1-dimensional signals using the Grassmann angle framework. We also extend our
results to TV minimization for multidimensional signals. Stability of
recovering the signal itself using 1-D TV minimization has also been
established through a property called the "almost Euclidean property for the
1-dimensional TV norm". We further give a lower bound on the number of random
Gaussian measurements for recovering 1-dimensional signal vectors with N
elements and K-sparse gradients. Interestingly, the number of needed
measurements is lower bounded by Ω(√(KN)), rather than the O(K log(N/K)) bound
frequently appearing in recovering K-sparse signal vectors.
Comment: lower bounds added; version with Gaussian width, improved bounds; stability results added
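The setting above, a one-dimensional signal whose discrete gradient is sparse, can be made concrete with a minimal numpy sketch (the particular signal, its length, and the jump values below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# A piecewise-constant signal of length n has a sparse gradient:
# its discrete difference has one nonzero entry per jump.
n = 100
x = np.concatenate([np.zeros(40), 3.0 * np.ones(35), 1.0 * np.ones(25)])

grad = np.diff(x)               # discrete gradient, length n - 1
k = np.count_nonzero(grad)      # gradient sparsity: number of jumps
tv_norm = np.sum(np.abs(grad))  # 1-D total variation: sum of jump magnitudes

print(k)        # 2
print(tv_norm)  # |3 - 0| + |1 - 3| = 5.0
```

In this picture, the paper's claim is that the recoverable number of jumps k can grow linearly with the dimension n when TV minimization is used for recovery.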
Stable image reconstruction using total variation minimization
This article presents near-optimal guarantees for accurate and robust image
recovery from under-sampled noisy measurements using total variation
minimization. In particular, we show that from O(slog(N)) nonadaptive linear
measurements, an image can be reconstructed to within the best s-term
approximation of its gradient up to a logarithmic factor, and this factor can
be removed by taking slightly more measurements. Along the way, we prove a
strengthened Sobolev inequality for functions lying in the null space of
suitably incoherent matrices.
Comment: 25 pages
Near-optimal compressed sensing guarantees for anisotropic and isotropic total variation minimization
Consider the problem of reconstructing a multidimensional signal from partial information, as in the setting of compressed sensing. Without any additional assumptions, this problem is ill-posed. However, for signals such as natural images or movies, the minimal total variation estimate consistent with the measurements often produces a good approximation to the underlying signal, even if the number of measurements is far smaller than the ambient dimensionality. Recently, guarantees for two-dimensional images were established. This paper extends these theoretical results to signals of arbitrary dimension and to both the anisotropic and isotropic total variation problems. To be precise, we show that a multidimensional signal can be reconstructed from a small number of linear measurements using total variation minimization to within a factor of the best approximation of its gradient. The reconstruction guarantees we provide are necessarily optimal up to polynomial factors in the spatial dimension and a logarithmic factor in the signal dimension. The proof relies on bounds in approximation theory concerning the compressibility of wavelet expansions of bounded-variation functions.
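The distinction between the anisotropic and isotropic total variation of a 2-D signal can be illustrated with a short numpy sketch (the test image and the forward-difference cropping convention below are illustrative assumptions):

```python
import numpy as np

def tv_norms(img):
    """Anisotropic and isotropic total variation of a 2-D array,
    using forward differences cropped to a common shape."""
    dx = np.diff(img, axis=1)[:-1, :]   # horizontal differences
    dy = np.diff(img, axis=0)[:, :-1]   # vertical differences
    aniso = np.abs(dx).sum() + np.abs(dy).sum()  # sums |dx| and |dy| separately
    iso = np.sqrt(dx**2 + dy**2).sum()           # sum of per-pixel gradient magnitudes
    return aniso, iso

# A 2x2 block of ones inside a 4x4 image: edges in both directions.
img = np.zeros((4, 4))
img[1:3, 1:3] = 1.0
aniso, iso = tv_norms(img)
print(aniso)   # 8.0
print(iso)     # 6 + sqrt(2): smaller, since corner gradients combine
```

The two norms agree wherever only one difference is nonzero and differ at corners, where the isotropic variant couples the horizontal and vertical differences into a single Euclidean magnitude.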
Near-optimal Compressed Sensing Guarantees for Total Variation Minimization
Consider the problem of reconstructing a multidimensional signal from an underdetermined set of measurements, as in the setting of compressed sensing. Without any additional assumptions, this problem is ill-posed. However, for signals such as natural images or movies, the minimal total variation estimate consistent with the measurements often produces a good approximation to the underlying signal, even if the number of measurements is far smaller than the ambient dimensionality. This paper extends recent reconstruction guarantees for two-dimensional images x ∈ ℂ^(N²) to signals x ∈ ℂ^(N^d) of arbitrary dimension d ≥ 2 and to isotropic total variation problems. To be precise, we show that a multidimensional signal x ∈ ℂ^(N^d) can be reconstructed from O(sd log(N^d)) linear measurements y = Ax using total variation minimization to within a factor of the best s-term approximation of its gradient. The reconstruction guarantees we provide are necessarily optimal up to polynomial factors in the spatial dimension d.