
    Guarantees of Total Variation Minimization for Signal Recovery

    In this paper, we consider using total variation minimization to recover signals whose gradients have a sparse support from a small number of measurements. We establish a performance guarantee for total variation (TV) minimization in recovering \emph{one-dimensional} signals with sparse gradient support, which partially answers the open problem of proving the fidelity of TV minimization in this setting \cite{TVMulti}. In particular, we show that the recoverable gradient sparsity can grow linearly with the signal dimension when TV minimization is used. Recoverable sparsity thresholds of TV minimization are explicitly computed for one-dimensional signals using the Grassmann angle framework. We also extend our results to TV minimization for multidimensional signals. Stability of recovering the signal itself using 1-D TV minimization is also established, through a property called the "almost Euclidean property for the 1-dimensional TV norm". We further give a lower bound on the number of random Gaussian measurements needed to recover one-dimensional signal vectors with $N$ elements and $K$-sparse gradients. Interestingly, the number of needed measurements is lower bounded by $\Omega((NK)^{\frac{1}{2}})$, rather than the $O(K\log(N/K))$ bound that frequently appears in recovering $K$-sparse signal vectors.
    Comment: lower bounds added; version with Gaussian width, improved bounds; stability results added
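
    As a concrete illustration of the recovery problem this paper analyzes, the sketch below recovers a piecewise-constant signal (so its gradient is $K$-sparse) from random Gaussian measurements by TV minimization. It uses cvxpy; the signal length, gradient sparsity, and measurement count are illustrative choices, not values from the paper.

```python
# Minimal sketch: 1-D TV minimization recovery from Gaussian measurements.
# Problem sizes below are illustrative, not taken from the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
N, K, m = 200, 5, 80          # signal length, gradient sparsity, measurements

# Piecewise-constant signal: K jumps give a K-sparse gradient.
x_true = np.zeros(N)
for j in rng.choice(N - 1, size=K, replace=False):
    x_true[j + 1:] += rng.standard_normal()

A = rng.standard_normal((m, N)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x_true

x = cp.Variable(N)
prob = cp.Problem(cp.Minimize(cp.tv(x)), [A @ x == y])   # min TV s.t. Ax = y
prob.solve()
print("relative error:", np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))
```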

    Precise Phase Transition of Total Variation Minimization

    Characterizing the phase transitions of convex optimization methods for recovering structured signals or data is of central importance in compressed sensing, machine learning, and statistics. The phase transitions of many convex signal recovery methods, such as $\ell_1$ minimization and nuclear norm minimization, have become well understood through research in recent years. However, rigorously characterizing the phase transition of total variation (TV) minimization in recovering sparse-gradient signals has remained open. In this paper, we fully characterize the phase transition curve of TV minimization. Our proof builds on Donoho, Johnstone and Montanari's conjectured phase transition curve for the TV approximate message passing (AMP) algorithm, together with the linkage between the minimax mean squared error of a denoising problem and the high-dimensional convex geometry of TV minimization.
    Comment: 6 pages
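
    The denoising side of this linkage is easy to probe empirically: the sketch below Monte Carlo estimates the per-coordinate mean squared error of TV-regularized denoising of a sparse-gradient signal in Gaussian noise, using cvxpy. The sizes, noise level, and regularization weight are illustrative; the paper's argument concerns the minimax value of this MSE, not any single choice of $\lambda$.

```python
# Sketch: Monte Carlo estimate of the MSE of TV denoising at one fixed
# regularization weight lam. All problem parameters are illustrative.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
N, K, sigma, lam, trials = 100, 4, 0.5, 2.0, 20

mse = 0.0
for _ in range(trials):
    jumps = rng.random(N) < K / N                       # ~K jump locations
    x_true = np.cumsum(rng.standard_normal(N) * jumps)  # sparse-gradient signal
    y = x_true + sigma * rng.standard_normal(N)         # Gaussian noise
    x = cp.Variable(N)
    cp.Problem(cp.Minimize(0.5 * cp.sum_squares(x - y) + lam * cp.tv(x))).solve()
    mse += np.mean((x.value - x_true) ** 2) / trials

print("empirical per-coordinate MSE:", mse)
```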

    Asymptotically Sharp Upper Bound for the Column Subset Selection Problem

    This paper investigates the spectral norm version of the column subset selection problem. Given a matrix $\mathbf{A}\in\mathbb{R}^{n\times d}$ and a positive integer $k\leq\text{rank}(\mathbf{A})$, the objective is to select exactly $k$ columns of $\mathbf{A}$ that minimize the spectral norm of the residual matrix after projecting $\mathbf{A}$ onto the space spanned by the selected columns. We use the method of interlacing polynomials introduced by Marcus-Spielman-Srivastava to derive an asymptotically sharp upper bound on the minimal approximation error, and propose a deterministic polynomial-time algorithm that achieves this error bound (up to a computational error). Furthermore, we extend our result to a column partition problem in which the columns of $\mathbf{A}$ can be partitioned into $r\geq 2$ subsets such that $\mathbf{A}$ can be well approximated by subsets from various groups. We show that the machinery of interlacing polynomials also works in this context, and establish a connection between the relevant expected characteristic polynomials and the $r$-characteristic polynomials introduced by Ravichandran and Leake. As a consequence, we prove that the columns of a rank-$d$ matrix $\mathbf{A}\in\mathbb{R}^{n\times d}$ can be partitioned into $r$ subsets $S_1,\ldots,S_r$, such that the column space of $\mathbf{A}$ can be well approximated by the span of the columns in the complement of $S_i$ for each $1\leq i\leq r$.
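
    For reference, the objective here is explicit: for a selected index set $S$, the error is $\|\mathbf{A} - P_S\mathbf{A}\|_2$, where $P_S$ is the orthogonal projection onto the span of the selected columns. The sketch below evaluates this quantity and runs a simple greedy baseline; it is not the paper's interlacing-polynomial algorithm, and the matrix sizes are illustrative.

```python
# Sketch: the spectral-norm column subset selection objective, plus a naive
# greedy baseline (NOT the interlacing-polynomial algorithm from the paper).
import numpy as np

def residual_spectral_norm(A, cols):
    # Orthonormal basis Q for the span of the selected columns, then the
    # spectral norm of A minus its projection onto that span.
    Q, _ = np.linalg.qr(A[:, cols])
    return np.linalg.norm(A - Q @ (Q.T @ A), 2)

def greedy_css(A, k):
    chosen = []
    for _ in range(k):
        best = min((j for j in range(A.shape[1]) if j not in chosen),
                   key=lambda j: residual_spectral_norm(A, chosen + [j]))
        chosen.append(best)
    return chosen

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 20))
cols = greedy_css(A, k=5)
print("chosen columns:", cols)
print("residual spectral norm:", residual_spectral_norm(A, cols))
```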

    Precise Semidefinite Programming Formulation of Atomic Norm Minimization for Recovering $d$-Dimensional ($d\geq 2$) Off-the-Grid Frequencies

    Recent research in off-the-grid compressed sensing (CS) has demonstrated that, under certain conditions, one can successfully recover a spectrally sparse signal from a few time-domain samples even though the dictionary is continuous. In particular, atomic norm minimization was proposed in \cite{tang2012csotg} to recover $1$-dimensional spectrally sparse signals. However, in spite of existing research efforts \cite{chi2013compressive}, it remained an open problem how to formulate an equivalent positive semidefinite program for atomic norm minimization in recovering signals with $d$-dimensional ($d\geq 2$) off-the-grid frequencies. In this paper, we settle this problem by proposing equivalent semidefinite programming formulations of atomic norm minimization to recover signals with $d$-dimensional ($d\geq 2$) off-the-grid frequencies.
    Comment: 4 pages, double column, 1 figure
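
    For orientation, the sketch below writes down the known $1$-dimensional semidefinite characterization from \cite{tang2012csotg} that this paper generalizes: the atomic norm of $x\in\mathbb{C}^n$ is the optimal value of an SDP over a Hermitian Toeplitz block. It is expressed in cvxpy; the signal and sizes are illustrative, and the paper's $d\geq 2$ formulations are more involved.

```python
# Sketch: computing the atomic norm of a 1-D spectrally sparse signal via the
# SDP characterization of Tang et al. (2012). Signal and sizes are illustrative.
import numpy as np
import cvxpy as cp

n, k_freqs = 16, 3
rng = np.random.default_rng(3)
freqs = rng.random(k_freqs)                   # off-grid frequencies in [0, 1)
amps = rng.standard_normal(k_freqs)
x = sum(a * np.exp(2j * np.pi * f * np.arange(n)) for f, a in zip(freqs, amps))

# Block matrix [[T, x], [x^*, t]] >= 0 with T Hermitian Toeplitz; the atomic
# norm is min (trace(T)/n + t)/2 over such feasible blocks.
M = cp.Variable((n + 1, n + 1), hermitian=True)
constraints = [M >> 0, M[:n, n] == x]
for k in range(n):                            # force the top-left block Toeplitz
    for i in range(n - 1 - k):
        constraints += [M[i, i + k] == M[i + 1, i + k + 1]]

atomic_norm = (cp.real(cp.trace(M[:n, :n])) / n + cp.real(M[n, n])) / 2
cp.Problem(cp.Minimize(atomic_norm), constraints).solve()

# For well-separated frequencies this typically matches sum(|amplitudes|).
print("atomic norm:", atomic_norm.value, " sum |amps|:", np.abs(amps).sum())
```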