16 research outputs found

    Compressed Sensing over the Grassmann Manifold: A Unified Analytical Framework

    Get PDF
    It is well known that compressed sensing problems reduce to finding the sparse solutions for large under-determined systems of equations. Although finding the sparse solutions in general may be computationally difficult, starting with the seminal work of [2], it has been shown that linear programming techniques, obtained from an l_1-norm relaxation of the original non-convex problem, can provably find the unknown vector in certain instances. In particular, using a certain restricted isometry property, [2] shows that for measurement matrices chosen from a random Gaussian ensemble, l_1 optimization can find the correct solution with overwhelming probability even when the support size of the unknown vector is proportional to its dimension. The paper [1] uses results on neighborly polytopes from [6] to give a "sharp" bound on what this proportionality should be in the Gaussian measurement ensemble. In this paper we shall focus on finding sharp bounds on the recovery of "approximately sparse" signals (also possibly under noisy measurements). While the restricted isometry property can be used to study the recovery of approximately sparse signals (and also in the presence of noisy measurements), the obtained bounds can be quite loose. On the other hand, the neighborly polytopes technique which yields sharp bounds for ideally sparse signals cannot be generalized to approximately sparse signals. In this paper, starting from a necessary and sufficient condition for achieving a certain signal recovery accuracy, using high-dimensional geometry, we give a unified null-space Grassmannian angle-based analytical framework for compressive sensing. This new framework gives sharp quantitative tradeoffs between the signal sparsity and the recovery accuracy of the l_1 optimization for approximately sparse signals. As it will turn out, the neighborly polytopes result of [1] for ideally sparse signals can be viewed as a special case of ours.
Our result concerns fundamental properties of linear subspaces and so may be of independent mathematical interest.
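The l_1 relaxation described in the abstract above can be cast as a standard linear program. A minimal sketch (not the paper's own code; the problem sizes and the SciPy-based LP formulation are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 20, 60, 3                      # measurements, ambient dimension, support size
A = rng.standard_normal((n, m))          # random Gaussian measurement ensemble
x_true = np.zeros(m)
x_true[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

# min ||x||_1 s.t. Ax = y, as an LP over z = [x; t]:
#   minimize sum(t)  subject to  x - t <= 0,  -x - t <= 0,  A x = y
c = np.concatenate([np.zeros(m), np.ones(m)])
I = np.eye(m)
A_ub = np.block([[I, -I], [-I, -I]])
b_ub = np.zeros(2 * m)
A_eq = np.hstack([A, np.zeros((n, m))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * (2 * m))
x_hat = res.x[:m]
```

For a support size well below the proportionality threshold discussed above, `x_hat` recovers `x_true` exactly (up to solver tolerance).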

    Neighborly spheres and transversal numbers

    Full text link
    We survey several old and new problems related to the number of simplicial spheres, the number of neighborly simplicial spheres, the number of centrally symmetric simplicial spheres that are cs-neighborly, and the transversal numbers of hypergraphs that arise from simplicial spheres. (Comment: 15 pages; for the "Open Problems in Algebraic Combinatorics" AMS volume to accompany the OPAC 2022 conference at the University of Minnesota.)

    Centrally Symmetric Polytopes with Many Faces.

    Full text link
    We study the convex hull of the symmetric moment curve U_k(t) = (cos t, sin t, cos 3t, sin 3t, …, cos (2k−1)t, sin (2k−1)t) in R^{2k} and provide deterministic constructions of centrally symmetric polytopes with a record-high number of faces. In particular, we prove that as long as k distinct points t_1, …, t_k lie in an arc of a certain length φ_k > π/2, the points U_k(t_1), …, U_k(t_k) span a face of the convex hull of U_k(t). Based on this, we obtain deterministic constructions of d-dimensional centrally symmetric 2-neighborly polytopes with approximately 3^{d/2} vertices. More generally, for a fixed k, we obtain deterministic constructions of d-dimensional centrally symmetric k-neighborly polytopes with exponentially many (in d) vertices, and of d-dimensional centrally symmetric polytopes with an arbitrarily large number of vertices and the density of k-faces approaching 1 exponentially fast with the dimension. (PhD thesis, Mathematics, University of Michigan, Horace H. Rackham School of Graduate Studies; http://deepblue.lib.umich.edu/bitstream/2027.42/99877/1/lsjin_1.pd)
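The polytopes above are centrally symmetric because the symmetric moment curve uses only odd frequencies, so U_k(t + π) = −U_k(t). A quick numerical check of this identity (an illustrative sketch, not code from the thesis):

```python
import numpy as np

def U(k, t):
    """Symmetric moment curve U_k(t) in R^{2k}: odd frequencies 1, 3, ..., 2k-1."""
    freqs = 2 * np.arange(1, k + 1) - 1
    return np.concatenate([[np.cos(f * t), np.sin(f * t)] for f in freqs])

# Central symmetry: shifting t by pi negates every coordinate,
# since cos(f(t + pi)) = -cos(ft) and sin(f(t + pi)) = -sin(ft) for odd f.
t = 0.37
symmetric = np.allclose(U(3, t + np.pi), -U(3, t))
print(symmetric)  # True
```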

    On sharp performance bounds for robust sparse signal recoveries

    Get PDF
    It is well known in compressive sensing that l_1 minimization can recover the sparsest solution for a large class of underdetermined systems of linear equations, provided the signal is sufficiently sparse. In this paper, we compute sharp performance bounds for several different notions of robustness in sparse signal recovery via l_1 minimization. In particular, we determine necessary and sufficient conditions on the measurement matrix A under which l_1 minimization guarantees the robustness of sparse signal recovery in the "weak", "sectional" and "strong" senses (e.g., robustness for "almost all" approximately sparse signals, or instead for "all" approximately sparse signals). Based on these characterizations, we are able to compute sharp performance bounds on the tradeoff between signal sparsity and signal recovery robustness in these various senses. Our results are based on a high-dimensional geometrical analysis of the null-space of the measurement matrix A. These results generalize the threshold results for purely sparse signals and also present generalized insights on l_1 minimization for recovering purely sparse signals from a null-space perspective.

    Triangulations

    Get PDF
    The earliest work in topology was often based on explicit combinatorial models – usually triangulations – for the spaces being studied. Although algebraic methods in topology gradually replaced combinatorial ones in the mid-1900s, the emergence of computers later revitalized the study of triangulations. By now there are several distinct mathematical communities actively doing work on different aspects of triangulations. The goal of this workshop was to bring the researchers from these various communities together to stimulate interaction and to benefit from the exchange of ideas and methods

    Compressive Sensing over the Grassmann Manifold: a Unified Geometric Framework

    Get PDF
    ℓ_1 minimization is often used for finding the sparse solutions of an under-determined linear system. In this paper we focus on finding sharp performance bounds on recovering approximately sparse signals using ℓ_1 minimization, possibly under noisy measurements. While the restricted isometry property is powerful for the analysis of recovering approximately sparse signals with noisy measurements, the known bounds on the achievable sparsity level (the "sparsity" in this paper means the size of the set of nonzero or significant elements in a signal vector) can be quite loose. The neighborly polytope analysis which yields sharp bounds for ideally sparse signals cannot be readily generalized to approximately sparse signals. Starting from a necessary and sufficient condition, the "balancedness" property of linear subspaces, for achieving a certain signal recovery accuracy, we give a unified null-space Grassmann angle-based geometric framework for analyzing the performance of ℓ_1 minimization. By investigating the "balancedness" property, this unified framework characterizes sharp quantitative tradeoffs between the considered sparsity and the recovery accuracy of the ℓ_1 optimization. As a consequence, this generalizes the neighborly polytope result for ideally sparse signals. Besides the robustness in the "strong" sense for all sparse signals, we also discuss the notions of "weak" and "sectional" robustness. Our results concern fundamental properties of linear subspaces and so may be of independent mathematical interest.